Trying to read a .txt file with a custom delimiter from a database extract

I am trying to read a .txt file that uses @^@ as the delimiter. The most success I have had is with the File Reader (Complex Format) node, but once it reaches a particular row it throws an error. The problem seems to relate to a column that contains a large amount of data spanning several lines within the text file. The data appears between the delimiters, but its length seems to confuse the reader. Is there another setting I need to apply?

@Page0727 You could import the file as one large column with the Fixed Width File Reader – KNIME Community Hub, then replace your separator with a single character such as ASCII 164 (¤). Then export it as a CSV with a blank column separator, and finally import it back into KNIME with the CSV Reader using the new separator.

You could also try allowing short lines.
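
If you want to do the separator swap outside of KNIME (or inside a Python Script node), here is a minimal sketch of the same idea: replace the multi-character delimiter @^@ with a single rarely used character so a standard CSV reader can parse it. The file paths and encoding are assumptions for illustration, not part of the original workflow.

```python
# Sketch: swap the multi-character delimiter @^@ for a single character
# (ASCII 164, ¤) so a normal CSV reader can handle the file.
SRC = "extract.txt"          # hypothetical input path
DST = "extract_fixed.txt"    # hypothetical output path
NEW_SEP = "\u00a4"           # ¤, as suggested above

with open(SRC, "r", encoding="utf-8", errors="replace") as fin, \
     open(DST, "w", encoding="utf-8") as fout:
    for line in fin:
        fout.write(line.replace("@^@", NEW_SEP))
```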


I found where the error occurs and managed to get it to work using a Python script.
The problem is essentially as you suggested: some fields contain multiple lines of data, with each additional line starting with a \t. Is there any way to handle this in the File Reader node?
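
For reference, here is a minimal sketch of one way to handle this in a Python script, assuming the continuation lines always start with a tab and the record delimiter is @^@. The file path and the exact joining logic are assumptions for illustration, not necessarily what your script does.

```python
# Sketch: glue continuation lines (those starting with a tab) back onto the
# previous record, then split each record on the @^@ delimiter.
PATH = "extract.txt"   # hypothetical input path
records = []

with open(PATH, "r", encoding="utf-8", errors="replace") as f:
    current = None
    for raw in f:
        line = raw.rstrip("\n")
        if line.startswith("\t") and current is not None:
            # continuation of the previous record's multi-line field
            current += " " + line.lstrip("\t")
        else:
            if current is not None:
                records.append(current.split("@^@"))
            current = line
    if current is not None:
        records.append(current.split("@^@"))

# records is now a list of rows, each a list of column values
```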

Thank you for your response and apologies for not getting back to you sooner.
