I have a text file (with the extension “.dat”, though it is 100% human-readable) that is not importing correctly using the ‘CSV Reader’ node. In particular, if I import a 4-line file, it becomes a 2-line table; if I use head to take only the first two lines and import those, I end up with a 1-line table. This is with “column headers” turned off in CSV Reader. This suggests to me that the end-of-line characters might be strange in this file (I am trying to read it in Windows). However, when I look at it in Cygwin it looks normal (I see the end-of-line character “$” when I use ‘cat -vET’). I can also open the file just fine in Excel and see the correct number of lines.
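In case it helps, one way I can double-check what the line endings really are is to inspect the raw bytes outside of KNIME, for example with a short R snippet (just a sketch; “mydata.dat” stands in for the real file name):

```r
# Sketch: count the different line-ending styles in the raw bytes of the file.
# "mydata.dat" is a placeholder for the actual .dat file.
txt  <- rawToChar(readBin("mydata.dat", what = "raw", n = file.size("mydata.dat")))
crlf <- lengths(regmatches(txt, gregexpr("\r\n", txt, fixed = TRUE)))
lf   <- lengths(regmatches(txt, gregexpr("\n",   txt, fixed = TRUE)))
cr   <- lengths(regmatches(txt, gregexpr("\r",   txt, fixed = TRUE)))
cat("CRLF:", crlf, "  LF (total):", lf, "  CR (total):", cr, "\n")
```

If the CRLF count equals the total LF count, the file uses Windows line endings; if CRLF is 0 and LF is positive, it uses Unix line endings.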
What is a good way to handle this?
thank you
First, forget about CSV Reader and try the File Reader node. It has more configuration options.
Thank you for the suggestion! I spent more time with the file and realized that one of the big stumbling blocks is that one of the column headers starts with “#”, which CSV Reader treats as the start of a comment. CSV Reader lets me turn that off, though; I don’t see a way to do that with File Reader. The clue was the error message about a line being too short: since the reader assumes everything after the comment character is a comment, that line ends up with too few columns.
You could try using the R package readr.
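readr handles both Unix and Windows line endings, and by default it does not treat any character as a comment marker. A rough sketch (the file name and tab delimiter are assumptions; adjust them to your data):

```r
library(readr)

# Sketch: read the .dat file without column headers and without any comment
# character, so a leading "#" in a header or field is kept as ordinary text.
# File name and tab delimiter are assumptions.
dat <- read_delim(
  "mydata.dat",
  delim     = "\t",
  col_names = FALSE,  # same as turning "column headers" off in CSV Reader
  comment   = ""      # readr's default: nothing is treated as a comment
)
nrow(dat)             # should match the number of lines in the file
```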