The data row has too many data elements in CSV Reader on KNIME Server

Hi Team,

I have a use case in which I am creating a workflow to be used for a weekly refresh. When it runs for the first time, a CSV file is created and stored on Amazon S3 (e.g. “Data.csv”).
When the same workflow is executed again, the new data is appended to the existing CSV and the existing file on S3 (“Data.csv”) is overwritten.
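For reference, outside of KNIME the append-and-overwrite step looks roughly like the sketch below (hypothetical bucket, key, and row values, assuming boto3; in the actual workflow this is done with KNIME nodes):

```python
# Rough sketch of the weekly append-and-overwrite pattern described above.
# Hypothetical bucket/key and row values; the real workflow uses KNIME nodes, not boto3.
import boto3

s3 = boto3.client("s3")
bucket, key = "my-bucket", "Data.csv"  # hypothetical names

new_rows = "4,Random Forest,Model_Data_File1,root mean squared error,69.02\n"  # hypothetical

try:
    existing = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
except s3.exceptions.NoSuchKey:
    # First run: start from just the header.
    existing = "Scenarios,Modelling Technique,Model Data File,Error Function,Validation\n"

# Append the new rows and overwrite the existing object on S3.
s3.put_object(Bucket=bucket, Key=key, Body=(existing + new_rows).encode("utf-8"))
```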

Now, when I try to read the newly appended file again, it throws an error on KNIME Server.
ERROR CSV Reader 3:178:0:281:0:279 Execute failed: The data row has too many data elements

Please note:
- When I run the same workflow locally it works fine, but it gives me this error on KNIME Server.
- I have also checked the file in Excel; it is not corrupt.

Kindly let me know how to resolve this issue.

Try the File Reader node. It resolved the problem for me.

Hi @rfeigel
I have to write the file to S3, but the File Reader node doesn’t have a connection port for S3, so I can’t use it.

Have you tried checking the Support short data rows option? If the schema of your CSV file changes after appending, it is mandatory to select the Support changing file schema option under the advanced settings.
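Just to illustrate what a “short” data row is, here’s a small plain-Python sketch with made-up data (not KNIME code): a row with fewer values than the header, which a reader tolerating short rows then fills up with missing values:

```python
# Small illustration (plain Python, hypothetical data) of "short" data rows:
# the second data row only has 2 of the 5 expected values.
import csv, io

data = "A,B,C,D,E\n1,2,3,4,5\n6,7\n"
header, *body = list(csv.reader(io.StringIO(data)))

# Roughly what a reader tolerating short rows does: fill the missing cells.
padded = [row + [None] * (len(header) - len(row)) for row in body]
print(padded)  # [['1', '2', '3', '4', '5'], ['6', '7', None, None, None]]
```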

Hope this solves your problem.

Best
Mark

1 Like

Hi @Mark_Ortmann,

Thanks for the solution. The CSV Reader is reading the file now, but the output is not in the right format: it appends all the rows into a single row. For example, if the file has 4 rows, it appends them into a single row, creating repetitive columns. Kindly refer to the screenshot.

But I want my output to look like this

| Scenarios | Modelling Technique | Model Data File | Error Function | Validation |
|-----------|---------------------|------------------|--------------------------|------------|
| 3 | Random Forest | Model_Data_File1 | root mean squared error | 68.17 |
| 2 | Random Forest | Model_Data_File1 | root mean squared error | 67.88 |
| 1 | Random Forest | Model_Data_File1 | root mean squared error | 68.17 |

Currently, it is merging the Scenarios values of rows 2 and 3 into the Validation column of row 1, while all the other columns of rows 2 and 3 are appended as extra columns in row 1. I’m not sure why the CSV Reader reads the file this way on KNIME Server when it works perfectly fine locally. Kindly let me know a possible solution.

Thanks,
Anushka

Just to make sure that I understood everything.

Your workflow runs without any problems locally, but it fails when executed on the server. Since I don’t know how your data gets deployed to the server, this is my best guess.

You’re on Windows and configured the CSV Reader with \r\n as the row separator. Your server, however, runs on Linux, where the line break character is \n. Does changing the separator from \r\n to \n solve the issue when executing the workflow on your server?
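As a quick illustration of the effect (a plain-Python sketch with made-up data, not KNIME code): if a file written with \n line breaks is split on \r\n, no row boundary is ever found and every value ends up in a single row.

```python
# Minimal sketch of the mismatch: a file written with Unix "\n" line breaks,
# but split on the Windows "\r\n" row separator. Hypothetical CSV content.
content = "1,2,3\n4,5,6\n7,8,9\n"  # as written on a Linux server

rows_unix    = [r for r in content.split("\n") if r]    # correct separator -> 3 rows
rows_windows = [r for r in content.split("\r\n") if r]  # wrong separator   -> 1 "row"

print(len(rows_unix))     # 3
print(len(rows_windows))  # 1 -> all values land in one long row
                          #      ("too many data elements" / repeated columns)
```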

We are aware of this issue and have already added an option that ensures you won’t run into this problem anymore; however, it will not be available before 4.4.0. We’re very sorry.

Please let me know if this solves the problem, and happy KNIMing!
Mark

2 Likes

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.