I have a use case in which I am building a workflow for a weekly refresh. On the first run, a CSV file is created and stored on Amazon S3 (e.g. "Data.csv").
When the same workflow is executed again, the new data is appended to the existing CSV and the existing file on S3 ("Data.csv") is overwritten.
Now, when I try to read the appended file again, it throws an error on KNIME Server:
ERROR CSV Reader 3:178:0:281:0:279 Execute failed: The data row has too many data elements
- When I run the same workflow on my local machine it runs fine, but it gives this error on KNIME Server.
- I have also checked the file in Excel and it is not corrupt.
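In case it helps narrow this down, below is a rough sketch (assuming the appended "Data.csv" has been downloaded locally and is comma-delimited; adjust the delimiter and quoting to match the CSV Writer settings) of how the file could be checked for rows whose field count differs from the header, which seems to be what the "too many data elements" error points to:

```python
import csv

# Rough diagnostic: report rows whose field count differs from the header.
# Assumes the appended file was downloaded locally as "Data.csv" and uses
# commas as the delimiter; adjust to match the CSV Writer configuration.
with open("Data.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)
    for line_no, row in enumerate(reader, start=2):
        if len(row) != len(header):
            print(f"Line {line_no}: expected {len(header)} fields, got {len(row)}")
```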
Kindly let me know how to resolve this issue.