This may not be a KNIME-specific question, but I have a really simple workflow to help local audit teams: an Excel Reader node reads many General Ledger files from a folder, and a CSV Writer node outputs them as a single General Ledger file.
When a team goes to open the CSV General Ledger, Excel displays an error saying there is too much data and cuts it short (i.e. at the 1M-row Excel threshold)… I thought we'd avoid this issue by exporting the data as CSV. We use other files that are in CSV which we open with no issues.
I know you can use Power Query to open it, or even KNIME to view/pivot the data, but it feels as though I'm missing something obvious.
Does anyone have any ideas on how to fix this within KNIME, or is it an Excel issue?
Excel has a row limit: a worksheet can hold about 1,048,576 rows in total.
CSV and TXT don't have this limitation, which is why they're normally used as common file types for moving data between databases, programs, and data-viz tools.
If you need to open it in Excel, first of all open a blank Excel workbook, go to the "Data" tab and select the option to import from a file. It'll bring up a window to select the CSV or other kinds of files.
It'll open the Power Query tool so you can manipulate the data and view "all" the information as a new table.
If you split your CSV into several files, the same option can help you: you'd select a folder containing all the files, as long as they're the same kind (CSV) and share the same structure.
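In case it helps, here's a minimal Python sketch of that splitting step, keeping each part under Excel's 1,048,576-row limit and repeating the header in every part. The file naming and chunk size here are my own assumptions; within KNIME itself you could get the same effect with a Chunk Loop Start node feeding a CSV Writer inside the loop.

```python
import csv

# Excel's worksheet limit is 1,048,576 rows; reserve one row for the header.
MAX_ROWS = 1_048_576 - 1

def split_csv(src_path, max_rows=MAX_ROWS):
    """Split src_path into numbered part files (assumed naming:
    <src_path>.partN.csv), each with the header plus at most
    max_rows data rows. Returns the number of parts written."""
    with open(src_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)          # keep the header for every part
        part, rows, writer, out = 0, 0, None, None
        for row in reader:
            # Start a new part file when none is open or the current one is full.
            if writer is None or rows >= max_rows:
                if out:
                    out.close()
                part += 1
                out = open(f"{src_path}.part{part}.csv", "w",
                           newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)
                rows = 0
            writer.writerow(row)
            rows += 1
        if out:
            out.close()
    return part
```

Because the rows are streamed one at a time, this also works for ledgers far larger than what fits in memory, and each part file opens cleanly in Excel or in the folder-import option described above.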