I’m working with large data sets, but after 100,000 rows no more data is produced and passed down the line for processing. Following a tip from a blog post, I’ve tried modifying the knime.ini file with
-Dorg.knime.container.cellsinmemory=10000000
to increase the amount of data kept in memory, but 100,000 rows still ends up being the maximum. I estimate I need 5 to 6 times that amount for my work. The only other option I see is chunking, but I consider that a last-resort method.
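For reference, this is roughly how my knime.ini looks after the change. In knime.ini, JVM options must appear after the -vmargs line, one per line; the -Xmx value below is just an example of my heap setting, not a recommendation:

```ini
-vmargs
-Xmx8g
-Dorg.knime.container.cellsinmemory=10000000
```

As I understand it, cellsinmemory controls how many table cells KNIME holds in memory before buffering to disk, so I’m not sure it was ever meant to lift a 100,000-row cap on output, which may be why the change had no effect.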
When the data volume is smaller, yes. Using another SQL query program I can retrieve all the data, but that program is external to KNIME and I can’t automate the data extraction with it.