There are times when I have a very large file (~1 GB) that requires me to bump up the heap size that KNIME uses. When I cannot increase the heap enough, I have to manually break the file into parts that can each be processed separately and then combined.
From what I can tell, the only loop construct that works with "chunked" data operates on data that KNIME has already read in its entirety.
Essentially, what I am looking for are a CSV Chunked Reader Loop and an XML Chunked Reader Loop.
They would work like the existing Chunk Loop Start node, but would pull their content from a file instead of from an input table. Within the loop I would process each chunk, and the results could be accumulated internally for use in another part of the workflow, without ever reading the entire file into memory. The sketch below shows roughly what I mean.
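To illustrate, here is a minimal pandas sketch of the loop shape I am after (something I could run today in a Python Script node); the file path, chunk size, and the "some_key" column are all placeholders, not real names from my workflow:

```python
import pandas as pd

CSV_PATH = "very_large_file.csv"  # placeholder path
CHUNK_SIZE = 100_000              # placeholder rows-per-chunk

partial_results = []  # accumulate per-chunk results, never the whole file

# read_csv with chunksize= returns an iterator of DataFrames,
# so only CHUNK_SIZE rows are held in memory at a time.
for chunk in pd.read_csv(CSV_PATH, chunksize=CHUNK_SIZE):
    # "loop body": process this chunk and keep only its summary
    counts = chunk.groupby("some_key").size()
    partial_results.append(counts)

# "loop end": combine the per-chunk results into one table
result = pd.concat(partial_results).groupby(level=0).sum()
print(result.head())
```

An XML version could presumably stream elements the same way, e.g. with Python's xml.etree.ElementTree.iterparse (again, the path and the "record" tag are placeholders):

```python
import xml.etree.ElementTree as ET

XML_PATH = "very_large_file.xml"  # placeholder path

# iterparse streams the document instead of building the whole tree;
# clearing each element after use keeps memory flat regardless of file size.
for _event, elem in ET.iterparse(XML_PATH, events=("end",)):
    if elem.tag == "record":  # placeholder element name
        # process one record here, then free it
        elem.clear()
```

A native loop-start node with this kind of behavior is what I am hoping already exists, or could be added.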
Is there a node in KNIME that can read parts of a file at a time for processing?
I think it would greatly enhance KNIME's ability to work with very large files if this were implemented.
Thanks,
Scott