Writing large data out to a temporary file

Hello!

As I am dealing with very large data (I will have at least 1,500,000 bit strings, each of length 1024), my idea was to write some of the data out to a temporary file and later read it back in (preferably automatically), so that it is added to the KNIME workflow again.
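
Just to make concrete what I have in mind, here is a minimal plain-Java sketch of writing bitstrings to a temporary file and streaming them back in (the random bitstrings are only placeholder data, and this is outside of any KNIME node):

import java.io.*;
import java.nio.file.*;
import java.util.*;

public class TempFileSketch {
    public static void main(String[] args) throws IOException {
        // Create a temporary file in the system temp directory
        Path tmp = Files.createTempFile("bitstrings", ".txt");

        // Write a few example 1024-character bitstrings, one per line
        try (BufferedWriter out = Files.newBufferedWriter(tmp)) {
            Random rnd = new Random(42);
            for (int i = 0; i < 3; i++) {
                StringBuilder sb = new StringBuilder(1024);
                for (int j = 0; j < 1024; j++) {
                    sb.append(rnd.nextBoolean() ? '1' : '0');
                }
                out.write(sb.toString());
                out.newLine();
            }
        }

        // Read the file back line by line, so only one row is in memory at a time
        try (BufferedReader in = Files.newBufferedReader(tmp)) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println("read bitstring of length " + line.length());
            }
        }

        Files.deleteIfExists(tmp);
    }
}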

Is there any way to do this with existing KNIME nodes?

Or, more generally, is it possible to handle such a large amount of data without running into Java heap space and GC overhead limit errors? What would you recommend to solve this problem, apart from changing knime.ini?
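
For reference, the knime.ini change I have already tried is the usual heap-size increase, i.e. adjusting the -Xmx line below the -vmargs entry (the exact value here is just the one I picked):

-vmargs
-Xmx4096m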

Thanks in advance!

Can you give details on where exactly you run into problems? KNIME already does automatically what you describe, i.e. it writes larger chunks of data out to disk. I'm sure simple processing flows have no problem reading and processing that amount of data (or even much larger data). If you then feed it through computationally expensive nodes, such as a full distance matrix calculation or hierarchical clustering, you may run into scaling problems; with 1.5 million rows, a full distance matrix alone means on the order of 10^12 pairwise distances. But for those nodes it is an algorithmic challenge, not an architecture problem.