I'm currently creating a few KNIME nodes using the GenericKnimeNodes framework (https://github.com/genericworkflownodes/GenericKnimeNodes). So far everything has worked well, but now one node fails with a "Java Heap Space Error" during execution.
The node's functionality is pretty simple: it reads a .sam/.bam file (the output of another node, which first sorts the reads in the file), processes it, and writes a .csv file. When I run the program I want to wrap as a node on its own with a small test dataset (a 15 MB SAM file), it uses no more than 20 MB of RAM, and the output file is only about 1.5 MB.
I don't use any KNIME tables to pass data around, only file input/output ports (if that is the right terminology; I'm not entirely sure). I could try increasing the heap space, but I can't imagine that's actually the issue here: the external payload is a C++ program and, as mentioned, it does not use much memory. Changing the node's memory policy also does not help, which makes sense, since I don't use any KNIME tables that could be held in memory. When I manually run the C++ payload on the previous step's data that KNIME stores in /tmp, everything executes perfectly. It just does not work in node form.
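For completeness, here is how I would rule out heap size anyway. In knime.ini, JVM options go after the -vmargs line, and -Xmx sets the maximum heap (the 4g value below is just an example):

```
-vmargs
-Xmx4g
```

But given that the heavy lifting happens in the external C++ process, not the JVM, I doubt this is the real fix.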
Is there anything I need to consider or might have forgotten?
Thanks for the help.