Memory - is there a way to release memory after the execution of nodes?

I run an InChI to SDF conversion. The only way this works is to split the original file into several smaller files. The option "Write table to disk" does not seem to work with the "Molecule to Indigo" and "InChI to RDKit" nodes. I set the size of the files so that nearly all of the 8 GB of RAM is used in one run, and I have to close KNIME and open it again to free the memory. Is there a better way to do this?
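
For reference, a minimal sketch of how the input could be split into fixed-size chunks outside KNIME (assuming one InChI per line; file names and the chunk size below are just placeholders):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SplitInchiFile {
    public static void main(String[] args) throws IOException {
        Path input = Paths.get("inchis.txt"); // hypothetical input file, one InChI per line
        int chunkSize = 50_000;               // rows per output file, adjust to fit RAM

        try (BufferedReader reader = Files.newBufferedReader(input)) {
            String line;
            int row = 0;
            int part = 0;
            BufferedWriter writer = null;
            while ((line = reader.readLine()) != null) {
                // Start a new output file every chunkSize rows.
                if (row % chunkSize == 0) {
                    if (writer != null) {
                        writer.close();
                    }
                    part++;
                    writer = Files.newBufferedWriter(Paths.get("inchis_part" + part + ".txt"));
                }
                writer.write(line);
                writer.newLine();
                row++;
            }
            if (writer != null) {
                writer.close();
            }
        }
    }
}
```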

Regards,
Alex

Hi Alex,

That doesn't sound right. Do you see this high memory consumption by looking at the process table (Linux 'top' or the Windows Task Manager), or from within Java/KNIME? Java doesn't really release memory back to the OS, so any external tool you use to check memory consumption may be misleading. I suggest you look at the heap status bar instead, which you can enable via "File" -> "Preferences" -> "General" -> "Show Heap Status". Make sure you run the garbage collector to get a more realistic view of the current memory consumption (press the trash bin icon in the heap status bar at the bottom of the KNIME window).
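
To illustrate the difference, here is a minimal sketch in plain Java (not KNIME-specific) that prints the heap usage the JVM itself reports, before and after a garbage collection. The reserved size, which is what 'top' or the task manager shows, usually stays the same even when the used heap drops:

```java
public class HeapStatus {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();

        // Heap the JVM has reserved from the OS vs. what is actually in use.
        long totalMb = rt.totalMemory() / (1024 * 1024);
        long freeMb  = rt.freeMemory()  / (1024 * 1024);
        long maxMb   = rt.maxMemory()   / (1024 * 1024);
        System.out.printf("used %d MB / reserved %d MB (max %d MB)%n",
                totalMb - freeMb, totalMb, maxMb);

        // Request a garbage collection, then report again. The used heap may
        // shrink, but the reserved size typically stays where it was.
        System.gc();
        totalMb = rt.totalMemory() / (1024 * 1024);
        freeMb  = rt.freeMemory()  / (1024 * 1024);
        System.out.printf("after GC: used %d MB / reserved %d MB%n",
                totalMb - freeMb, totalMb);
    }
}
```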

Does it still show 7+ GB of memory in use? If it does, we should get a copy of the workflow/data and talk to the contributors of the nodes in question (Indigo and RDKit).

Regards,
  Bernd

Dear Alex,

I would like to try to reproduce your issue. How many InChI codes are you using in your input table?

Kind regards,

Manuel