"Java heap space error" when using the "Structure filter" node

I am using the Schrödinger nodes such as “Molecule-to-MAE” and “Structure filter” and have problems with large files. I get an “Execute failed: Java heap space” error. Does anyone know how large a file these nodes can handle? Or is there any smart way to fix the problem?

Did you already try to increase your heap space?
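If not: the heap limit is set via the -Xmx JVM option in the knime.ini file that sits next to the KNIME executable (the exact location can vary by platform, and the value below is just an example, not a recommendation):

```ini
-vmargs
-Xmx2048m
```

Restart KNIME after changing it for the new limit to take effect.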

Thank you Fabian for your quick response.

My heap space is set with -Xmx1024M and I don't think I'm going to change it. What should I consider if I do decide to increase it?

I solved the problem by cutting the larger files up into smaller ones and running multiple KNIME workflows via the command line. 100 000 compounds per incoming SDF file seemed to work fine. It's not unmanageable, but there ought to be a smarter way of doing it, preferably inside the KNIME GUI?
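For what it's worth, since SD files delimit records with a `$$$$` line, the splitting step can be scripted instead of done by hand. A minimal sketch (the chunk size and any file names are just examples, not anything KNIME- or Schrödinger-specific):

```python
def split_sdf(lines, chunk_size):
    """Split an SD file (any iterable of lines) into chunks of at most
    chunk_size records, yielding one list of lines per chunk."""
    chunk, count = [], 0
    for line in lines:
        chunk.append(line)
        if line.strip() == "$$$$":  # end of one molecule record
            count += 1
            if count == chunk_size:
                yield chunk
                chunk, count = [], 0
    if chunk:  # trailing partial chunk
        yield chunk

# Tiny demo with three one-line "records"
records = ["r1\n", "$$$$\n", "r2\n", "$$$$\n", "r3\n", "$$$$\n"]
chunks = list(split_sdf(records, 2))
# chunks[0] holds the first two records, chunks[1] the third
```

In practice you would write each chunk to its own file (part_000.sdf, part_001.sdf, …) and feed those to the separate command-line workflow runs.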

Are you still using KNIME 2.0.1? There was a problem with the KNIME SDF Reader which caused memory leaks when reading many molecules. The newest version (2.0.3) and Schrödinger's SDF reader both fix this problem. If you are using 2.0.3, which SDF Reader are you using?


I have the same kind of problem with KNIME 2.0.3. The question remains: how large a file are these nodes made to handle?

The Schrödinger SDF reader seems to be able to handle files with more than 1 000 000 structures. It is the “Molecule-to-MAE” node I have problems with.

For nodes such as SD readers, molecular property calculators, and converters, there should be no limitation with respect to available main memory. They work on a record-by-record basis, which is easy to stream out to disk. More “complicated” nodes (many nodes in the “mining” category, e.g. the decision tree learner) are different – they require a sort of random access to the data.
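To illustrate the record-by-record idea (this is just a Python sketch of the principle, not KNIME's actual implementation): a generator holds only the current molecule's lines in memory, so peak memory use is independent of the total file size.

```python
def read_records(lines):
    """Yield one SD-file record (as a list of lines) at a time;
    only the current record is held in memory."""
    record = []
    for line in lines:
        record.append(line)
        if line.strip() == "$$$$":  # record delimiter in SD files
            yield record
            record = []

# Each record can be processed and discarded before the next is read
data = ["mol1\n", "$$$$\n", "mol2\n", "$$$$\n"]
names = [rec[0].strip() for rec in read_records(data)]
# names == ["mol1", "mol2"]
```

A node that needs random access (e.g. a learner that makes several passes over the whole table) cannot work this way, which is why those nodes are more sensitive to heap size.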

There is a memory panel in KNIME. If you go to “File” -> “Preferences” -> “General” and enable “Show Heap Status”, you will find a little memory bar at the bottom right of the KNIME window. It shows how much memory is currently used by the system, and there is a trash bin icon which runs the garbage collector. If I read an SD file with 1.2 million structures (~3GB), memory consumption goes up to 300MB, but as soon as I invoke the garbage collector it falls back to 20MB (Java calls the garbage collector by itself when memory gets low, so there is no need to repeatedly press that button). You should see similar behavior with the Schrödinger SD reader (which I don't have) and the Molecule-to-MAE node. If not, you should contact people at Schrödinger (or post it here and we will make Schrödinger aware of your post).

