Out of heap errors reading excel files with 16GB RAM

Hi

I’m running Knime on a 16GB RAM Intel I7 machine.

I have a sizeable workflow.
KNIME stops and comes up with out-of-heap-space errors. I understand that KNIME keeps objects in memory, which is what allows me to run each node one at a time. This is with 8GB allocated to KNIME in the ini file.

I’ve managed to work around the issue by using flow variables to force KNIME to read the Excel files one at a time.

Is there a better way to do this, like a generic option that stops KNIME from trying to read several Excel files at the same time and running out of heap space?

Also, is there a global option or a parameter I can set so that KNIME does not keep objects in memory? When I just want to run the workflow in production, I don’t need to debug each step and would prefer KNIME to use less memory.

Any advice or experience would be much appreciated.

Thanks

You can tell each node to store its results on disk; that could save RAM.

Managing the flow of data with flow variables is a good idea.

Then you could check other tips to increase performance.

Another idea could be to increase the RAM allocated to KNIME to 10 or 12 GB and not have other big programs running at the same time.
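As a sketch of what that change looks like: the heap limit is set via the `-Xmx` line in `knime.ini` (in the KNIME installation folder). The values below are illustrative, not a recommendation for every machine; the table-cache line is an optional KNIME 4.x setting and should be treated as an assumption to verify against the KNIME documentation before relying on it.

```ini
# knime.ini (excerpt) -- restart KNIME after editing
# Raise the maximum Java heap from 8 GB to 12 GB (example value;
# leave headroom for the OS and other applications on a 16 GB machine):
-Xmx12g

# Optional (assumed KNIME 4.x property): use a smaller in-memory
# table cache so large tables are written to disk sooner:
-Dknime.table.cache=SMALL
```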

Then you might give us a better idea of what your workflow looks like and what it does, so we can see how to improve things.


Hi there @srimalj,

What KNIME version are you using?

Does this impact your execution time a lot? When you say flow variables, you mean flow variable connections (the red ones), right?

Br,
Ivan


KNIME Analytics Platform 4.0.2

Release date: October 1, 2019

Well, it’s way faster than when KNIME tries to read in all the Excel files at the same time and runs out of heap :slight_smile:


Hi there @srimalj,

Then it seems you found a solution :slight_smile:

Additionally, to learn a bit more about memory and KNIME, check out these two topics:

Br,
Ivan

