KNIME not responding when dealing with large amounts of data


I’m a new KNIME user. I am working through the KNIME white paper Big Data, Smart Energy, and Predictive Analytics (2013), found under the Energy category of the IoT section. The full dataset for this project is about 176 million rows. The workflow runs fine with one sixth of the data, but KNIME stops responding when reading the full 176 million rows. The workflow, named "prepare data", is attached:

1-Prepare Data reduced reset.knwf (302.3 KB) .

So I tried running one sixth of the data at a time, six times, and that worked. However, when I then tried to concatenate the hourly data, KNIME read and concatenated the data without problems but stopped responding while writing the concatenated data to a CSV file (workflow attached: 1-2 concatenate_data.knwf (23.1 KB)). This exact same concatenation workflow works for a small amount of data. Why does KNIME stop responding when dealing with large amounts of data? Is there something wrong with my KNIME configuration or my laptop configuration? I’m using the default KNIME configuration. Thank you.
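As a side note on why the final write is the painful step: concatenating and writing can be done incrementally, so that the full 176 million rows never have to sit in memory at once. A minimal Python sketch of that idea outside KNIME (the file names and the `concatenate_csvs` helper are hypothetical, and it assumes all parts share the same header row):

```python
import csv
import glob


def concatenate_csvs(part_pattern, out_path):
    """Append CSV part files to one output file, streaming row by row."""
    header_written = False
    with open(out_path, "w", newline="") as out_f:
        writer = csv.writer(out_f)
        for part in sorted(glob.glob(part_pattern)):
            with open(part, newline="") as in_f:
                reader = csv.reader(in_f)
                header = next(reader)
                # Write the header only once, from the first part file.
                if not header_written:
                    writer.writerow(header)
                    header_written = True
                for row in reader:  # only one row held in memory at a time
                    writer.writerow(row)
```

Because each row is written as soon as it is read, memory use stays flat no matter how many rows the parts contain.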


Hi @jackcao53,
Welcome to the KNIME Forum! Have you checked out this blog post? Increasing the amount of RAM KNIME is allowed to use may help you in your scenario. Another approach that might help is streaming.
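For reference, the heap limit is set via the `-Xmx` line in the `knime.ini` file in your KNIME installation folder. A minimal sketch of the relevant line, assuming you want to allow up to 11 GB (the rest of your `knime.ini` may differ):

```
-Xmx11g
```

Restart KNIME after editing the file for the change to take effect.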
Kind regards,


Thank you very much. I will have a try.


Hi Alexander,

I tried increasing the amount of RAM to 11 GB, but the workflow still gets stuck at a certain node (a Reference Row Filter in this case). Streaming doesn’t work because there are loops inside the workflow. The workflow runs fine on one sixth of the data. The original white paper was produced on a laptop with 8 GB of RAM, so I wonder how the workflow ran in that case. Also, this is not specific to one laptop: I ran the workflow on two different laptops and it got stuck on the same node both times. Do you have other suggestions? Thank you.


This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.