I have a data set of about 11 million rows and 13 columns…
I raised the RAM available to KNIME to 4 GB on my MacBook, which has 8 GB in total (2.6 GHz Intel Core i5, 8 GB 1600 MHz DDR3).
After I read in the CSV file, I normalized it and split the rows. But then, when I want to use an RProp MLP Learner, it crashes.
I know next to nothing about KNIME’s internals, but I don’t think everything runs in Java. So instead of increasing the Java heap space, you could try lowering it: non-Java code does not use the Java heap, and it may be running out of memory precisely because the heap is taking up so much of your RAM.
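For reference, the heap size is controlled by the `-Xmx` line in the `knime.ini` file inside your KNIME installation; the value below is only illustrative, pick one that leaves headroom for non-Java memory:

```ini
; knime.ini (excerpt) — the -Xmx line sets the maximum Java heap size
; e.g. lower it from -Xmx4096m to:
-Xmx2048m
```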
Also, you can try making the training data smaller with the Partitioning node and test at what size it stops crashing. Maybe you don’t need all the rows for a good model.
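If even reading the full CSV is too much, you could also pre-sample the file outside KNIME before loading it. A minimal stdlib-only Python sketch (the function name `sample_csv` and the 10% fraction are just examples, not anything KNIME-specific):

```python
import random

def sample_csv(src, dst, fraction, seed=42):
    """Copy the header row plus a random fraction of the data rows
    from src to dst, without loading the whole file into memory."""
    rng = random.Random(seed)
    with open(src) as fin, open(dst, "w") as fout:
        fout.write(fin.readline())      # always keep the header row
        for line in fin:                # stream row by row
            if rng.random() < fraction:
                fout.write(line)

# e.g. keep roughly 10% of an 11-million-row file:
# sample_csv("data.csv", "data_10pct.csv", 0.10)
```

You could run this a few times with increasing fractions to find the largest sample your machine can train on.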
Hi there @JoeyMain,
welcome to KNIME Community!
Have you made some progress with it?
When you say it crashes, what exactly do you mean? Does the KNIME application crash, does it freeze so that you have to kill it, does it throw an error such as “Java heap space” in the Console, or something else?