Optimization needed: List Files, Python Source, DB Writer

Hmm, it is difficult to judge from the screenshots. One idea would be to first collect all of your data locally and then load it into the DB in one single step.
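As a minimal sketch of that idea (inside a KNIME Python node, using a hypothetical table name and connection string that you would replace with your own), a single bulk write could look like this:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical: in the real workflow this frame would hold the rows
# collected from all CSV files before any DB access happens.
df = pd.DataFrame({"file": ["a.csv", "b.csv"], "value": [1.0, 2.0]})

# Hypothetical connection string -- replace host, db, user, password.
engine = create_engine("mysql+pymysql://user:password@db-host/mydb")

# One bulk write instead of one INSERT per row (or per file);
# chunksize + method="multi" batches many rows per statement.
df.to_sql("results", engine, if_exists="append", index=False,
          chunksize=10_000, method="multi")
```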

Also, if you have 13 parallel sessions accessing the DB from one account, the server configuration may well slow that down (by only accepting a certain number of connections at a time). Another question is the connection itself: how is the DB you are trying to write to connected to your system?
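If you want to check whether the connection limit is part of the problem, a quick sketch (again with hypothetical credentials) is to ask MySQL directly:

```python
import pymysql

# Hypothetical credentials -- replace with your own.
conn = pymysql.connect(host="db-host", user="user",
                       password="password", database="mydb")
with conn.cursor() as cur:
    # How many simultaneous connections the server allows ...
    cur.execute("SHOW VARIABLES LIKE 'max_connections'")
    print(cur.fetchone())
    # ... and how many are actually in use right now.
    cur.execute("SHOW STATUS LIKE 'Threads_connected'")
    print(cur.fetchone())
conn.close()
```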

Then you mention that you write to a MySQL DB. Is that a remote one or one on your machine? And what configuration does it have?

In general, you should think about your setup and reduce the number of DB operations to a minimum. You could check whether certain calculations in KNIME can be done in memory, depending on your system's RAM; see the sketch below.
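As an illustration of moving work into memory (a sketch, assuming your per-file calculations are simple aggregations and that the data fits in RAM; the paths and column names are placeholders), you could let pandas do the heavy lifting and only ship the final result to the DB:

```python
import glob
import pandas as pd

# Read all CSVs into memory first (assumes they fit in RAM).
frames = [pd.read_csv(path) for path in glob.glob("data/*.csv")]
combined = pd.concat(frames, ignore_index=True)

# Aggregate in memory instead of via repeated DB round trips;
# the grouping column "category" is a hypothetical placeholder.
summary = combined.groupby("category", as_index=False)["value"].sum()
print(summary)
```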

You would have to identify which nodes cost the most time in order to do something about it (the Timer Info node can help with that).

And here is a list of discussions and blog entries about KNIME and performance. You might want to take a look and see if something strikes you as an idea to explore further.


KNIME performance

Process 900+ CSV files
