Oracle connector and DB Query reader issue


I am experiencing a severe performance drop when reading a table with several million rows from an Oracle database.
When the table had only a few hundred thousand rows, the read time was fine, but with the larger dataset the node seems to hang.
I also tried to load only the top 10/100 rows, but it never returns anything and just shows "Executing".
I even tried time-based filtering, because I was sure it could load the data within a timeframe (only a few hundred thousand rows), but now that just hangs too.
Could you please help? What can I try?

Hi @andraskriston, and welcome to the KNIME Community.

This might be due to limitations of the resources you are using. Try increasing the memory allocation for KNIME:

  1. Open the knime.ini file in your KNIME installation directory.
  2. The line for memory allocation starts with "-Xmx". For example, change it to `-Xmx16g` to set the memory allocation to 16GB.
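For placement, note that in an Eclipse-style knime.ini the JVM options must come after the `-vmargs` line; a minimal sketch of the relevant lines (the 16GB value is an example, adjust to your machine's RAM):

```
-vmargs
-Xmx16g
```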

Also, you are not just loading the data, you are filtering it as well. Can you try running the same queries from a tool other than KNIME, but on the same machine where KNIME is installed, to compare the performance? This will help verify whether the issue is with KNIME or with your setup (the DB server, network speed, etc.).
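As one way to run that comparison, here is a minimal sketch that times the same query from Python using the python-oracledb driver. The DSN, credentials, table name, and SQL below are placeholders, not details from this thread; substitute your own.

```python
import time


def time_fetch(fetch_rows):
    """Run a fetch callable and return (row_count, elapsed_seconds)."""
    start = time.perf_counter()
    rows = fetch_rows()
    return len(rows), time.perf_counter() - start


def time_oracle_query(dsn, user, password, sql):
    """Connect to Oracle and time a single query; call with real credentials."""
    import oracledb  # pip install oracledb

    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            # Larger fetch batches often help over slow network links.
            cur.arraysize = 10_000
            return time_fetch(lambda: cur.execute(sql).fetchall())


# Example usage (placeholder values):
# n, secs = time_oracle_query("dbhost/service", "myuser", "mypassword",
#     "SELECT * FROM my_table FETCH FIRST 100 ROWS ONLY")
# print(f"fetched {n} rows in {secs:.2f}s")
```

If the query is similarly slow here, the bottleneck is the database or the network rather than KNIME itself.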


In my case, the Oracle connection sometimes simply "craps out" if you play around with it, especially when you cancel a DB Query Reader. The solution is to reset the connector node and then try again.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.