Slow performance of DB Reader

Hi,

I am experiencing a slow DB Reader node. The SQL query takes less than a second when executed in SQL Developer, but the same query in KNIME takes more than 15 minutes. Something is wrong. Can someone help me? I tried changing the fetch size from 1000 to 100000, but that doesn't change the result. I have also read on the forum that the .metadata folder's content might be the issue.

Thanks, Frederik

Hi there @fstouten,

welcome to KNIME Community!

Can you share more information, like which KNIME version you are using? What database are you talking about? Which driver did you register?

Br,
Ivan

The difference most likely is that in SQL Developer only the first 50-200 rows are fetched (how many depends on your settings, but I think 50 is the default). If you scroll down in the results, more rows are fetched and the timer goes up.

In KNIME that query will fetch all data, and depending on the number of rows and their content, it can take a while (especially with LOB columns).

As a test, you can limit your query to say 10,000 rows or so, and then in SQL Developer run it as a script instead of a query, so that it also fetches all rows. Then compare that to the run time of the same query in KNIME; it will most likely be similar.
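For instance, a limited test query might look like this (a sketch only: the table name is a placeholder, and the `FETCH FIRST` syntax assumes Oracle 12c or newer, since SQL Developer suggests an Oracle database):

```sql
-- Placeholder table name; limits the result to 10,000 rows for the timing test.
SELECT *
FROM my_table
FETCH FIRST 10000 ROWS ONLY;

-- On older Oracle versions the classic form would be:
-- SELECT * FROM my_table WHERE ROWNUM <= 10000;
```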


Thanks, that explains it of course.

But in general, I feel its performance is lagging, and not consistently. Sometimes it gets stuck for even 10K records…

Hi there @nathmalp,

welcome to KNIME Community!

You are talking about the DB Reader node, I assume? What database are you using? Considering that the speed varies, you might be experiencing network issues?

Br,
Ivan


As ipazin said, network latency, also known as "lag" or "ping", will have a huge effect on load time. When working at home and essentially connecting to the database over the internet (100 ms ping vs. 0.01 ms on a LAN), the load time of the same data easily increases 20-fold. Yes, the impact is that big.
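A back-of-envelope sketch makes this concrete (all numbers below are illustrative assumptions, not measurements): a JDBC driver issues roughly rows / fetch size round trips, and each round trip pays the full ping, so latency alone can dominate the load time.

```python
def latency_overhead_s(rows: int, fetch_size: int, ping_ms: float) -> float:
    """Rough network-latency cost of fetching `rows` rows: the driver
    pays one round trip (one ping) per batch of `fetch_size` rows."""
    round_trips = -(-rows // fetch_size)  # ceiling division
    return round_trips * ping_ms / 1000.0

rows = 800_000      # hypothetical table size
fetch_size = 1_000  # hypothetical JDBC fetch size

print(latency_overhead_s(rows, fetch_size, ping_ms=100.0))  # over the internet: 80.0 s of pure waiting
print(latency_overhead_s(rows, fetch_size, ping_ms=0.01))   # on a LAN: 0.008 s
```

With these made-up numbers, latency alone accounts for 80 seconds over the internet versus milliseconds on a LAN, which is why increasing the fetch size (fewer round trips) helps most on high-latency links.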

Thanks for the response. I am using SQL Server 2017 Enterprise Edition, and the DB is within the network. I did try using a SQL statement and got the result in no time, but the DB Reader takes time. I tried to see if my machine's hardware is causing this delay, but I have a 64 GB RAM computer and it's not even touching 12-16 GB. CPU usage seems normal. I am still hunting for a clue.

Hello, thanks. You are right, I am referring to the DB Reader on SQL Server 2017.

Hi there @nathmalp,

ok. So you are on the same network and sometimes experiencing slow DB Reader performance. Did I get that right?

Can you tell me which KNIME version you are using and which driver?
Does this happen for every table or only some specific ones?
And what do you mean by slow: how long does it take to read 10k rows?
Do you maybe have an aggressive antivirus program?

Br,
Ivan

Hello @ipazin,

Yes. I am on the same network as the database, and it happens only for a few tables, which is a new finding since I posted my first message. The internal table that is causing trouble has a few indexes created, and it runs fast in the SQL console if I replicate the logic in a SQL query.

The table which I am trying to read contains 800 thousand records and is causing performance issues. However, I am able to read some tables with 1.3 million records without much delay.

Therefore, I am not sure what is causing the delay. The antivirus shouldn't be the issue, I believe, given this case. Still experimenting…
I use KNIME 4.1.2 and

Row count doesn't mean anything if the table with fewer rows has some large column containing an image, long text, or other LOB-type data. Can you check in SQL Server how large the table is?
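For example, SQL Server's built-in `sp_spaceused` procedure reports a table's row count and its data and index sizes (the table name below is a placeholder):

```sql
-- Reports rows, reserved space, data size and index size for one table.
EXEC sp_spaceused N'dbo.MyTable';
```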

Agreed, however in my case none of the columns contains any data type like LOB or image or anything needing special treatment. All columns are either bigint, varchar, or numeric with precision.

The total number of rows is 7,403,582 and the size is 5,950 MB.
I am on an Intel i7 with 64 GB RAM, so system configuration shouldn't be the concern.

Is there anything else I should be worried about here?

Hi there @nathmalp,

so we are talking about 7 million rows. And how long does it take (or does it fail?) to read that data into KNIME? Also, have you tried increasing the fetch size parameter in the connector node?

Additionally, here is a link to the documentation regarding the knime.ini setup, which can help you better utilize your available memory:
https://docs.knime.com/2019-12/analytics_platform_workbench_guide/index.html#setting-up-knime.ini
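For reference, the relevant knime.ini setting is the `-Xmx` line, which controls the maximum Java heap. A sketch for a 64 GB machine might look like this (the value is only an illustrative example, not a recommendation for your setup):

```
-Xmx32g
```

With that line, KNIME can use up to 32 GB of heap for holding data in memory instead of swapping it to disk.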

Br,
Ivan