Hello everyone,
I am trying to read and view the records in a table using the DB Query Reader node. As the table contains a few million rows, it is taking forever to load. Could you suggest a best practice to achieve this?
Hey there, welcome to the forum.
If you are using the DB Query Reader node, the query is executed in the DB, so the performance bottleneck is likely the DB itself taking time to execute the query.
In the past I had use cases that initially used a DB on MS Azure. As the DB grew and the result sets returned by queries grew larger, we opted to purchase additional Database Transaction Units, which, simply put, increases cloud resources and consequently makes things faster.
Luckily, at some stage we switched to what is, from my layman's perspective, a more performant solution (Snowflake).
@VishalS one idea could be to load the data in chunks, if you have an individual ID or something similar to partition on.
The other option could be to use streaming.
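To illustrate the chunking idea outside of KNIME: below is a minimal sketch of keyset pagination, using Python's sqlite3 as a stand-in for the Oracle connection (the table name, ID column, and chunk size are assumptions for the example; in KNIME you would achieve the same with a loop feeding flow variables into the DB Query Reader's WHERE clause).

```python
import sqlite3

# Stand-in for the real Oracle connection; the chunking pattern is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany(
    "INSERT INTO big_table (id, val) VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(1, 101)],
)

CHUNK_SIZE = 25  # tune to what the DB and network handle comfortably

def read_in_chunks(conn, chunk_size):
    """Fetch rows in keyset-paginated chunks, ordered by id."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, val FROM big_table WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, chunk_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # keyset pagination: remember the last id seen

total = sum(len(chunk) for chunk in read_in_chunks(conn, CHUNK_SIZE))
print(total)  # → 100
```

Keyset pagination (`WHERE id > ?`) tends to scale better than `OFFSET`-based paging, because the DB can seek directly via the primary-key index instead of skipping rows. On Oracle you would express the limit with `FETCH FIRST n ROWS ONLY` instead of `LIMIT`.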
Hi @MartinDDDD ,
Thanks for the feedback. I am trying to read the table data using the DB Query Reader node; as the table contains 1173850 records, the node keeps on running. Would you mind suggesting an alternative solution?
Thanks @mlauber71 ,
As I am new to KNIME, I am not able to read the data using the DB Query Reader node fed from the DB Connector node, which connects to an Oracle DB. The table has 1173850 rows.
@VishalS you could try and take a look at the examples given. If you want to read about KNIME and databases in general, maybe start with the database guide:
https://docs.knime.com/latest/db_extension_guide/index.html
And then I'd like to point you to my own article.
I’m afraid that KNIME will not speed up your DB-performance.
To clarify: If the node keeps running this indicates that it is waiting for the DB to finish executing your query.
So suggesting a different node will not solve your problem.
Have you looked into monitoring what happens on the DB side? I assume you will see a long-running query…
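One way to check for that long-running query on the Oracle side is to look at the `v$session` and `v$sql` views. Here is a hedged sketch assuming the python-oracledb driver and a user with SELECT privileges on those views; the connection details and the 60-second threshold are placeholders, not values from this thread.

```python
# Sketch: list active Oracle sessions that have been in their current
# call for more than 60 seconds, together with the SQL text they run.
# Requires SELECT privileges on v$session and v$sql.
MONITOR_SQL = """
SELECT s.sid, s.username, s.status, s.last_call_et AS seconds_in_call,
       q.sql_text
  FROM v$session s
  JOIN v$sql q ON q.sql_id = s.sql_id
 WHERE s.status = 'ACTIVE'
   AND s.last_call_et > 60
 ORDER BY s.last_call_et DESC
"""

def fetch_long_running(user, password, dsn):
    """Run the monitoring query; connection details are placeholders."""
    import oracledb  # pip install oracledb
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        return conn.cursor().execute(MONITOR_SQL).fetchall()

# Usage (against a real instance):
#   rows = fetch_long_running("monitor_user", "***", "dbhost/servicename")
```

If the KNIME query shows up here with a steadily growing `seconds_in_call`, that confirms the time is spent in the database, not in the node.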