I’m working on a KNIME workflow that reads data from a PostgreSQL table, processes each row, generates a JSON payload, sends it via a POST Request, evaluates the response, and then updates the original row in the database using a DB Update node.
Currently, I’m using a Chunk Loop Start node to process the data in batches. However, I’m facing an issue where the next iteration begins before the DB Update has fully completed. This creates problems with record-level consistency and timing.
What I want to achieve:
I need the loop to work strictly sequentially, like this (see the sketch after this list):
• One row is processed
• JSON is sent
• The response is checked
• The database is updated
• Only after all of that, the next row should be processed
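
To make that control flow concrete, here is the row-by-row logic I'm after as a minimal Python sketch — `requests` and `psycopg2` just stand in for the POST Request and DB nodes, and the connection string, endpoint, and table/column names are made up:

```python
import json

import psycopg2   # stands in for the KNIME DB nodes
import requests   # stands in for the KNIME POST Request node

# Hypothetical connection string, endpoint, and table/column names.
conn = psycopg2.connect("dbname=mydb user=me password=secret")
ENDPOINT = "https://api.example.com/ingest"

cur = conn.cursor()
cur.execute("SELECT id, payload FROM my_table WHERE status = 'pending'")

for row_id, payload in cur.fetchall():
    # 1. one row is processed: build its JSON payload
    body = json.dumps({"id": row_id, "payload": payload})
    # 2. the JSON is sent, 3. the response is checked
    resp = requests.post(ENDPOINT, data=body,
                         headers={"Content-Type": "application/json"})
    status = "ok" if resp.ok else f"failed:{resp.status_code}"
    # 4. the original row is updated and the commit completes
    with conn.cursor() as upd:
        upd.execute("UPDATE my_table SET status = %s WHERE id = %s",
                    (status, row_id))
    conn.commit()
    # 5. only after the commit does the loop advance to the next row

cur.close()
conn.close()
```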
Question:
What’s the proper way to build such a loop in KNIME?
Should I use Table Row to Variable Loop Start + Variable Loop End?
Or is there a better best practice for this case?
Any advice, pattern, or example workflow would be much appreciated!
In the settings of the DB Update node there is a Batch Size parameter (or similar) where you can define the chunk size. If your table is large, I recommend setting it to a value around 10,000. The right value depends on your data set and the performance of the database.
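
For intuition on what the batch size buys you, here is a rough Python sketch of the same idea using psycopg2's `execute_batch` — the connection string, table, and values are placeholders; in KNIME the DB Update node handles this batching internally based on its setting:

```python
import psycopg2
from psycopg2.extras import execute_batch

# Placeholder connection and data for illustration only.
conn = psycopg2.connect("dbname=mydb user=me password=secret")
updates = [("ok", i) for i in range(100_000)]  # (status, id) pairs

with conn.cursor() as cur:
    # Groups the UPDATE statements into batches of 10,000 per round
    # trip instead of issuing one statement per row.
    execute_batch(cur,
                  "UPDATE my_table SET status = %s WHERE id = %s",
                  updates,
                  page_size=10_000)
conn.commit()
```

Larger batches mean fewer round trips but bigger transactions, which is why the right value depends on your data and your database.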