Real Concurrency instead of Batch-Concurrency

Hi,

Currently, it is already possible to run loops or GET/POST/… requests with some kind of parallel processing or batch-like concurrency.
Unfortunately, there is no real concurrency-style loop or processing available yet.

Example:
If you use the GET Request node with 10 rows and a concurrency of 5, it will only start processing rows 6-10 once ALL of the first five rows have been processed. The term "concurrency" is very misleading here, as this is effectively just batch processing, not concurrency.
With proper concurrency, the node would instead start processing the next row as soon as one of the 5 slots frees up, keeping 5 rows in flight at any given time.
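To make the difference concrete, here is a minimal plain-Java sketch, assuming a list of row tasks (e.g. one GET request each) and a limit of 5 parallel requests; the class and method names are purely illustrative, not an existing API:

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative only: contrasts batch-style "concurrency" with real concurrency
// for a list of row tasks and a limit of 5 parallel requests.
public class BatchVsConcurrent {

    // Batch style: rows 6-10 only start once ALL of rows 1-5 have finished,
    // even if four of those five finished long ago.
    static void batchStyle(List<Callable<Void>> rowTasks) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(5);
        for (int i = 0; i < rowTasks.size(); i += 5) {
            // invokeAll blocks until the whole batch of 5 is done
            pool.invokeAll(rowTasks.subList(i, Math.min(i + 5, rowTasks.size())));
        }
        pool.shutdown();
    }

    // Real concurrency: all rows are queued up front; as soon as one of the
    // 5 workers finishes a row, it immediately picks up the next one.
    static void realConcurrency(List<Callable<Void>> rowTasks) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(5);
        rowTasks.forEach(pool::submit);   // no artificial batch boundaries
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}
```

In the second variant the pool itself acts as the sliding window, so a single slow row never holds back the other four slots.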

Similarly, there are many loop nodes available, including the Chunk Loop, which processes rows in batches, and the Parallel Chunk Loop, which merely splits the dataset into equal-sized parts and processes those in parallel.

In both cases, for web requests as well as general processing, real concurrency is beneficial when you have a mix of long- and short-running tasks.
Also, when interacting with a third-party system, having more options to configure concurrency helps prevent putting heavy load on that system by firing too many requests at once.

From a technical point of view, Java should support this just fine with virtual threads. The only other alternative right now is building a custom solution, e.g. a component relying on a database (in-memory or static) with a job queue, row locking, etc. - nothing trivial either.
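As a rough sketch of the virtual-thread route, assuming Java 21+ and a hypothetical processRow method standing in for the actual request:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;

// Hypothetical sketch: real concurrency with virtual threads, capped at
// 5 rows in flight by a semaphore instead of fixed-size batches.
public class VirtualThreadConcurrency {
    public static void main(String[] args) {
        Semaphore permits = new Semaphore(5); // at most 5 rows in flight at once
        List<Integer> rows = List.of(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);

        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int row : rows) {
                executor.submit(() -> {
                    permits.acquire();      // wait until one of the 5 slots is free
                    try {
                        processRow(row);    // e.g. fire the GET request for this row
                    } finally {
                        permits.release();  // free the slot: the next queued row starts immediately
                    }
                    return null;
                });
            }
        } // try-with-resources waits for all submitted tasks to finish
    }

    static void processRow(int row) { /* placeholder for the actual request/processing */ }
}
```

The semaphore caps the number of rows in flight at 5, while blocked virtual threads stay cheap, so the next row starts the moment a permit is released rather than waiting for an entire batch to complete.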

tl;dr: a loop offering chunk processing but with real concurrency (rather than batch processing) would be a great addition.

Hi @janhdt and welcome to the forum.

I’ve moved your post to the Feedback & Ideas category so others can vote on it. :slight_smile:
