Execute failed: The request has been throttled

Hi All,

I have a flow that writes its output (CSV Writer) to SharePoint.
The output has 9 million rows and around 30 columns (so not small, but not unmanageable either).

It used to run fine (the last successful run had 7M rows), but today it throws the error
“Execute failed: The request has been throttled” in the CSV Writer node.

Has anyone faced this before? I believe it has to do with the SharePoint connection, as the writer worked fine when I changed the location from SharePoint to my desktop.

I couldn’t find anything online explaining this error.

Thanks in advance

Hi @siskos_k

Welcome to the KNIME Forum. Most cloud apps limit how much data you can send at one time. It will probably let you continue writing data after a while, but for now you’re being throttled by SharePoint.

Regards,
Wali Khan


@siskos_k you could write the CSV in chunks and even add a Wait node between chunks so as not to stress SharePoint.

The CSV Writer allows appending data:
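Outside of KNIME, the same chunk-and-append idea can be sketched in plain Python; this is an illustration of the pattern, not the KNIME nodes themselves, and the chunk size and pause are placeholder values:

```python
import csv
import time

def write_csv_in_chunks(rows, header, path, chunk_size=100_000, pause_s=0.0):
    """Write rows to a CSV file chunk by chunk, appending after the first chunk.

    An optional pause between chunks plays the role of the Wait node:
    it spaces out the writes so the target is not hammered continuously.
    """
    for start in range(0, len(rows), chunk_size):
        chunk = rows[start:start + chunk_size]
        first = start == 0
        mode = "w" if first else "a"  # first chunk creates the file, later chunks append
        with open(path, mode, newline="") as f:
            writer = csv.writer(f)
            if first:
                writer.writerow(header)  # header is written only once
            writer.writerows(chunk)
        if pause_s and start + chunk_size < len(rows):
            time.sleep(pause_s)  # breathing room between chunks
```

In KNIME terms, each loop iteration corresponds to a Chunk Loop Start iteration, the append mode corresponds to the CSV Writer’s “append” option, and `pause_s` corresponds to the Wait node.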


Thanks @wkhan for your response and the provided information!
Indeed, after some time (I don’t remember how many days, it wasn’t many) it allowed me to write data to SharePoint again. Truth is, I had sent a lot of datasets to SharePoint because I was testing something, so that was probably the reason I got this error message.

Thanks @mlauber71 for your suggestion!
I did it the chunk-loop way, but the Microsoft authentication token expired because writing all the data to SharePoint (even in chunks) still took a long time.
What worked well was putting the Microsoft Authentication node inside the loop, so the token gets refreshed every time a chunk is written to SharePoint and therefore doesn’t expire mid-run.
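The refresh-per-chunk pattern above can be sketched in Python; `acquire_token` and `upload` here are hypothetical stand-ins for the Microsoft Authentication step and the SharePoint write, not real API calls:

```python
def upload_in_chunks(chunks, acquire_token, upload):
    """Re-authenticate before each chunk so the token never expires mid-run.

    chunks        -- iterable of data chunks to send
    acquire_token -- callable returning a fresh auth token (hypothetical)
    upload        -- callable taking (chunk, token) and sending it (hypothetical)
    """
    for chunk in chunks:
        token = acquire_token()  # fresh token per chunk, like the auth node inside the loop
        upload(chunk, token)
```

Refreshing on every iteration is slightly wasteful but simple and robust; a refinement would be to re-acquire the token only when it is close to its expiry time.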

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.