Google Sheets: Payload exceeds limit of 10485760 bytes

Good morning,

I’m facing an issue with the Google Sheets Updater node which is pretty self-explanatory, but I wonder how to circumvent it without splitting the rows into chunks and looping over them (clearing the sheet on the first chunk and appending on the following ones) — basically speaking, over-engineering the upload.

ERROR Google Sheets Updater 0:601      Execute failed: 400 Bad Request
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Request payload size exceeds the limit: 10485760 bytes.",
    "reason" : "badRequest"
  } ],
  "message" : "Request payload size exceeds the limit: 10485760 bytes.",
  "status" : "INVALID_ARGUMENT"

Thanks a lot


Hi Mike

I managed to reproduce the problem. There seems to be no straightforward way to solve this at the moment; the Google Sheets API does not handle this well.

For the time being I created a workflow which creates a new spreadsheet/sheet and writes/appends the sheet chunk-wise, as you proposed: google_sheet_payload.knwf (38.3 KB). I exposed most of the settings in the wrapped metanode dialog; you might have to change the chunk size to fit your table.
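For readers who want the same chunk-wise idea outside KNIME, here is a minimal Python sketch. The chunking logic is the point; the actual Sheets call is only shown as a comment, and the spreadsheet ID and range in it are hypothetical placeholders, not values from this thread.

```python
# Chunk-wise append: split the table into slices small enough that each
# request body stays under the 10,485,760-byte payload limit.

def chunk_rows(rows, chunk_size):
    """Yield consecutive slices of `rows`, each at most `chunk_size` rows long."""
    for start in range(0, len(rows), chunk_size):
        yield rows[start:start + chunk_size]

def upload_in_chunks(rows, chunk_size, append_chunk):
    """Send the table chunk by chunk via a caller-supplied append function.

    With google-api-python-client, `append_chunk` could look like this
    (SPREADSHEET_ID and the range are assumptions for illustration):

        def append_chunk(values):
            service.spreadsheets().values().append(
                spreadsheetId=SPREADSHEET_ID,
                range="Sheet1!A1",
                valueInputOption="RAW",
                body={"values": values},
            ).execute()

    Returns the total number of rows handed off.
    """
    sent = 0
    for chunk in chunk_rows(rows, chunk_size):
        append_chunk(chunk)
        sent += len(chunk)
    return sent
```

Because `values.append` keeps extending the same sheet, only the first request needs a cleared target; each later chunk lands below the previous one, so no extra clear/append branching is required.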

I hope this helps you overcome this limitation for now.


Hi, we had a similar issue and tried the above solution. However, we’re having problems integrating it into our workflow.

All we want to do is chunk our table into smaller pieces and then upload them into Google Sheets (into one larger sheet) to get around the payload size limitation.

Any help on this would be greatly appreciated.

Happy to elaborate if that helps.

Do you authenticate via API key or via regular auth? I lost track of this issue, but I have been using Google Docs without any problems for an extended period of time.

Ah, we are using OAuth — if we switch, will that allow larger payload sizes then?

I have tried both, but get the same error:

ERROR Google Sheets Writer 2:75 Execute failed: 400 Bad Request
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Request payload size exceeds the limit: 10485760 bytes.",
    "reason" : "badRequest"
  } ],
  "message" : "Request payload size exceeds the limit: 10485760 bytes.",

Hi @SteveO

Did you try adjusting the number of rows per chunk? If you have a very wide table, that might fix your problem. It can be adjusted in the configuration of the Chunk Loop Start node.


Thanks Oole, that seems to be the fix!

