I’m facing an issue with the Google Sheets Updater node which is fairly self-explanatory, but I wonder how to work around it without splitting the rows into chunks and looping over them, where the first chunk clears the sheet and the following chunks append data … basically over-engineering the upload.
I managed to reproduce the problem. There seems to be no straightforward way to solve this at the moment; the Google Sheets API does not handle this well.
For the time being, I created a workflow which creates a new spreadsheet/sheet and writes/appends to it chunk-wise, as you proposed: google_sheet_payload.knwf (38.3 KB). I exposed most of the settings in the wrapped metanode dialog; you might have to change the chunk size to fit your table.
I hope this helps you overcome this limitation for now.
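In case a scripted version of the same pattern is useful outside of KNIME, here is a rough Python sketch of the clear-then-append idea against the Sheets API. The spreadsheet ID, sheet name, credentials file and chunk size are all placeholders, not values from the workflow above.

```python
# Sketch only: clear the target sheet once, then append the table in chunks
# so each request stays below the payload limit. All identifiers are placeholders.
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

SPREADSHEET_ID = "your-spreadsheet-id"   # placeholder
SHEET_NAME = "Sheet1"                    # placeholder
CHUNK_SIZE = 500                         # rows per request; shrink this for wide tables

creds = Credentials.from_service_account_file(
    "service_account.json",              # placeholder credentials file
    scopes=["https://www.googleapis.com/auth/spreadsheets"],
)
values_api = build("sheets", "v4", credentials=creds).spreadsheets().values()

def upload_in_chunks(rows):
    """Clear the existing contents, then append the rows chunk by chunk."""
    # First step: clear the sheet (the role the first chunk plays in the workflow).
    values_api.clear(spreadsheetId=SPREADSHEET_ID, range=SHEET_NAME).execute()
    # Remaining steps: append each chunk, keeping every request small.
    for start in range(0, len(rows), CHUNK_SIZE):
        values_api.append(
            spreadsheetId=SPREADSHEET_ID,
            range=SHEET_NAME,
            valueInputOption="RAW",
            body={"values": rows[start:start + CHUNK_SIZE]},
        ).execute()
```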
Hi, we had a similar issue and tried the above solution. However, we’re having problems integrating it into our workflow.
All we want to do is chunk our table into smaller pieces and then upload it to Google Sheets (into one larger sheet) to get around the payload size limitation.
Do you authenticate via API key or via regular auth? I lost track of this issue but have continued to use Google Docs without any issues for an extended period of time.
Did you try adjusting the number of rows per chunk? If you have a very wide table, that might fix your problem. It can be adjusted in the configuration of the Chunk Loop Start node.
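As a rough rule of thumb, you can derive the rows per chunk from the table width. The cell budget in the sketch below is only an assumed figure to illustrate the arithmetic, not a documented API limit.

```python
# Illustrative only: pick rows-per-chunk from the column count,
# assuming a fixed per-request cell budget.
MAX_CELLS_PER_REQUEST = 50_000  # assumption, tune to what works for your account

def rows_per_chunk(num_columns: int) -> int:
    return max(1, MAX_CELLS_PER_REQUEST // num_columns)

# e.g. a 200-column table -> 250 rows per chunk
print(rows_per_chunk(200))
```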