KNIME R Snippet: "assign failed, request status: data overflow incoming data too big"

I am using KNIME 4.7.1
I have Rserve running and have set the Rserve receiving buffer size limit to 0
My RStudio and R are up to date (as of 4/10/2023)
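For context, the "data overflow" status appears to come from Rserve itself when its input buffer limit is exceeded. Here is a minimal sketch of lifting that limit when starting Rserve by hand, assuming KNIME is pointed at a manually started Rserve instance (maxinbuf and maxsendbuf are Rserve configuration options; whether 0 disables the limit may depend on the Rserve version):

```r
# Sketch: start Rserve with its buffer limits raised.
# maxinbuf / maxsendbuf are specified in kB; 0 is commonly used
# for "no limit" (check the docs of your Rserve version).
# Assumption: KNIME is configured to connect to this instance
# rather than launching its own Rserve process.
library(Rserve)
Rserve(args = "--RS-set maxinbuf=0 --RS-set maxsendbuf=0")
```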

I have some intricate validation rules to implement, and they are easier to express in code than in KNIME nodes, so I built an R Snippet to encapsulate them.

But when I push a large data set (1,141,828 rows / 71 columns) into the R Snippet, it gets to 50% loaded and then fails with the error:

Execute failed: Failed to transfer data to R
assign failed, request status: data overflow incoming data too big

Are there any other levers to help make the transfer from KNIME to R work?
If I rewrote the logic from an R Snippet to a Python Snippet, would that help?

Thank you for any thoughts/help!

@RVC2023 one way to get data to and from R is to use Parquet files. You can transfer the data in one go, or in chunks, which is especially useful for large data sets.

The large file can be split into smaller parts. If the parts share the same structure, you can write them in a loop and later import them back as one file (see the R sketch below):

[Screenshot: example KNIME workflow writing Parquet files in a loop and reading them back]
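A minimal R sketch of the chunked round trip, assuming the arrow package; big_df, the chunk size, and the chunks/ folder are placeholders for illustration (inside a KNIME workflow you would typically use the Parquet Writer and Parquet Reader nodes in a chunk loop instead):

```r
library(arrow)

# Write the large table in same-structured chunks.
# 'big_df' and the chunk size are placeholders for illustration.
chunk_size <- 100000
starts <- seq(1, nrow(big_df), by = chunk_size)
dir.create("chunks", showWarnings = FALSE)

for (i in seq_along(starts)) {
  rows <- starts[i]:min(starts[i] + chunk_size - 1, nrow(big_df))
  write_parquet(big_df[rows, ], sprintf("chunks/part-%03d.parquet", i))
}

# Read the chunks back and bind them into one data frame; this works
# because every part shares the same column structure.
files <- list.files("chunks", pattern = "\\.parquet$", full.names = TRUE)
combined <- do.call(rbind, lapply(files, read_parquet))
```

The appeal of this route is that only file paths have to cross the KNIME/Rserve boundary, so the per-transfer payload stays small no matter how large the table is.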
