R Script - large table fails.

I'm getting the error: "Execute failed: Could not detach connection to R".

The R Snippet is just doing rank correlation on a quite large (50 x 20k) table.
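
For reference, the snippet body is essentially just the following (a simplified sketch; knime.in and knime.out are the R Snippet's standard input/output data frames, and the real script differs only in details):

    # Spearman (rank) correlation across all ~20k input columns
    m <- as.matrix(knime.in)                                  # 50 rows x 20,000 columns
    knime.out <- as.data.frame(cor(m, method = "spearman"))   # 20,000 x 20,000 output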

n.b. I found the built-in rank correlation node very slow.

The same script & data worked fine in RStudio.

(A less demanding, small 50 x 50 table worked fine.)

Is there a size limitation for R Integration? (the output table would be 400 million elements, ~ 2.7 GB)

Cheers,

Steve.

Hi Steve,

KNIME uses the Rserve package to interface with R. By default, the buffer size limit is set to 256 MB. If you go to File -> Preferences -> KNIME -> R, you can set this value to 0 to disable the limitation. This should help.
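
(For completeness: if you ever run Rserve yourself rather than letting KNIME start it, the same limits can be raised in Rserve's own configuration file, typically /etc/Rserv.conf. As far as I know the relevant keys are maxinbuf and maxsendbuf, given in kB, with 0 meaning unlimited; a minimal sketch:)

    maxinbuf 0
    maxsendbuf 0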

Cheers,

Roland

I had already changed the R buffer size limit to '0'.

BTW, my (Win10) PC has 64 GB of RAM with -Xmx32g set in knime.ini, and I'm not hitting the physical memory limit according to Task Manager.


Hi SOH979,

The memory limit for R is independent of the memory limit specified for the Java runtime of the KNIME Analytics Platform. In general, there should be no limit there, though.

You can try using the most recent version of KNIME, which has some options to improve the performance of sending and receiving data. Most importantly, we now send the data in chunks rather than as one huge block. That may help especially in cases like yours, where a single huge transfer possibly hits some limit within Rserve and causes it to crash.
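
In the meantime, a possible workaround inside the snippet itself is to rank the data once and then fill the correlation matrix in column blocks, so no single cor() call has to produce the whole 20,000 x 20,000 result at once. A rough sketch (the block size is just a placeholder to tune):

    m <- as.matrix(knime.in)
    r <- apply(m, 2, rank)                 # Spearman = Pearson correlation on ranks
    n <- ncol(r)
    out <- matrix(NA_real_, n, n, dimnames = list(colnames(m), colnames(m)))
    block <- 2000                          # placeholder block size
    for (i in seq(1, n, by = block)) {
      cols <- i:min(i + block - 1, n)
      out[, cols] <- cor(r, r[, cols])     # fill the result block by block
    }
    knime.out <- as.data.frame(out)

The full result still has to travel back to KNIME, of course, so the chunked transfer in the new version remains the main fix.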

Please let me know if it works with that.

Kind regards, Jonathan.

I have not noticed the issue since the latest update, though I'm not sure whether that was due to your optimizations or simply because I re-installed from scratch. [Just using 'update' failed to work.]

Cheers,

Steve.
