I have a problem with the R Snippet node: my table contains 2 columns and ~850,000 rows.
The error is: "Could not detach connection to R, could leak objects to other workspaces."
I think the problem may be the size of the table, because when I try with a sample of the table, it works.
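For reference, here is a minimal sketch of that size test, assuming the standard knime.in / knime.out variables of the R Snippet node (the 50,000-row cap is an arbitrary choice):

```r
# Pass through only a random sample of the input table to check
# whether the failure depends on table size.
set.seed(42)
n <- min(nrow(knime.in), 50000)  # arbitrary cap for the test
knime.out <- knime.in[sample(nrow(knime.in), n), ]
```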
Try changing the Rserve receiving buffer size limit in Preferences -> KNIME -> R. It's set to 256 MB by default, but you can set it to zero to remove the limit.
When installing Rserve from source (R CMD INSTALL), I get:

```
Warning in strptime(xx, f, tz = tz) :
  unable to identify current timezone 'C': please set environment variable 'TZ'
Warning in untar2(tarfile, files, list, exdir, restore_times) :
  failed to copy 'Rserve/src/client' to 'Rserve/clients'
* installing *source* package 'Rserve' ...
Warning in system("sh ./configure.win") : 'sh' not found
ERROR: configuration failed for package 'Rserve'
* removing 'E:/R-3.5.1/library/Rserve'
* restoring previous 'E:/R-3.5.1/library/Rserve'

The downloaded source packages are in
  'C:\Users\zeineb.aljene\AppData\Local\Temp\RtmpyUy2AB\downloaded_packages'
Warning message:
In install.packages("Rserve", , "http://rforge.net/", type = "source") :
  installation of package 'Rserve' had non-zero exit status
```
You could indeed try setting the timezone; I remember that helped me at some point.
Another option could be to install the package via RStudio. Sometimes that also helped.
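A minimal sketch of both suggestions, to run in a plain R console (e.g. RStudio). The timezone value is just an example, and the binary install assumes rforge.net offers a prebuilt Rserve for your R version; otherwise building from source on Windows requires Rtools, which provides the missing 'sh':

```r
# Set TZ to silence the strptime warning (pick your own timezone).
Sys.setenv(TZ = "Europe/Berlin")

# On Windows, prefer a prebuilt binary over building from source.
install.packages("Rserve", repos = "http://rforge.net/", type = "binary")
```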
KNIME version: 3.6
R version: 3.5.1, with an updated Rserve
In the KNIME preferences I've set the buffer size limit to zero.
I am attaching an image of the process.
I have seen such errors on servers where the same R package library was used by different users at the same time. You could check that and make sure the current KNIME installation is the only one using this R folder, at least for the duration of the workflow execution. You could also try to run your R jobs one after the other rather than in parallel.
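To verify which library folder your KNIME instance is actually using, a quick check from inside an R Snippet (assuming the standard knime.out output variable) might look like:

```r
# Emit the R library search path as a one-column table, so you can
# confirm no other installation shares this library folder.
knime.out <- data.frame(lib_path = .libPaths(), stringsAsFactors = FALSE)
```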
I produced the same error message:
"Execute failed: Could not detach connection to R, could leak objects to other workspaces."
Setup: MRO 3.5.0 and 3.5.3, as well as ordinary R 3.5.0; the buffer size limit was set to unlimited; KNIME version 3.7.1.
I was processing a string data frame with high dimensions (~41 thousand rows, 34 columns). A simple R snippet doing almost nothing, just passing my df to the output, failed.
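For completeness, the pass-through snippet in question is just the trivial assignment, using the R Snippet node's standard variables:

```r
# Minimal R Snippet body: forward the input table unchanged.
knime.out <- knime.in
```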
I imputed missing values, which worked once but was not reproducible, sorry.
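The exact imputation used isn't shown above; a sketch for a string data frame might replace NAs with a placeholder, e.g.:

```r
# Replace NA entries with an explicit "missing" placeholder,
# column by column (the placeholder value is an arbitrary choice).
knime.out <- as.data.frame(
  lapply(knime.in, function(col) replace(col, is.na(col), "missing")),
  stringsAsFactors = FALSE
)
```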
I then reduced the size of the data with row filtering and set "keep all in memory". This leads to varying results (sometimes errors, sometimes not).
Lowering the row count increases the chance that no error occurs, BUT the behavior is not reproducible. Some memory leak, I guess.