R Snippet/Predictor error:

Hello everyone,

I have a problem with R nodes from time to time: I'd say about 10% of the time I run an R Snippet, Predictor, or similar node, I get this error:

ERROR R Predictor 0:1846:1651 Execute failed: R evaluation failed.: "sink();sink(type='message')
knime.output.ret<-c(paste(knime.stdout,collapse='\n'), paste(knime.stderr,collapse='\n'))

Unfortunately I have no idea what it means. Does anyone of you maybe know?

Thank you in advance!
Best regards

On a side note: after uninstalling some extensions, KNIME 4.0 no longer randomly crashes while executing things.

These questions come to my mind:

  • which R nodes do you use? The ones from KNIME or from the community? (They differ somewhat in their usage of Rserve.)
  • which version of Rserve do you use? *1) Version 1.8.6 or higher is recommended.
  • do you have any strange formats or fields in the data you want to export, and is the R object you want to get out a data frame (not a list or something else)?

And could you provide us with a minimal example that reproduces the error, so we can have a look?

*1) You could check this in an R node:

knime.out <- as.data.frame(installed.packages())
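If you only want the Rserve version, a minimal variant of the snippet above could filter the package table (a sketch: `Package` and `Version` are standard columns returned by base R's `installed.packages()`, and `knime.out` is the R Snippet node's output variable):

```r
# Turn the installed-package matrix into a data frame and keep only
# the Rserve row, so the node outputs the package name and its version.
pkgs <- as.data.frame(installed.packages(), stringsAsFactors = FALSE)
knime.out <- pkgs[pkgs$Package == "Rserve", c("Package", "Version")]
```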

  • I use the R nodes from KNIME.

  • Rserve version 1.8-6 (and R 3.6.0)

  • I don't have any strange data fields, and yes, the outputs are data frames (the inputs come straight from KNIME nodes).

Reproducing it is not so easy. As I said, the error only happens sometimes; most of the time it does not. It happens with R Predictors as well as R Snippets, and in different workflows with different data.
If I get this error, I just execute the node again, and then it usually works right away. There are a few other errors that behave exactly like that.

I just thought anyone might know why.


we are looking into this.

My best guess is that the Rserve connection is closed, although it is still required.

Is it true that you have more than one R node in your workflow? If so, are they executed in parallel or sequentially?


Hello Mark,

there are indeed a few R Snippets/Predictors in the workflow, and when the error happened they were running in parallel.

My colleague has a different error though, maybe because of his slow laptop: he can't execute some R Snippets from my workflows because they have to calculate a lot (no problems with software settings). His workaround is to wrap the R Snippet node in a Parallel Chunk loop, which works.

I don't really understand what the problems with R are. I admit the data frames I work with can have up to a few hundred million entries, but still, that's not that big, especially when the R Snippets only perform some function on a single column.
But any kind of loop is way faster in R than in KNIME, and also way easier to code.
I do not have an R server available; it is all done with a local installation of R. I am not sure yet how we will handle this with our upcoming KNIME Server. Worst case, I have to replace all R nodes if workflows can't be executed with a 100% success rate.
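For context, a single-column transformation in an R Snippet node is typically as small as this sketch (the column name `value` and the `log1p` transform are hypothetical; `knime.in` and `knime.out` are the node's standard input and output data frames):

```r
# Hypothetical single-column transform inside a KNIME R Snippet node.
knime.out <- knime.in                      # start from the input table
knime.out$value <- log1p(knime.in$value)   # 'value' is an assumed column name
```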

many thanks in advance!

Correction: the R Predictors were actually running in parallel.
I have since changed them to run sequentially.


what do you mean by

they have to calculate a lot

the size of the input data or the complexity of your script/algorithm?

If it's the size of the input data, maybe your colleague could try setting the Rserve receiving buffer size limit to 0 (Preferences -> KNIME -> R). Maybe that helps? Your colleague could also check whether they run out of memory.

Regarding your execution problem: we are currently looking into it. Our current guess, which still needs to be verified, is that it is a race condition. If that is true, executing the R nodes strictly sequentially should avoid the problem (though this will increase the running time of your workflow). You can give it a try by connecting the R nodes via flow variable connections to enforce sequential execution.

Thanks for reporting the problem; I'll get back to you once we have narrowed its origin down. Please excuse the inconvenience.


they have to calculate a lot

the size of the input data or the complexity of your script/algorithm?

I mean the amount of input data.

If it's the size of the input data, maybe your colleague could try setting the Rserve receiving buffer size limit to 0 (Preferences -> KNIME -> R). Maybe that helps?

Already done that; that's what I meant by "no problems with software settings".

Also, whenever possible, I already force the R nodes to run sequentially with flow variable connections.

There are also some other errors regarding R nodes that happen sometimes. I will add them here later, when they happen again. Something about leaking out of the workflow; I can't remember the exact wording.

Thank you for your support so far!


It would be awesome if you could add all the other R problems you're facing!



the problem you are describing in your first post should be fixed with the upcoming bugfix release! Parallel execution etc. should also not be a problem :slight_smile:.



Hello Mark,

thank you very much for the update and the upcoming fix!