R Statistics and R Script nodes are running extremely slowly in loops

Hi all,


I am using KNIME 2.3.3 (latest) with the latest R Snippet node, connected to my local R installation (v2.12.1). The snippet sits between a Counting Loop Start node and a Loop End node. The loop is set to run 20 times, and the R snippet contains a single line of code like the following:

R <- "whatever (any text)";


I have some other nodes after the R Snippet node. Without the R node, the loop finishes in a couple of seconds; with it, each iteration takes at least 2 to 3 seconds, so the 20 iterations take 40 to 60 seconds in total. Am I the only user seeing this slowness with R, or is it normal? I don't think my PC is the problem; I tried another PC and saw the same issue. I am running Windows 7 32-bit, 2.5 GB DDR2 RAM, on a 1.8 GHz dual-core AMD.


Is there any way to speed this up?


Thanks in advance.

I don't have that problem. I've worked with loops and R snippets multiple times, but never had that problem.

Also, please don't cross-post your problem across this forum. Posting it once is surely enough.

As far as I know, the node jumps to 50% quickly; everything you wait for after that happens in R. How long does it take if you run the same code directly in R?


Or try running your algorithm on less data! 2.5 GB of RAM, hmm, could be more =D

You could try a fresh R installation. How is the performance if you run R directly, without KNIME?

You said up to 60 seconds? ^^ Well, some of my workflows run for hours! Is your use case really that time-critical?
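If you want to check whether R itself is the slow part, here is a quick sketch (plain base R, run in an R console outside KNIME; `system.time` is base R, and the loop mirrors your 20 iterations):

```r
# Time the same one-liner 20 times directly in R, outside KNIME.
# If this finishes in well under a second, the delay is in the
# KNIME <-> R hand-off, not in R itself.
total <- system.time({
  for (i in 1:20) {
    R <- "whatever (any text)"
  }
})["elapsed"]
print(total)
```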


Hi kmelis,

Thanks for the reply. I noticed the R section wasn't very active, so I thought users from other sections might have some ideas about this issue. Sorry about that again.
Would you mind sharing the versions of KNIME and R you are using?


Hi AI,
Thanks for the reply. I have some huge workflows like yours that run for hours, and I found that the R snippets are the bottleneck of my workflow. The R snippet always starts at 48% and then "thinks" for 2-3 seconds (I can see it in the progress bar right below the R Snippet icon). So the R snippet costs me 2-3 seconds per iteration, and that's why my workflow takes hours to run.
I ran the simple test you mentioned: I just passed a simple string to R (R <- "string"), and that alone causes the 2-3 second delay on each iteration of the R Snippet node.
As for the data size, I tested the R snippet with very small data, a 20-row by 1-column matrix, so that should rule out the data being too big.
I have implemented the same workflow in R itself, and it runs much, much quicker. That's another reason I think it may be a bug.
As for RAM, I have never seen it exhausted in Task Manager; mine is not a memory-intensive workflow, at least for the tiny test I was running. So I am sure 2.5 GB is plenty for my workflow.
Would you mind posting your versions of KNIME and R, so I can try that combination?
Also, would you mind creating a simple counting loop with an R snippet inside and checking whether the snippet stalls the loop for 2-3 seconds on each iteration? As I understand it, your R snippet (with the single-line R script) should take less than a second per iteration with your setup, correct? (Mine takes 2-3 seconds.) I even upgraded my PC from a single core to a dual core, and the same issue remains. I also tried a Pentium Dual Core PC with the same software versions and saw the same issue.
The R snippet bottleneck makes my workflow run at least 2 to 3 times longer than it should, which is very annoying.
Any suggestions?

Sorry, no time right now to create these example workflows.

Maybe later...

KNIME has to connect to R as a separate program living in the KNIME plugins directory. That means the R code is interpreted by R, not by KNIME, and the results are then sent back to KNIME as a .csv file (I think so)...

That takes its time.

When I use local outlier detection in R, it takes a few minutes to compute the outliers for a large amount of data.
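Because that start-up and transfer cost is paid on every call, one workaround (just a sketch; it assumes your per-iteration logic can be expressed in R, and the loop bound of 20 matches your workflow) is to move the loop inside a single R snippet, so R is invoked only once:

```r
# One R snippet instead of 20 KNIME loop iterations:
# the start-up / data-transfer overhead is paid once, not 20 times.
results <- character(20)
for (i in seq_len(20)) {
  results[i] <- paste("iteration", i)  # placeholder for the real per-iteration work
}
R <- results  # hand everything back to KNIME in one transfer
```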

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.