Execute Python/R Integration nodes in Google Cloud

I’m a total noob when it comes to the inner workings of cloud compute platforms. I’m just beginning to read articles on this, and I’m curious whether the case below is possible.

I have workflows that implement a typical ML pipeline: Load-and-prep data, train-test split, train model, test model, etc.

It’s a mixture of built-in KNIME nodes and Python/R integration nodes (learners, predictors, views, etc.).

Would it be possible for the Python/R integration nodes to be executed on the Google Cloud Platform, with the results of those nodes returned to the workflow so it can continue?

(Hopefully the workarounds to make this happen aren’t so complex that I have to redo my workflows all over again) (>___<)

Thanks!

Hi @codvknime,
Unfortunately there is not one knob you can turn to execute the Python parts of your workflow in the cloud. The Python and R integration nodes can only communicate with a locally running Python or R. Depending on the data size it might also not be very beneficial to run the code in the cloud, as all the data had to be sent there and then retrieved back once the computation is finished. And that for every Python/R node in your workflow. What you could do is this: deploy your Python code as Google Cloud Function and call it via POST Request node (HTTP Functions). I doubt that this will be very cost-effective, though.
Kind regards,
Alexander

