Conda Environment Propagation with team plan execution resource

Hi, I was trying to use a custom Python environment in an execution resource I created with my team plan. The team plan FAQ says the following. I understand that the Python environment will have to be created on each execution; the question is how I create these custom environments in my execution resource. Can I change the “pythonpath” so the Conda Environment Propagation node recognizes a cloud location?

FAQ - Team Plan:

Can I use the Python integration? Will there be conflicts with Python virtual environments or conda environments I use in the KNIME Analytics Platform?

Yes, the Python integration is available. The only Python environment available by default on the KNIME Community Hub is the bundled Python environment of the Python integration. If you need a custom environment, use the Conda Environment Propagation node to make sure it gets set up when your workflow runs. Please note that Python environments configured by the Conda Environment Propagation node will be created each time the execution context starts up.
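For context, here is roughly the kind of Python Script node code I’d want to run with such a custom environment (a minimal sketch; the imported package openpyxl is just a placeholder for whatever the bundled environment is missing, and knime.scripting.io is the current Python Script node API):

```python
# Sketch of a Python Script node body that depends on a custom conda environment.
# "openpyxl" stands in for whatever package is missing from the bundled environment.
import sys

import knime.scripting.io as knio  # KNIME Python Script node API
import openpyxl  # only importable if the custom environment provides it

# Pass the input table through, recording which interpreter actually ran the script.
df = knio.input_tables[0].to_pandas()
df["python_executable"] = sys.executable
knio.output_tables[0] = knio.Table.from_pandas(df)
```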

Hey there, welcome to the forum :-).

Can you share a screenshot of your current setup? In general I think it should work the same way as sharing a workflow that has a Conda Environment Propagation node in it: before you upload, select the properly set up environment in the node, connect the variable port to your Python Script node, and set the environment of the Python Script node via flow variable.

I think what the documentation says is that, compared to using the Conda Environment Propagation node locally, where the environment is set up when the node is first executed and is then “available” permanently, on KNIME Hub it will not be “stored permanently”; instead the environment is created each time the workflow is executed. That can obviously slow things down, depending on the environment…
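One thing that can help when testing this on the Hub is to verify at runtime which interpreter the Python Script node actually got. A small sanity-check sketch (nothing here is KNIME-specific beyond the scripting API; note that CONDA_DEFAULT_ENV may not be set depending on how the executor launches Python, so sys.executable is the more reliable indicator):

```python
# Report which interpreter / conda environment the Python Script node runs in,
# to confirm that the Conda Environment Propagation flow variable took effect.
import os
import sys

import pandas as pd
import knime.scripting.io as knio  # KNIME Python Script node API

info = pd.DataFrame(
    {
        "python_executable": [sys.executable],
        "conda_env": [os.environ.get("CONDA_DEFAULT_ENV", "<not set>")],
        "python_version": [sys.version.split()[0]],
    }
)
knio.output_tables[0] = knio.Table.from_pandas(info)
```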

Thank you for your assistance. However, the Conda Environment Propagation node only recognizes the environments installed in the execution resource. Although I can connect an Azure Blob containing the Python environments to the workflow, the Conda Environment Propagation node still does not detect them unless I change the “pythonpath” of the workspace. Are there any options for doing this?

Hi @rcortesb,

what @MartinDDDD was implying is that you need to set up the conda environment for your Python script locally, meaning with a KNIME Analytics Platform installed on your machine. The Conda Environment Propagation node will extract a list of all packages installed in your local environment (use the “Include explicitly installed only” button for cross-platform compatibility). Then, when you upload the workflow and schedule it to run in your CHub team plan, a new conda environment will be created on the CHub executor from this package list. Maybe this blog post can shed some light on how to use the node?
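If you want the workflow to fail early and clearly when the environment recreated on the executor is incomplete, you can put a small check like this at the top of your Python Script node (just a sketch; the package names are examples, so mirror whatever you selected in the Conda Environment Propagation node):

```python
# Fail fast if the environment recreated on the Hub executor is missing packages.
# The names below are examples -- mirror the list from your
# Conda Environment Propagation node.
from importlib.metadata import PackageNotFoundError, version

REQUIRED = ["pandas", "openpyxl", "xgboost"]

missing = []
for pkg in REQUIRED:
    try:
        print(f"{pkg}=={version(pkg)}")  # log the resolved version
    except PackageNotFoundError:
        missing.append(pkg)

if missing:
    raise RuntimeError("Propagated environment is missing: " + ", ".join(missing))
```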

Does that clarify how it’s supposed to work?

Best,
Carsten

PS: I am not quite sure how you wanted to use the Azure Blob here… but if you were trying to mount a folder containing a conda environment, that approach wouldn’t need the Conda Environment Propagation node.

Thanks, Carsten. I didn’t know you could install the Python libraries using the Conda Environment Propagation node.
I have a different question: after the workflow execution finished, I noticed that the Execution Context didn’t automatically stop, even after more than 10 minutes of inactivity. Is there an option to define the inactivity time after which the Execution Context shuts down on a KNIME team plan? Alternatively, is there a node that can shut down the Execution Context?

All good with the automatic shutdown. I found out that if you keep the “Inspect” window open, the execution context keeps running. It takes about a minute to shut down automatically after inactivity.

Thanks!

Hi @rcortesb,

As you noticed, the Execution Context automatically shuts down once it has not been used for over a minute. An Execution Context is still in use while an “Inspect” window is open.

Did you have an “Inspect” window open? If the workflow just runs and finishes, the Execution Context should stop after about a minute.

Best,
Simon
