To create an environment, you can follow the steps provided here: KNIME Python Integration Guide
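As a rough illustration of what the guide walks you through, a conda environment can be created from an environment file. The file below is a hypothetical sketch; the exact package list and Python version you need are specified in the KNIME Python Integration Guide, so treat every entry here as a placeholder:

```yaml
# environment.yml — hypothetical example; consult the KNIME Python Integration
# Guide for the actual required packages and versions.
name: py3_knime
dependencies:
  - python=3.6      # version required by your KNIME release
  - pandas          # required by the KNIME Python integration
  - numpy
```

You would then create the environment with `conda env create -f environment.yml`.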
Once you have your environment in place, you need to tell the executor where to find it. You can do this by adjusting the executor's client profile (see the KNIME Server Administration Guide) and adding the following entries to the executor.epf:
/instance/org.knime.python2/condaDirectoryPath=<<path to conda installation>>
/instance/org.knime.python2/python2CondaEnvironmentDirectoryPath=<<path to conda Python 2 environment, something like a\b\c\anaconda3\envs\py2_knime>>
/instance/org.knime.python2/python3CondaEnvironmentDirectoryPath=<<path to conda Python 3 environment, something like a\b\c\anaconda3\envs\py3_knime>>
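A filled-in version of these three entries might look like the fragment below. The installation path and environment names are placeholders for illustration; note that .epf files use Java properties syntax, so colons and backslashes in Windows paths must be escaped:

```
/instance/org.knime.python2/condaDirectoryPath=C\:\\Users\\knime\\anaconda3
/instance/org.knime.python2/python2CondaEnvironmentDirectoryPath=C\:\\Users\\knime\\anaconda3\\envs\\py2_knime
/instance/org.knime.python2/python3CondaEnvironmentDirectoryPath=C\:\\Users\\knime\\anaconda3\\envs\\py3_knime
```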
To check whether the executor successfully picked up these settings after a restart, inspect the combined-preferences.epf file in executor-workspace\.metadata\.plugins\org.knime.product. If the settings appear there, the executor will use them, and you should be able to run workflows with Python Snippet nodes. As a simple test case, run a workflow containing an empty Python Source node.
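The check described above can also be scripted. The sketch below parses .epf content (which is line-based `key=value` text) and reports which of the expected Python keys are missing; the file path and key list are taken from the settings shown earlier, everything else is illustrative:

```python
# Sketch: check whether combined-preferences.epf contains the expected
# Python integration keys. In practice, read the file from
# executor-workspace\.metadata\.plugins\org.knime.product.

REQUIRED_KEYS = [
    "/instance/org.knime.python2/condaDirectoryPath",
    "/instance/org.knime.python2/python2CondaEnvironmentDirectoryPath",
    "/instance/org.knime.python2/python3CondaEnvironmentDirectoryPath",
]

def missing_keys(epf_text: str) -> list:
    """Return the required preference keys not present in the .epf content."""
    # .epf files are Java-properties style: one "key=value" per line.
    present = {line.split("=", 1)[0].strip()
               for line in epf_text.splitlines() if "=" in line}
    return [key for key in REQUIRED_KEYS if key not in present]

# Example with an in-memory snippet instead of the real file:
sample = "/instance/org.knime.python2/condaDirectoryPath=C\\:\\\\anaconda3\n"
print(missing_keys(sample))  # the two environment-path keys are still missing
```

If the returned list is empty, all three settings made it into the combined preferences.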
For Deep Learning, you should follow the steps here: KNIME Deep Learning Integration Installation Guide
Once that is done, add the following settings to the executor.epf:
/instance/org.knime.dl.python/condaDirectoryPath=<<path to conda installation>>
/instance/org.knime.dl.python/condaEnvironmentName=<<conda dl environment name, e.g. py3_knime_dl>>
/instance/org.knime.dl.python/kerasCondaEnvironmentDirectoryPath=<<path to keras conda environment>>
/instance/org.knime.dl.python/librarySelection=<<keras or tf2>>
/instance/org.knime.dl.python/tf2CondaEnvironmentDirectoryPath=<<path to tf2 conda environment>>
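For illustration, a filled-in Deep Learning section of the executor.epf might look like the fragment below. All paths and environment names are hypothetical placeholders (again with Java-properties escaping of colons and backslashes), and `librarySelection` takes either `keras` or `tf2`:

```
/instance/org.knime.dl.python/condaDirectoryPath=C\:\\Users\\knime\\anaconda3
/instance/org.knime.dl.python/condaEnvironmentName=py3_knime_dl
/instance/org.knime.dl.python/kerasCondaEnvironmentDirectoryPath=C\:\\Users\\knime\\anaconda3\\envs\\py3_knime_dl
/instance/org.knime.dl.python/librarySelection=keras
/instance/org.knime.dl.python/tf2CondaEnvironmentDirectoryPath=C\:\\Users\\knime\\anaconda3\\envs\\py3_knime_tf2
```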
General advice for setting preferences on the executor: if you are unsure of the exact setting key to put in the .epf file, export your KNIME Analytics Platform client's preferences (File → Export Preferences…) and use the exported file as a template for the executor.epf.