KNIME Server Python Execution

Hi everyone,

I have a workflow containing Python nodes. It works fine on the Analytics Platform, but when I run it on the server, KNIME Server (Azure) returns the error shown below.

ERROR Python Script 0:76:0:65: Execute failed: Could not start Python kernel. Error during Python installation test: Could not start Python executable at the given location (no_conda_environment_selected\python.exe): Cannot run program “no_conda_environment_selected\python.exe”: CreateProcess error=2, The system cannot find the file specified

I have completed all Python integration steps on KNIME Server. What could be the reason for this error?

Any help would be appreciated,

Best,

-Kerem

Hi,

The error message shows that the KNIME Executor is trying to access the default dummy path, which means your Python environment settings are not being used.

Could you please check whether the settings are defined in the executor.epf file and not, as was usual in older KNIME Server versions (up to 4.10), in the preferences.epf?

You can also use additional client profiles if you link them afterwards in the knime.ini of the Executor.

Documentation: KNIME Server Administration Guide

Hope this helps,
Best,

Michael

Hi @MichaelRespondek,

Can I export the preferences of my local Analytics Platform and import this preferences file into KNIME Server? Will that work?

Best,

Yes, it will. Export the complete preferences of KNIME Analytics Platform and add the Python-related lines to the executor.epf. You only have to change the paths to match the server's Python environment.
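For illustration only: the exact preference keys depend on your KNIME version and on whether you use a Conda or a manual environment, so take the key names from your own exported file rather than from this sketch. On a Linux server with a Conda setup, the Python-related lines in the executor.epf could look roughly like this (paths and environment name are placeholders):

org.knime.python2/pythonEnvironmentType=conda
org.knime.python2/condaDirectoryPath=/home/knime/anaconda3
org.knime.python2/python3CondaEnvironmentDirectoryPath=/home/knime/anaconda3/envs/py3_knime
org.knime.python2/defaultPythonOption=python3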

Best,
Michael

Hi @KKERROXXX,

Several weeks ago I had the same issue.

The solution was to edit the *.epf file as follows:

org.knime.dl.python/pythonEnvironmentType=manual
org.knime.dl.python/manualConfig=/home/knime/anaconda3/envs/knime_dl_keras/bin/python
org.knime.dl.python/tf2ManualConfig=/home/knime/anaconda3/envs/knime_dl_tf2/bin/python

Best,
Sven


How does this work in a distributed environment? I have the server and the executor on separate machines. Do I still need to export the .epf file on the server?

Each executor will have its own knime.ini containing the -profileLocation and -profileList directives; the executors then reach out via REST to obtain those preferences files from the server, which serves them up from the workflow_repository/client-profiles directories.
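As a minimal sketch (the server address and the profile name "python" are placeholders, and the exact REST endpoint may differ by server version), the relevant knime.ini entries look like this, each argument on its own line before -vmargs:

-profileLocation
https://<server-address>/knime/rest/v4/profiles/contents
-profileList
python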

You would, of course, need each executor machine to have the relevant software (Anaconda, Python) installed in the same location, so that the directory pointed to by the preferences files can be found by the executor.


Thanks, that helped.

Is it also possible to have different preferences for each executor in a distributed environment? Say I have 5 executors and one server: would it be possible to have 5 different client profiles? If yes, then how? And how would each executor know which profile to pick?

The documentation [1] on client profiles more or less answers that, but it may be a little confusing if you don’t already understand the system.

The TL;DR is that you create a setup in which (see the example layout after this list):

  1. each preference collection lives in its own subdirectory under workflow_repository/config/client-profiles;
  2. each subdirectory contains a .epf file with the same name as the directory, plus any necessary ancillary files such as jar files;
  3. each subdirectory’s .epf file contains the preference directives for the executor.
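For example, a layout with two hypothetical profiles named python and databases (the profile names and the jar file are just placeholders) might look like:

workflow_repository/config/client-profiles/
    python/
        python.epf
    databases/
        databases.epf
        my-db-driver.jar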

On executor start, the -profileLocation and -profileList directives in knime.ini tell it where to get preferences and which profiles to pull; it then combines them into a single combined-preferences.epf and starts with that combined preference set, which also covers any JDBC drivers, DB connections, etc. that those profiles add.

So you can have as many separate profiles under client-profiles as you want, and you can tell each executor either to pull the same ones (if you want them to be homogeneous) or different ones (if you’re using them for different purposes, for example with workflow pinning and executor groups for different departments).
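As a sketch (the profile names are hypothetical), two executors could simply request different profiles in their respective knime.ini files while pointing at the same -profileLocation:

Executor A knime.ini:
-profileList
base,python-team-a

Executor B knime.ini:
-profileList
base,python-team-b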

[1] KNIME Server Administration Guide
