I’ve set up KNIME Server on Ubuntu 20.04 and gone through the Python integration.
While everything seems to work, I’ve noticed that by default the working directory is the root folder.
At first this wasn’t an issue, but when running a script that uses Selenium I started getting permission errors: Python tries to write geckodriver.log to the root folder, to which the user has no write permission.
The issue can be resolved by changing the working directory.
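For now I work around it at the top of each affected script, for example like this (using the system temp directory as a stand-in for any writable location):

```python
import os
import tempfile

# Switch to a directory the KNIME user can actually write to, so files
# like geckodriver.log land there instead of "/". Any writable path
# (e.g. "/tmp" or a project folder) would work the same way.
workdir = tempfile.gettempdir()
os.chdir(workdir)
print(os.getcwd())
```

But doing this in every script is tedious, which is why I’d prefer a proper default.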
I’ve looked into changing the default working directory, but it is set automatically when Python starts, to the directory it was started from.
Is there a way to fix this, maybe from the preferences? Or is there something I should have done during the integration?
As a note, both KNIME Server and Anaconda are located in the user’s filesystem, not under root.
Thanks in advance,
In the upcoming version 4.3 of the KNIME Analytics Platform, the default working directory of each Python node will be set to the directory of the workflow it is contained in. I hope this already solves your problem.
If not: there is no specific Preferences entry or anything along those lines to customize the default directory. You could point KNIME to a start script that launches the Python executable with a custom working directory, though.
Yes, that would solve everything. Do you know of an ETA for version 4.3?
As for the temporary solution, could you point me to a guide or something similar on how to set it up, please?
Cool! Yes, 4.3 is scheduled for early December, so not too far from now, fortunately.
The workaround could be set up similarly to what is described under Option 2: Manual in our Python installation guide.
You would just need to slightly modify the shell script listed there like this (assuming you are using Conda environments):
#!/bin/bash
# Start by making sure that the anaconda folder is on the PATH
# so that the conda activate command works.
# This isn't necessary if you already know that
# the anaconda bin dir is on the PATH.
export PATH="<path-to-your-anaconda-installation>/bin:$PATH"
conda activate py3_knime
cd <desired-working-directory>
python "$@" 1>&1 2>&2
(The change is in the second-to-last line. You may also need to change the name of the Conda environment used in the line before.)
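Once KNIME is pointed at the start script, you can verify that it took effect by running a trivial snippet in any Python Script node:

```python
import os

# Print the Python process's current working directory; with the
# modified start script in place, this should be the directory from
# the "cd" line, not "/".
print(os.getcwd())
```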
One final thing: could this, or something similar, be happening with the Send to Tableau Server node as well? I also get this error:
Execute failed: Failed to start a new Hyper instance. Context: 0x86a93465 CAUSED BY: The Hyper server process exited during startup with exit code: 1 Command-line: “/home/analytics/Programs/knime_executor/plugins/org.knime.ext.tableau.hyperapi.bin.linux.x86_64_0.0.11074.v202006260728/tableauhyperapi-linux-x86_64-0.0.11074/lib/hyper/hyperd --database=/tmp/hyper_db_D67VnjMH --date-style=MDY --date-style-lenient=false --init=overwrite --init-user=tableau_internal_user --language=en_US --log-config=file,json,all,hyperd,0 --log-dir=/ --no-password=true --skip-license=true --listen-connection tab.domain:///tmp/domain/auto --callback-connection tab.domain:///tmp/domain/c91e7e4b42f347a4b8301015511c66b8 run” Child process’ stderr: Unable to open log file: open("/hyperd.log", 66): Permission denied FileListener in error state after initial rotate() Context: 0x13cead20
Thanks in advance.
Judging only from the error message, I would say this could be a related issue, yes. But I do not have any in-depth knowledge of our Tableau integration. I would suggest opening a dedicated forum post for that; I will reach out to our Tableau experts in the meantime.