knimelogs.txt (85.3 KB)
I’m trying to upload a workflow consisting of two nodes that run Python scripts. The workflow runs fine in the Analytics Platform, but when I upload it to the server and try to run it from the WebPortal, I get the following errors:
Errors loading workflow ‘read&write_elastic’: Status: DataLoadError: read&write_elastic 0 loaded with error during data load
Status: DataLoadError: read&write_elastic 0
Status: DataLoadError: Python Source 0:1
Status: DataLoadError: Unable to load port content for node “Python Source”: Unknown table identifier: container_table_compressed
Status: DataLoadError: State has changed from EXECUTED to CONFIGURED
Status: Error: Python Script (1⇒1) 0:3
Status: Error: Loading model settings failed: Coding issue: No enum constant org.knime.python2.kernel.PythonKernelOptions.PythonVersionOption.python3
Status: DataLoadError: State has changed from CONFIGURED to IDLE
I’m also attaching the logs for the last couple of hours.
I checked whether the server’s executor has the Python node extensions installed; it does, and the path to a Python executable is set, pointing to the same Python version I use on my personal computer. I also tried downloading the workflow from the server into my Analytics Platform and got the same errors; when I looked into the nodes, they were empty.
Something that may be worth noting: my executor and Analytics Platform versions are different. My Analytics Platform is 4.0.0 and my server executor is 3.7.2. I’m not sure if that affects anything.
Any help would be appreciated.