Python KNIMEPY - passing variables


I want to try the KNIMEPY toolkit to run a KNIME workflow (on the server) via an external Python IDE.

First of all, it is really great that someone is putting in a great deal of effort to provide such a crucial KNIME-Python integration.

One thing I have not yet been able to find out is whether there is an option to pass variables from an external Python IDE to the respective KNIME workflow using KNIMEPY.

I found a forum discussion and followed the links there but I was not able to find a solution yet:

Is there any update on this matter?

It would be great if someone could point me to a solution.



P.S.: any other solution to run a KNIME Server workflow via Python would also be great.

Hi Anjo,

as far as we know, that is not possible at the moment.

Best regards


I must admit that I halfway expected this from what I found on GitHub and the respective forum entries. May I ask: is KNIMEPY a toolkit that is “fully KNIME-fledged” and will it be developed further? I mean, the integration for Jupyter Notebooks and potentially other applications is great, and I would really hope for some additional features in the future. If I can upvote this somewhere, please let me know.

Many thanks.

Hi @Anjo,

yes and no :slight_smile: Yes, knimepy is a toolkit provided by KNIME and we will make sure that it continues to work while we are improving the KNIME Analytics Platform. But no, we are currently not planning to add new features to knimepy.

Regarding your specific feature request: it would be difficult to integrate knimepy with each IDE that can run Python code. However, what do you think about this workaround:

  • add another Container Input (Table) node to the workflow that you want to remote control
  • put all parameters to pass to your workflow in a separate pandas DataFrame (e.g. by loading a config.csv file?)
  • use the contents of this table in your workflow as parameters, e.g. by turning the data in the table into KNIME flow variables (“Table Row to Variable” node)
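The three steps above could look roughly like the sketch below. Note this is an illustration, not tested against a real server: the workflow path, column names, and parameter values are made up, and the knimepy call is wrapped in a function so the DataFrame part can be read on its own.

```python
import pandas as pd

# Hypothetical parameters for the workflow, e.g. loaded from a config.csv
# (column names and values are invented for illustration).
params = pd.DataFrame(
    [["2021-01-01", 0.5]],
    columns=["start_date", "threshold"],
)


def run_with_params(workflow_path, params):
    """Send a one-row parameter table to a Container Input (Table) node and run."""
    import knime  # deferred so the sketch is readable without knimepy installed

    with knime.Workflow(workflow_path) as wf:
        wf.data_table_inputs[0] = params  # fills the Container Input (Table)
        wf.execute()
        # Inside the workflow, a "Table Row to Variable" node turns this
        # single row into flow variables.
        return wf.data_table_outputs[0]
```

Calling `run_with_params(...)` with the path of the workflow to remote-control would then execute it with the injected parameter row.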

Hope that helps,


Dear Carsten,

many thanks for your reply. We also thought about using the combination of the Container Input (Table) and “Table Row to Variable” nodes. Being able to use the Container Input (Variable) node via knimepy would just have made the implementation easier, as we could have used the developed workflow (tested in production) as is.

Nevertheless we will go with this approach and see how it works for us.

I would like to ask some associated questions here under this topic, as they go in the same direction:

1. Is there full documentation of the KNIMEPY toolkit available?
The knimepy GitHub (GitHub - knime/knimepy) and the blog are great places to start with some examples, but I could not find complete documentation of the functions available in the toolkit.

2. Defining the input and outputs for multiple data tables:
Is there a way to address a specific input or output container table via the associated uniqueID?
So far there does not seem to be much control when, e.g., using the following statements from GitHub:

with knime.Workflow(r"C:\Users\berthold\knime-workspace\ExploreData01") as wf:
    wf.data_table_inputs[0] = input_table_1
    wf.data_table_inputs[1] = input_table_2
    output_table = wf.data_table_outputs[0]  # output_table will be a pd.DataFrame

I am referring to the answer given by potts in the discussion here: Using KNIME Workflows in Jupyter Notebooks - #6 by potts, especially the part where he mentions the functional APIs (“run_workflow_using_multiple_service_tables()”) and attributes (“data_table_inputs_parameter_names”). Are there any code examples related to this?

3. Similarly, what do the following expressions do?

  • wf.data_table_outputs[:] compared to wf.data_table_outputs[0]
import knime

with knime.Workflow("DemoWorkflow01") as wf:
    results = wf.data_table_outputs[:]
  • wf.data_table_inputs[:] compared to wf.data_table_inputs[0]
import knime
import pandas as pd

input_table_1 = pd.DataFrame([["blau", -273.15], ["gelb", 100.0]], columns=["color", "temp"])

# Requires a valid user account on a running KNIME Server instance.
with knime.Workflow(
) as wf:
    wf.data_table_inputs[:] = [input_table_1]
    wf.execute(reset=True, timeout_ms=10000)  # Default timeout is usually plenty.
    output_table = wf.data_table_outputs[0]
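For intuition: knimepy’s input/output collections appear to behave like ordinary Python sequences, so (assuming standard sequence semantics, not knimepy-specific documentation) `[:]` addresses all tables at once while `[0]` addresses only the first, and slice assignment on the input side replaces the whole set of inputs. Plain Python lists illustrate the difference:

```python
# Plain-Python analogy for the knimepy indexing in the snippets above.
tables = ["table_a", "table_b"]

all_outputs = tables[:]   # a copy containing every element
first_output = tables[0]  # just the first element

inputs = [None, None]
inputs[:] = ["only_table"]  # slice assignment replaces ALL elements at once
# inputs is now ["only_table"]; the list was resized to the new contents
```

If knimepy mirrors this, `wf.data_table_outputs[:]` would return a list of all output tables, and `wf.data_table_inputs[:] = [input_table_1]` would set every declared input in one statement.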

4. I get an error when using the option “live_passthru_stdout_stderr” in a Jupyter notebook.
Referring to potts answer here: Error when using knimepy from jupyter - #2 by potts

Some of the questions above may have been asked under different topics in the KNIME Forum, but I could not find an answer there either.

It would be great if someone has an answer or further information.

Thanks a lot.



A little update:

I played around with the “Container Input (Variable)” node and the KNIMEPY toolkit. It seems you actually are able to pass variables via KNIMEPY. You just have to provide the input as a dictionary and set the template specifications in the Container Input node.

That is great actually!
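A minimal sketch of what this could look like, assuming a Container Input (Variable) node in the workflow. The workflow path, variable names, and values are hypothetical, and whether the dictionary is assigned exactly like this (via `data_table_inputs[0]`) may depend on the knimepy version — treat that as an assumption based on the observation above:

```python
# Hypothetical flow variables to inject; the dictionary keys are assumed to
# match the template specification set in the Container Input (Variable) node.
flow_vars = {"customer_id": 42, "region": "EMEA"}


def run_with_variables(workflow_path, flow_vars):
    """Pass a dict to the workflow's Container Input (Variable) node."""
    import knime  # deferred so the sketch is readable without knimepy installed

    with knime.Workflow(workflow_path) as wf:
        # Assumption: a plain dict as input fills the variable node,
        # as reported in the post above.
        wf.data_table_inputs[0] = flow_vars
        wf.execute()
        return wf.data_table_outputs[0]
```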

Another thing that is quite strange, and that I cannot find a solution for, is the following: when I place a “String Input” Quickform node in the workflow, I get an error in the Jupyter notebook:

“ERROR:root:failure during remote job execution, status_code=500, text=‘Could not set input on job '95f30ff3-6dfc-49f9-a95c-83474f00ac95': Invalid JSON parameter for node “string-input”, cannot read provided JSON input: Expected JSON object, JSON string or JSON number, but got NULL\n’”

When I replace the Quickform with the String Configuration node, it works.

If anyone has a solution/explanation for this, that would be great (otherwise I would have to replace a great deal of Quickforms in the actual workflow).


