Workflow execution with only partial nodes reset

I have a very large KNIME Table used as a source dataset within a workflow. Loading the table takes a very long time due to its size (c. 7 GB). The table is static: it is very unlikely to change, or will change very infrequently.

Is it possible to save or link workflows so that when executed from the command line, only the analytical components (model reading, scoring, outputs etc.) are reset and the workflow doesn't need to "read" the massive table each time?

Thanks in anticipation of your help,


Hi @steven_preston,
I do not think that is possible, but what you can do easily within KNIME is to export your model to disk with a model writer node.

Then you can create a lean scoring workflow that just reads the saved model back in with the corresponding reader node.



If you execute a workflow with the batch executor, it will take the workflow as saved and only execute the nodes that have not been executed yet (unless you pass -reset, which forces a reset of all nodes). Therefore, as long as you save the workflow in the correct state, everything should work as you intend.
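To illustrate, a minimal sketch of such a batch invocation (the KNIME installation path and workflow directory are placeholders for your own setup):

```shell
# Run a workflow headlessly with the KNIME batch application.
# Because -reset is NOT passed, nodes stay in their saved state:
# if the workflow was saved with the big table reader already
# executed, that reader is not re-run.
./knime -nosplash -consoleLog \
  -application org.knime.product.KNIME_BATCH_APPLICATION \
  -workflowDir="/path/to/MyScoringWorkflow"

# Adding -reset would force every node, including the table
# reader, to re-execute:
# ./knime -nosplash -consoleLog \
#   -application org.knime.product.KNIME_BATCH_APPLICATION \
#   -workflowDir="/path/to/MyScoringWorkflow" -reset
```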