Using multiprocessing within a Python Script node

I was wondering if there would be any "gotchas" in trying to use python multiprocessing within a KNIME Python Script node?

I have a stand-alone script that uses multiprocessing that I'd like to adapt for use within KNIME. Simply placing the code in a Python Script node fails with:

File "D:\Python27\lib\multiprocessing\", line 558, in get
    raise self._value
TypeError: expected string or Unicode object, NoneType found

One would normally protect the body of the script that initialises the data and sets up the multiprocessing pool with

if __name__ == '__main__':

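For reference, the standalone pattern looks something like this (a minimal sketch, not my actual script; `square` and `run_parallel` are just placeholder names):

```python
from multiprocessing import Pool

def square(x):
    # worker function; must be picklable/importable by the child processes
    return x * x

def run_parallel(values):
    pool = Pool(processes=2)
    try:
        return pool.map(square, values)
    finally:
        pool.close()
        pool.join()

if __name__ == '__main__':
    # guard so child processes don't re-execute the pool setup on import
    print(run_parallel(range(5)))
```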
I did change this to

if __name__ == '__builtin__':

which seems to be the correct(?) module name.

Any ideas?


I don't see why there should be any gotchas. Each node runs its own Python process. In theory you should be able to do anything you can do in the interpreter your KNIME Python integration points to (see Preferences).

However, depending on the type of your data, you might also be able to distribute it across several Python nodes. In that case each "job" from a node would get its own Python interpreter process.

Not having the script, this is all I can tell you for now.


I ran into this issue just now. Multiprocessing does not seem to work from within the KNIME Python node. In my case the behaviour with:

if __name__ == '__builtin__':

is identical to omitting it completely, which means the node never executes and all Python processes sit at 0% CPU usage.

If I use

if __name__ == '__main__':

in the standalone script, it works fine, but that doesn't work from KNIME.
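One workaround that might be worth trying (a sketch only; I have not verified it inside KNIME): multiprocessing child processes need to re-import the module that defines the worker function, and the body of a KNIME Python Script node is not an importable module, which could explain the hang. Putting the worker in a real module on `sys.path` sidesteps that. The module name `knime_workers` is made up, and here the module file is even written out at runtime just to keep the sketch self-contained; in practice you would keep it as a normal `.py` file next to your workflow:

```python
import os
import sys
import tempfile
import multiprocessing

# Hypothetical helper module containing the worker function. In a real
# setup this would be a .py file you maintain on disk, not generated code.
worker_src = "def heavy(x):\n    return x * x\n"
moddir = tempfile.mkdtemp()
with open(os.path.join(moddir, "knime_workers.py"), "w") as f:
    f.write(worker_src)
sys.path.insert(0, moddir)  # children inherit sys.path, so they can import it too

import knime_workers

def run_with_module_worker(values):
    pool = multiprocessing.Pool(processes=2)
    try:
        # the worker is referenced via its module, so child processes
        # can resolve it by import instead of re-running the node script
        return pool.map(knime_workers.heavy, values)
    finally:
        pool.close()
        pool.join()

if __name__ == '__main__':
    # inside a KNIME node you would call run_with_module_worker(...) directly;
    # the guard here is only for running this sketch as a standalone script
    print(run_with_module_worker(range(5)))
```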