“Local file system access by KNIME workflows” – e.g. Python Nodes

I'm using Python to read/write/create files and folders on network shares, e.g.:


Regarding the Server Update Guide: “local file system access is disallowed by default (previously it was allowed).”

I’ve added the following line:


to the executor profile settings file.

But it’s still not working.

Server version: 4.13.4
Executor version: 4.4.2

Any suggestions?
Greetings Tim

Hello Tim,

Did you test this on a fresh installation of the KNIME Server? Was it working before on an older KNIME Server instance?

Please check whether the account configured to run the KNIME Executor service is allowed to access the share. Using the “Local System Account” can restrict or block access to domain resources such as network shares, which is why a domain service account is often used to run the Executor service instead.
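A quick way to check this from within a workflow is a small Python node that reports which account the executor process runs under and what it may do at a given path. This is a minimal sketch using only the standard library; the UNC path in the example call is a placeholder, not part of any KNIME API:

```python
import getpass
import os

def report_access(path):
    """Report who the process runs as and what it may do at `path`."""
    return {
        "user": getpass.getuser(),            # account the executor runs under
        "exists": os.path.exists(path),       # also False when access is denied
        "readable": os.access(path, os.R_OK),
        "writable": os.access(path, os.W_OK),
    }

# Example with a placeholder UNC path:
# print(report_access(r'\\servername\share'))
```

If `user` comes back as the machine account and `exists` is False for a share you know is there, the service account is the likely culprit.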



Hello @MichaelRespondek ,
sorry for the late reply…
it's a new installation; everything works fine on the old server.
If I use a normal List Files node and enter \\servername\folder, I have no access,
but if I use the SMB Connector (user == executor user), it works.
So the user has access to the folder.
But that doesn't help me with the Python nodes…
I don't have a domain service account…

I created a test workflow to create a folder with the “Create Folder Node” on a share without any access restrictions. The owner of this folder is not the executor user but a user called “computername-of-the-knime-server$”…

Hello Tim,

This all indicates that you need a domain service account running the executor in order to access network shares. The folder owner showing your computer name confirms this: the local system/service account is only valid on the machine itself, so on the domain it is authenticated as the computer account, without any domain user rights.

Maybe your IT department would create a KNIME Server service domain account for you to access domain resources; you should ask them. In addition, as you wrote that everything worked on the old server: could it be that the old KNIME Server already used a domain service account? That would explain the difference in access permissions.


Hello @MichaelRespondek
Hmm, I don't know; I need to check tomorrow and ask the admin of the server.
If I create a folder with the old server (Create Directory node), the owner of the folder is the “executor user”…

I am pretty sure that this “executor user” is a domain service account and that it is running at least the old KNIME Server Executor service. As access to the shares worked with this account, I would ask your IT department to switch the “Log on as” option of the Executor service to this user.

Hi @MichaelRespondek,
with the “Log on as” option it works.
But of course this is not a permanent solution; access to the local file system was surely not switched off without reason :slight_smile:
Is it planned that e.g. the Python nodes also get a File System Connector port?

I don’t even know if that’s possible with the connector port.
Anyway, here's a solution directly in Python:

from smb.SMBConnection import SMBConnection
import socket

# Placeholders – replace with your own credentials and hostnames
Username = 'xxxxx'
Password = 'xxxxx'
RemoteHostname = 'servername'

# local file to upload
file = r'c:\temp\test.txt'
fileobj = open(file, 'rb')

# name of the local machine (used as the SMB client name)
mycomp = socket.gethostname()

conn = SMBConnection(Username, Password, mycomp, RemoteHostname,
                     use_ntlm_v2=True, is_direct_tcp=True)
ip_address = socket.gethostbyname(RemoteHostname)

conn.connect(ip_address, 445)  # port 445 = SMB over direct TCP

share = 'xxxxx'       # share name on the server
path = 'xxxxxxxxxx'   # target path within the share
conn.storeFile(share, path, fileobj)

fileobj.close()
conn.close()
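For comparison, a sketch of the route that needs no pysmb at all: once the executor service runs under a domain account with rights on the share, the UNC path can be opened directly from Python. The UNC path mentioned below is a placeholder; any writable directory behaves the same way:

```python
from pathlib import Path

def write_to_share(directory, name, data):
    # `directory` can be a UNC path like r'\\servername\share\folder'
    # (placeholder) when the process account has rights on the share;
    # any writable local directory works identically.
    target = Path(directory) / name
    target.write_bytes(data)
    return target
```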



Thank you Tim for sharing your solution.
I don't think that a file connector port would work for the Python nodes, as our connector nodes are Java- and KNIME-specific. What you could try is the new Python Script (Labs) node (available in Analytics Platform 4.5.0+). Used in combination with the Apache Arrow/Columnar Backend framework, it should be easy to use a KNIME connector node and work with the fetched data inside the Python node.
Details and an example are linked in the What's new article above.