External SSH Tool Failed

Hi

I’m trying to use the External SSH tool to transfer an input file to a remote Linux machine, run a Perl script on it, and have the output returned to the node.

The connection is good and I manage to transfer the input file, but then the node fails to execute and I get these two messages:

“Job submission failed: No such file”

“Execute failed: No such file”

That happens even though I specify the directories for the input, output, and script execution locations.

I would really appreciate any help that anyone can offer.

I’m attaching the configuration of the node.
Thanks!

Hi @Omri,

KNIME has no control over the execution of the external tool: there is no progress reporting, no failure message, and exceptions are not caught.

So the first step would be to check whether you have the appropriate permissions for all the folders. If that’s the case, the second step would be to check the network traffic with your network administrator. If no problem is detected there either, we can try to reproduce the problem using a VM.
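For the first step, a quick sanity check of the directory permissions might look like the sketch below. The directory name is a placeholder; on your setup you would run the same checks on the remote host (e.g. via `ssh user@host '...'`):

```shell
# Create a stand-in for one of the directories the node uses
# (replace with your real input/output/execution directory).
mkdir -p demo_dir

# A directory must be writable (-w) to receive files and
# traversable (-x) for commands to run inside it.
if [ -w demo_dir ] && [ -x demo_dir ]; then
  echo "demo_dir: permissions ok"
else
  echo "demo_dir: missing write or execute permission"
fi
```

If any of the three locations fails this check, that alone can produce a "No such file" style error from the remote side.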

Best,
Anna

Hey Anna

Thank you so much for the reply.

There shouldn’t be any permission or traffic problems, as this is my own directory, which I use regularly.

In addition, the External SSH tool does manage to transfer the input from my local computer to the remote server; it just fails to run the Perl script.

If you can try to reproduce the problem, I would really appreciate it.

Thanks!

Omri

Hi @Omri,

Could you please verify that the files you’re calling/using inside the script exist and that you have permission to call them?

The KNIME node just relays the response of the system, so you have to check both the correctness of the path to the script and the correctness of the script itself.
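As a concrete check, something like the following could be run on the remote machine. The file name here is an example stand-in for the real script path configured in the node:

```shell
# Create a minimal stand-in Perl script (replace with your real path).
cat > demo_script.pl <<'EOF'
#!/usr/bin/perl
print "ok\n";
EOF

# The node needs the script to exist at the exact configured path
# and to be readable by the SSH user.
if [ -f demo_script.pl ] && [ -r demo_script.pl ]; then
  echo "script found and readable"
fi
```

On the real host you could additionally syntax-check the script with `perl -c /full/path/to/script.pl`, which compiles it without running it.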

Best,
Anna

I realize this is an old thread, but I have seen this same behavior recently so some may find this helpful.

My observation is that the remote output file needs to already exist for the node to run properly. I use this approach often in chemical docking calculations, and if a run fails, the output.sdf file is not created. I noted that the subsequent job in this event always crashes with the same error you mention here. If you create a blank output file (in your case, test1.indexes.csv in the correct folder), you may find that the job will now run. Anna is correct, however, that you do need to have read/write access to all the files.
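In other words, pre-creating an empty output file before the job runs can work around the error. A minimal sketch (the file name is taken from this thread; the remote directory is an example):

```shell
# Pre-create an empty output file so the node's download step finds
# something even if the remote run produced no output.
touch test1.indexes.csv
[ -f test1.indexes.csv ] && echo "placeholder output file exists"

# On the real remote host this would be run over SSH, e.g.:
# ssh user@host 'touch /path/to/output/test1.indexes.csv'
```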

Hope this helps someone.
