I’m trying to use the External SSH Tool node to transfer an input file to a remote Linux machine, run a Perl script on it, and have the output returned to the node.
The connection is good and I manage to transfer the input file, but then the node fails when it tries to execute the script.
KNIME has no control over the execution of the external tool. There will be no progress reporting, no failure message, and exceptions will not be caught.
So the first step would be to check that you have appropriate permissions for all the folders involved. If that’s fine, the second step would be to check the network traffic with your network administrator. If no problem is detected there either, we can try to reproduce the problem in a VM.
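For the permissions check, a minimal Perl sketch like the one below can be run on the remote machine; the directory paths are placeholders, so substitute the folders your workflow actually touches:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical folders -- replace with the directories your workflow uses.
my @dirs = ('/home/user/input', '/home/user/output');

for my $d (@dirs) {
    # On Linux you need execute (x) permission to enter a directory
    # and write (w) permission to create files inside it.
    printf "%s: %s%s%s\n", $d,
        -d $d ? ""          : "not a directory! ",
        -w $d ? "writable " : "NOT writable ",
        -x $d ? "enterable" : "NOT enterable";
}
```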
Could you please verify that the files you’re calling/using inside the script exist and that you have permission to access them?
The KNIME node just relays the response from the system, so you have to check that the path to the script is correct and that the script itself runs correctly.
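A quick way to verify this on the remote side is a small Perl check using file test operators; the paths below are hypothetical examples, so swap in the files your script actually calls:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical paths -- substitute the script and data files you use.
my @paths = ('/home/user/bin/process.pl', '/home/user/input.csv');

for my $p (@paths) {
    my @status;
    push @status, -e $p ? "exists"   : "MISSING";
    push @status, -r $p ? "readable" : "NOT readable";
    push @status, -w $p ? "writable" : "NOT writable";
    print "$p: ", join(', ', @status), "\n";
}
```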
I realize this is an old thread, but I have seen this same behavior recently so some may find this helpful.
My observation is that the remote output file needs to already exist for the node to run properly. I use this approach often in chemical docking calculations, and if a run fails, the output.sdf file is not created. I noted that the subsequent job in that event always crashes with the same error you mention here. If you create a blank output file (in your case test1.indexes.csv, in the correct folder), you may find that the job will now run. Anna is correct, however, that you do need to have read/write access to all the files.
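For example, something like this at the start of the remote run pre-creates an empty output file so the node’s fetch step finds something even when the job fails (the file name is taken from your post; adjust the path to wherever the node expects it):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Output file the KNIME node expects to retrieve (path assumed).
my $out = 'test1.indexes.csv';

# Create an empty file if it does not exist yet.
unless (-e $out) {
    open my $fh, '>', $out or die "Cannot create $out: $!";
    close $fh;
}
```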