The External SSH Tool node can post Spark 1.x commands, which run on JDK 1.8, but it's not able to post Spark 2 commands. I took a look at the log for more details (attached below).
Welcome. The stack trace is odd, given what you say, because it is saying "I can't load this Spark class because its bytecode is for Java 8" (but your post appears to say that you can execute Spark 1 commands precisely because they are Java 8).
That stack trace appears to be generated by a JVM running Java 7 or earlier.
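A quick way to confirm this theory is to read the class-file major version directly from a Spark `.class` file (52 = Java 8, 51 = Java 7; a Java 7 JVM refuses anything above 51). The sketch below writes a stand-in class-file header just to demonstrate the check; on your cluster you would instead extract a real class from the Spark 2 assembly jar (e.g. with `unzip -p <jar> <class>`) and point the same `od` command at it.

```shell
# Stand-in .class header: magic CAFEBABE, minor 0, major 0x34 (= 52, Java 8).
# Replace /tmp/Sample.class with a class extracted from your Spark 2 jars.
printf '\xca\xfe\xba\xbe\x00\x00\x00\x34' > /tmp/Sample.class

# The major version is a big-endian u16 at offset 6; since real versions are
# well below 256, reading the single byte at offset 7 is enough.
MAJOR=$(od -An -tu1 -j7 -N1 /tmp/Sample.class | tr -d ' ')
echo "class-file major version: $MAJOR"   # 52 means compiled for Java 8
```

If this prints 52 for the Spark 2 classes while the failing JVM is Java 7, the "unsupported major.minor version" error is fully explained.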
Thanks for the feedback; we are running 1.8 on our cluster. Also, the command runs fine from the command line, but fails if it is triggered through the SSH tool node. Is it possible that the KNIME node is changing something before posting the command to the cluster?
Thanks for the help! How do I go about verifying this:

"Validate what `java -version` and `env` return in the SSH node (the SSH node might not pick up the default ENV/PATH settings)"

In the External SSH Tool node, how do I find the output?
Enter `java -version > /tmp/my-output.txt 2>&1` (note that `java -version` writes to stderr, hence the `2>&1`) or `env > /tmp/my-output.txt` in the remote command field, and `/tmp/my-output.txt` in the remote output file field. Now run the node and check the output.
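To illustrate why the `2>&1` matters: `java -version` prints to stderr, so a plain stdout redirect leaves the output file empty and the node appears to return nothing. This sketch uses a small stand-in function instead of a real `java` binary so it runs anywhere; the redirection behaviour is the same.

```shell
# Stand-in for `java -version`, which writes its banner to stderr.
java_version() { echo 'openjdk version "1.8.0_292"' >&2; }

# Redirecting only stdout misses the banner entirely:
java_version > /tmp/out-stdout-only.txt 2>/dev/null   # file ends up empty

# Folding stderr into stdout captures it:
java_version > /tmp/out-both.txt 2>&1

wc -c < /tmp/out-stdout-only.txt   # 0 bytes
grep "openjdk" /tmp/out-both.txt   # banner is present
```

The same applies inside the External SSH Tool node's remote command field: without `2>&1`, the remote output file will be empty even though the command ran.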