Cast exception after connecting 'Create Spark Context' node to 'Spark Java Snippet (Source)' node


I'm trying to read a file from a remote server. If I use the 'Spark Java Snippet (Source)' node alone and put the server settings into File > Preferences > Spark, everything works fine: I can read the file into an RDD.

Then I put the server settings into a 'Create Spark Context' node instead, and that node can still connect to the Spark job server just fine. However, after I connected the 'Create Spark Context' node to the 'Spark Java Snippet (Source)' node and executed the flow, I got a cast exception. What did I do wrong? Thanks in advance.

The log:

java.lang.ClassCastException: com.knime.bigdata.spark.port.context.SparkContextPortObject cannot be cast to
	at com.knime.bigdata.spark.node.SparkNodeModel.execute(
	at org.knime.core.node.NodeModel.executeModel(
	at org.knime.core.node.Node.invokeFullyNodeModelExecute(
	at org.knime.core.node.Node.execute(
	at org.knime.core.node.workflow.NativeNodeContainer.performExecuteNode(
	at org.knime.core.node.exec.LocalNodeExecutionJob.mainExecute(
	at org.knime.core.node.workflow.NodeExecutionJob.internalRun(
	at org.knime.core.util.ThreadUtils$RunnableWithContextImpl.runWithContext(
	at org.knime.core.util.ThreadUtils$
	at java.util.concurrent.Executors$
	at org.knime.core.util.ThreadPool$
	at org.knime.core.util.ThreadPool$


You didn't do anything wrong. This is a bug in the Spark Java Snippet (Source) node, and we will fix it in the first bug fix release. Unfortunately, tomorrow's release will still have this problem, since it is too late to fix it there. Because the Spark Context input is optional, you can use the node without any input connection, in which case it falls back to the default Spark context defined in the Spark preferences. I hope this workaround is OK for you until the bug fix release.



I see, thanks for the reply.