[Solved] Number of Spark Contexts

Hi there

I have a problem: when I try to create more than two job servers using the “Create Spark Context (Jobserver)” node, it fails.

Is it not possible to create more than two job servers in KNIME?

Hi @HyojungPark,
What is the problem in detail? Can you provide an error message?
With the “Create Spark Context” node, you do not create a job server but a context in Spark through an existing Jobserver.
In general, it is possible to have several different contexts in Spark.

Best regards, Mareike

There could be several reasons for this message:

  • your admin has restricted the number of Spark contexts you are allowed to create (in that case it should be possible to access an existing one)
  • you are trying to create a new Spark context with the same name as the old one, which does not work

Have you tried giving your second Spark context a different name? And indeed, the precise error message would shed further light on your problem.
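For reference, contexts on a Spark Jobserver can also be listed and created directly through its REST API, which makes it easy to see which names are already taken. A minimal sketch, assuming a hypothetical Jobserver address on the default port 8090 (replace host and context name with your own):

```shell
# Hypothetical Jobserver address -- replace with your own host.
JOBSERVER="http://jobserver-host:8090"

# List the contexts that already exist (context names must be unique):
curl -s -m 5 "$JOBSERVER/contexts" || echo "request failed -- check the address"

# Create a second context under a different, unused name:
curl -s -m 5 -X POST "$JOBSERVER/contexts/knimeSparkContext2" \
  || echo "request failed -- check the address"
```

If the name is already taken, the Jobserver responds with an error instead of creating a duplicate context.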

I’m sorry for the confusion.

When I try to create more than two Spark contexts using the “Create Spark Context (Jobserver)” node, an error occurs.

If I want to create another context in Spark (without connecting to an existing one), are there any settings I need to change?

Additionally, I already tried renaming the Spark context.

Hi @HyojungPark
I am afraid that without the error message it is basically impossible to help you with this. Could you please add the error message?

Did you take a look into the knime.log (you can view it via View > “Open KNIME log”)? It might give some more information about the error.

Hi there,

I succeeded in creating two Spark contexts, as in the attached photo.
But when I tried to create a third Spark context, an error occurred.


Here are the log messages:


2019-04-15 17:50:15,433 : DEBUG : KNIME-Worker-35 : Node : Create Spark Context (Jobserver) : 0:850 : Execute failed: Spark Jobserver gave unexpected response For details see View > Open KNIME log… Possible reason: Incompatible Jobserver version, malconfigured Spark Jobserver
org.knime.bigdata.spark.core.exception.KNIMESparkException: Spark Jobserver gave unexpected response For details see View > Open KNIME log… Possible reason: Incompatible Jobserver version, malconfigured Spark Jobserver
at org.knime.bigdata.spark.core.sparkjobserver.request.AbstractJobserverRequest.createUnexpectedResponseException(AbstractJobserverRequest.java:164)
at org.knime.bigdata.spark.core.sparkjobserver.request.AbstractJobserverRequest.handleGeneralFailures(AbstractJobserverRequest.java:133)
at org.knime.bigdata.spark.core.sparkjobserver.request.CreateContextRequest.sendInternal(CreateContextRequest.java:78)
at org.knime.bigdata.spark.core.sparkjobserver.request.CreateContextRequest.sendInternal(CreateContextRequest.java:1)
at org.knime.bigdata.spark.core.sparkjobserver.request.AbstractJobserverRequest.send(AbstractJobserverRequest.java:72)
at org.knime.bigdata.spark.core.sparkjobserver.context.JobserverSparkContext.createRemoteSparkContext(JobserverSparkContext.java:333)
at org.knime.bigdata.spark.core.sparkjobserver.context.JobserverSparkContext.open(JobserverSparkContext.java:198)
at org.knime.bigdata.spark.core.context.SparkContext.ensureOpened(SparkContext.java:143)
at org.knime.bigdata.spark.node.util.context.create.SparkContextCreatorNodeModel.executeInternal(SparkContextCreatorNodeModel.java:115)
at org.knime.bigdata.spark.core.node.SparkNodeModel.execute(SparkNodeModel.java:240)
at org.knime.core.node.NodeModel.executeModel(NodeModel.java:567)
at org.knime.core.node.Node.invokeFullyNodeModelExecute(Node.java:1186)
at org.knime.core.node.Node.execute(Node.java:973)
at org.knime.core.node.workflow.NativeNodeContainer.performExecuteNode(NativeNodeContainer.java:559)
at org.knime.core.node.exec.LocalNodeExecutionJob.mainExecute(LocalNodeExecutionJob.java:95)
at org.knime.core.node.workflow.NodeExecutionJob.internalRun(NodeExecutionJob.java:179)
at org.knime.core.node.workflow.NodeExecutionJob.run(NodeExecutionJob.java:110)
at org.knime.core.util.ThreadUtils$RunnableWithContextImpl.runWithContext(ThreadUtils.java:328)
at org.knime.core.util.ThreadUtils$RunnableWithContext.run(ThreadUtils.java:204)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.knime.core.util.ThreadPool$MyFuture.run(ThreadPool.java:123)
at org.knime.core.util.ThreadPool$Worker.run(ThreadPool.java:246)

Thanks !

This might be just a resource issue. A SparkContext will demand an amount of resources in the cluster. If the cluster does not have enough resources available, it will not be able to create another context. Did you check whether the cluster has enough resources available?
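If the contexts run on YARN, one quick way to check the remaining resources is the ResourceManager’s REST API: the `availableMB` and `availableVirtualCores` fields in the cluster metrics tell you whether there is room for another context. A sketch, assuming a hypothetical ResourceManager host on the default web port 8088:

```shell
# Hypothetical ResourceManager address -- replace with your own host.
RM="http://resourcemanager-host:8088"

# Cluster-wide metrics; look at availableMB and availableVirtualCores
# to see what is left for a new Spark context:
curl -s -m 5 "$RM/ws/v1/cluster/metrics" || echo "request failed -- check the address"
```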

Just out of curiosity: Why do you need to create several SparkContexts in one workflow?

Best regards, Mareike

Thank you for your concern.
I will check the resources.
The reason we are trying to create several SparkContexts is that we are testing this for a customer.


I also have one more question:
Can Spark Job Server only be installed on the name node,
or can it also be installed on data nodes?

According to the manual, Spark Job Server needs to be installed on an edge node or on a node that can execute the ‘spark-submit’ command.

I’m confused by that sentence…

thanks :smile:

You should install it only on the name node, i.e. the node that is reachable from outside of the cluster.
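A quick way to verify that a candidate node satisfies the manual’s requirement is to check whether it can actually run ‘spark-submit’. A minimal sketch to run on that node (not KNIME-specific):

```shell
# Check whether the Spark client tools are on this node's PATH.
if command -v spark-submit >/dev/null 2>&1; then
  echo "spark-submit found: this node can host Spark Job Server"
  spark-submit --version
else
  echo "spark-submit not found: install the Spark client or pick another node"
fi
```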

A little side note here: we now recommend using Livy instead of Spark Job Server. We have the “Create Spark Context (Livy)” node to connect to it.

Best, Mareike