I'm using KNIME 4.0.1 with the Create Spark Context (Livy) node to create the Spark context. When running the Table to Spark node, the Spark job fails with a session-expired error at around 50%. KNIME is running on an EC2 instance with 8 cores and 16 GB of memory, the Spark context is created on AWS EMR, and the ports for Livy and Spark are open. I have also set livy.server.session.timeout to 180000000 ms (livy.server.session.timeout: '180000000').
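For reference, this is roughly how the timeout is applied on the cluster side. This is a sketch of the EMR configuration I have in mind; the use of the `livy-conf` classification is an assumption about how the setting is passed to Livy on EMR:

```json
[
  {
    "Classification": "livy-conf",
    "Properties": {
      "livy.server.session.timeout": "180000000"
    }
  }
]
```

This JSON can be supplied when creating the EMR cluster (e.g. as the cluster's configuration list), so that the Livy server picks up the longer session timeout at startup.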
The session expires and the following errors are shown:
ERROR Table to Spark 2:6 Execute failed: Bad Request: "requirement failed: Session isn't active." (LivyHttpException)
ERROR Table to Spark 2:6 Execute failed: Internal Server Error: "java.lang.IllegalStateException: RPC channel is closed." (LivyHttpException)
ERROR Table to Spark 0:6 Execute failed: Internal Server Error: "java.util.concurrent.CancellationException" (LivyHttpException)
My first thought was that the machine is low on memory and needs more. RPC should not be the problem, because I have run this setup before from a local machine connecting to EMR and got further than the Table to Spark node (with spark.rpc.message.maxSize: 1024).
Could this problem be caused by insufficient memory (16 GB)?