Livy-Spark session expired on AWS EC2 to AWS EMR

Hello all,

I’m using KNIME 4.0.1 with the Livy-Spark node to create the Spark context. When running the Table to Spark node, the session expires at about 50% of the Spark job. KNIME runs on an EC2 instance with 8 cores and 16 GB of memory, the Spark context is created on AWS EMR, and the ports for Livy and Spark are open. I’ve also set livy.server.session.timeout to 180000000 ms (livy.server.session.timeout: ‘180000000’).
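For reference, here is a minimal sketch of how I apply that timeout through EMR’s livy-conf configuration classification when launching the cluster with boto3. The region, cluster name, release label, roles, and instance types are placeholders, not my actual setup:

```python
# Minimal sketch: launch an EMR cluster with the Livy session timeout
# raised via the "livy-conf" classification. All names are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")  # assumed region

configurations = [
    {
        # EMR classification that feeds livy.conf on the cluster
        "Classification": "livy-conf",
        "Properties": {
            # Same value as in my setup: 180000000 ms (~50 hours)
            "livy.server.session.timeout": "180000000",
        },
    }
]

emr.run_job_flow(
    Name="knime-livy-cluster",          # placeholder cluster name
    ReleaseLabel="emr-5.26.0",          # placeholder EMR release
    Applications=[{"Name": "Spark"}, {"Name": "Livy"}],
    Configurations=configurations,
    Instances={
        "InstanceGroups": [
            {"Name": "master", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
```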
The session expires and these errors are shown:

ERROR Table to Spark 2:6 Execute failed: Bad Request: “requirement failed: Session isn’t active.” (LivyHttpException)
ERROR Table to Spark 2:6 Execute failed: Internal Server Error: “java.lang.IllegalStateException: RPC channel is closed.” (LivyHttpException)
ERROR Table to Spark 0:6 Execute failed: Internal Server Error: “java.util.concurrent.CancellationException” (LivyHttpException)

I thought the machine might have too little memory and need more. RPC is not the problem, because I’ve run this before from my local machine connecting to EMR and it got past the Table to Spark node (with spark.rpc.message.maxSize: 1024).
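In that local run I set spark.rpc.message.maxSize when configuring the Spark context; if you’d rather bake it into the cluster itself, I believe it can go into EMR’s spark-defaults classification. A sketch, assuming the configurations list from the snippet above is reused:

```python
# Sketch: the same spark.rpc.message.maxSize override expressed as an
# EMR "spark-defaults" classification (value is in MiB).
spark_rpc_override = {
    "Classification": "spark-defaults",
    "Properties": {
        "spark.rpc.message.maxSize": "1024",
    },
}

# Append to the Configurations list passed to run_job_flow above.
configurations.append(spark_rpc_override)
```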

Could this problem be caused by too little memory (16 GB)?

Thanks,

I solved this issue on the AWS EMR side. The problem was the unit count: I had configured instance fleets in the EMR hardware configuration with 5 units per instance, and when I set it to 1 unit per instance it worked. I also added the KNIME client machine’s IP to the Security Groups to allow all traffic.

SOLUTION:
Add the KNIME client machine’s IP to the Security Groups to allow all traffic.
Set 1 unit per instance in the instance fleets for the CORE node (EMR hardware configurations). Both fixes are sketched below.
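Roughly, in boto3 terms, the two fixes look like this. The security group ID, client CIDR, and instance type are placeholders for my actual values, and the fleet definition replaces the InstanceGroups block from the earlier snippet (a cluster uses either groups or fleets, not both):

```python
# Sketch of both fixes; IDs, CIDR, and instance type are placeholders.
import boto3

# 1) Allow traffic from the KNIME client's IP to the EMR master security
#    group. I opened all traffic, as in the post; restricting to the ports
#    KNIME actually needs (e.g. Livy's 8998) would be safer.
ec2 = boto3.client("ec2")
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",                   # EMR master security group
    IpPermissions=[{
        "IpProtocol": "-1",                           # all traffic
        "IpRanges": [{"CidrIp": "203.0.113.10/32"}],  # KNIME client IP
    }],
)

# 2) Core instance fleet where each instance counts as exactly 1 unit
#    (WeightedCapacity=1), matching the "1 unit per instance" fix.
core_fleet = {
    "Name": "core-fleet",
    "InstanceFleetType": "CORE",
    "TargetOnDemandCapacity": 2,        # number of core instances wanted
    "InstanceTypeConfigs": [{
        "InstanceType": "m5.xlarge",    # placeholder type
        "WeightedCapacity": 1,          # 1 unit per instance
    }],
}
# Pass as Instances={"InstanceFleets": [master_fleet, core_fleet], ...}
# in run_job_flow.
```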

This thread can be closed

Thanks for posting your solution! I’ve marked this thread as solved, which will close it automatically after a few days. (You can do this too by clicking the green check box below a post.)
