Spark Context Livy failed

Title: Error Creating Spark Context (Livy): Execution Failed

Dear KNIME users,

I’m currently encountering an error while attempting to create a Spark context (Livy) in KNIME Analytics Platform. The specific error message is as follows:

ERROR Create Spark Context (Livy) 3:20:100 Execute failed: Failed to upload Kryo version detector job: Bad Request: {"msg":"requirement failed: Local path /home/umnia/.livy-sessions/e6d67172-1626-4061-8012-5b66cf9ec816/livy-kryo-version-detector.jar cannot be added to user sessions."} (LivyHttpException)

I’ve tried checking file permissions and examining the configuration settings, but so far I haven’t been able to resolve it.

If anyone has encountered this issue before or has any suggestions on how to fix it, I would greatly appreciate any assistance you can offer.

Console output:

Log file is located at: /home/umnia/knime_5.2.0.linux.gtk.x86_64/knime_5.2.0/knime-workspace/.metadata/knime/knime.log
ERROR Simple Preprocessing 2:12 Unable to load node with ID suffix 6 into workflow, skipping it: null
ERROR Extended NER Preprocessing 2:13 Unable to load node with ID suffix 8 into workflow, skipping it: null
ERROR Extended NER Preprocessing 2:13 Unable to load node with ID suffix 9 into workflow, skipping it: Could not initialize class org.knime.ext.textprocessing.nodes.tagging.opennlpner.OpennlpNerTaggerNodeModel2
ERROR Extended NER Preprocessing 2:13 Unable to load node with ID suffix 10 into workflow, skipping it: Could not initialize class org.knime.ext.textprocessing.nodes.tagging.opennlpner.OpennlpNerTaggerNodeModel2
ERROR Extended NER Preprocessing 2:13 Unable to load node with ID suffix 11 into workflow, skipping it: Could not initialize class org.knime.ext.textprocessing.nodes.preprocessing.stopwordfilter.BuildInStopwordListFactory
WARN Local File System Connector 3:20:98 Connection no longer available. Please re-execute the node.
WARN Create Spark Context (Livy) 3:20:100 Execute failed: Failed to upload Kryo version detector job: Bad Request: {"msg":"requirement failed: Local path /home/umnia/.livy-sessions/b26db3d0-30b3-4b1d-9408-79cc3d5abcbf/livy-kryo-version-detector.jar cannot be added to user sessions."} (LivyHttpException)
WARN FontStore Using the system default font for annotations: Font {140600322321136}
WARN Local File System Connector 4:20:98 Connection no longer available. Please re-execute the node.
WARN Create Spark Context (Livy) 4:20:100 Execute failed: Failed to upload Kryo version detector job: Bad Request: {"msg":"requirement failed: Local path /home/umnia/.livy-sessions/b26db3d0-30b3-4b1d-9408-79cc3d5abcbf/livy-kryo-version-detector.jar cannot be added to user sessions."} (LivyHttpException)
ERROR Create Spark Context (Livy) 3:20:100 Execute failed: Failed to upload Kryo version detector job: Bad Request: {"msg":"requirement failed: Local path /home/umnia/.livy-sessions/e6d67172-1626-4061-8012-5b66cf9ec816/livy-kryo-version-detector.jar cannot be added to user sessions."} (LivyHttpException)

Thank you in advance for your help.

Hi @mexo_do,

Welcome to the KNIME community!

The Create Spark Context (Livy) node needs a file system connection that both KNIME Analytics Platform and the Livy server can access. The Local File System Connector only provides paths on your local machine, which the Livy server refuses to add to a session (that is the "cannot be added to user sessions" error in your log). Usually you should use an HDFS or S3 connector as the input of the Create Spark Context (Livy) node.
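For reference, here is a minimal sketch (Python against Livy's plain REST API, not the KNIME node's internal code) of what creating a session looks like; the host name and jar paths are made up. It illustrates why a path that only exists on the KNIME machine is rejected, while a path on HDFS (or another file system the cluster can read) works:

```python
# Minimal sketch, assuming a Livy server at http://livy-host:8998 (hypothetical).
import requests

LIVY_URL = "http://livy-host:8998"

payload = {
    "kind": "spark",
    # Accepted: a jar on a file system the Livy/Spark cluster itself can read.
    "jars": ["hdfs:///user/umnia/jobs/my-job.jar"],
    # A path that only exists on the KNIME machine, such as
    # /home/umnia/.livy-sessions/.../livy-kryo-version-detector.jar, is rejected
    # with "Local path ... cannot be added to user sessions" unless that directory
    # is whitelisted on the Livy server (livy.file.local-dir-whitelist).
}

resp = requests.post(f"{LIVY_URL}/sessions", json=payload)
resp.raise_for_status()
print(resp.json())  # session id and state
```

So in your workflow, replacing the Local File System Connector with an HDFS or S3 connector pointing at storage your cluster can reach should resolve the error.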

Cheers,
Sascha
