[Create Spark Context(livy)] Execute failed: scala/Serializable (java.lang.NoClassDefFoundError) (Exception)

Hi,
I tried to create a Spark context with Livy on a Spark-on-YARN cluster.
The HDFS connection works and the Livy connection succeeds, but the “Create Spark Context” node fails with the error above.
I installed Scala on the Livy server.
The Spark version is 3.4.1 and the Scala version is 2.13.14.
Here are some screenshots with more information.

Should I reinstall the Scala packages or the Spark packages?
Or do I need to modify a configuration file?

I tried to figure out how to fix this error, but it’s hard.
Please guide me.

Hi @yun08,

You have to use Scala 2.12.x; that is the “2.12” in the name of the Livy zip file. On the Spark download page, select the build without “2.13” in its name: spark-3.4.3-bin-hadoop3.tgz
That build already includes Scala, so you don’t need to download it separately.
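
As a quick sanity check, you can confirm which Scala version a Spark build ships with (a sketch, run from the Spark installation directory):

	./bin/spark-submit --version     # prints the Spark and Scala versions
	ls jars/ | grep scala-library    # shows the bundled jar, e.g. scala-library-2.12.17.jar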

Cheers,
Sascha


Thank you for your help.
I configured the Spark cluster again, but I ran into another issue.
I removed the Scala 2.13.x version and downloaded Spark: spark-3.4.1-bin-hadoop3.tgz
When I start it, the Scala version is 2.12.17.
But the “Create Spark Context (Livy)” node gives me an error.

ERROR Create Spark Context (Livy) 3:14 Execute failed: Failed to connect to master/220.220.220.18:7077 (java.io.IOException) (Exception)

How can I solve it?
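
For what it’s worth, port 7077 is the Spark standalone master port, so this error usually means Livy is submitting to a standalone master rather than to YARN. A minimal livy.conf sketch for pointing Livy at YARN (the keys are standard Livy settings; the deploy mode here is an assumption):

	# conf/livy.conf
	livy.spark.master = yarn
	livy.spark.deploy-mode = cluster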

I encountered a new issue when I executed the node.

ERROR Create Spark Context (Livy) 3:14       Execute failed: Staging area access from inside Spark failed: Permission denied: user=livy, access=WRITE, inode="/user/knime/jy/knime-spark-staging-613ca32c0b3c44af":knime:knime:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:506)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:346)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:242)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1943)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1927)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1886)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2685)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2625)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:807)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:496)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:621)
	at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:589)
	at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:573)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1227)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1094)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1017)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:3048)

I changed the permissions of the ‘/user’ directory using ‘chmod 777’, and I tried setting the staging directory in the Advanced tab of the ‘Create Spark Context (Livy)’ node under the option ‘Set staging area for Spark jobs’.
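
In shell form, what I tried looks roughly like this (the staging path is a placeholder for my actual directory):

	bin/hdfs dfs -chmod -R 777 /user
	bin/hdfs dfs -mkdir -p /user/knime/spark-staging
	bin/hdfs dfs -chmod 777 /user/knime/spark-staging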

What am I missing? Please help me.

I guess I have to remove and recreate the folders that the node touches, because I reinstalled and reconfigured Spark.
Is this the appropriate approach?

The command ‘bin/hdfs dfs -chmod -R 777 /path’ seems to have worked on the master server, but I encountered the same error:

Permission denied: user=livy, access=WRITE, inode="/path/to":knime:supergroup:drwxr-xr-x

The ‘Create Spark Context (Livy)’ node fails to execute, and the permission ‘drwxr-xr-x’ remains unchanged. Despite creating a new staging area, changing permissions (chmod 777), and configuring the node for testing, the issue persists.
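
A quick way to double-check the effective owner and permission bits on the inode from the error (the path is a placeholder):

	bin/hdfs dfs -ls -d /user/knime/spark-staging
	# e.g.  drwxr-xr-x   - knime knime          0 ...  /user/knime/spark-staging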

What can I do?
Please help me. I need to complete this task.

Your Livy server is running as user ‘livy’, but your KNIME workflow runs as user ‘knime’. I suggest enabling user impersonation in Livy (or running Livy and KNIME as the same user if this is only a testing environment).
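
If you go the impersonation route, a minimal sketch using the standard Livy and Hadoop settings (the wildcard values are just for testing; tighten them for anything real):

	# conf/livy.conf
	livy.impersonation.enabled = true

	<!-- Hadoop core-site.xml: allow the livy user to proxy other users -->
	<property>
	  <name>hadoop.proxyuser.livy.groups</name>
	  <value>*</value>
	</property>
	<property>
	  <name>hadoop.proxyuser.livy.hosts</name>
	  <value>*</value>
	</property>

Then restart Livy, and refresh or restart the NameNode so the proxyuser settings take effect.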
