Create Local Big Data Environment - Error

Hello KNIME Team.

I'm trying to execute the "Create Local Big Data Environment" node and I'm facing the following error:

2022-07-23 14:40:35,697 : ERROR : KNIME-Worker-10-Create Local Big Data Environment 3:7 :  : Node : Create Local Big Data Environment : 3:7 : Execute failed: class **org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5a690a6b) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5a690a6b**
java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5a690a6b) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5a690a6b
	at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
	at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
	at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
	at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
	at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
	at org.knime.bigdata.spark.local.wrapper.LocalSparkWrapperImpl.openSparkContext(LocalSparkWrapperImpl.java:329)
	at org.knime.bigdata.spark.local.context.LocalSparkContext.open(LocalSparkContext.java:159)
	at org.knime.bigdata.spark.core.context.SparkContext.ensureOpened(SparkContext.java:145)
	at org.knime.bigdata.spark.local.node.create.AbstractLocalEnvironmentCreatorNodeModel.executeInternal(AbstractLocalEnvironmentCreatorNodeModel.java:142)
	at org.knime.bigdata.spark.core.node.SparkNodeModel.execute(SparkNodeModel.java:240)
	at org.knime.core.node.NodeModel.executeModel(NodeModel.java:549)
	at org.knime.core.node.Node.invokeFullyNodeModelExecute(Node.java:1267)
	at org.knime.core.node.Node.execute(Node.java:1041)
	at org.knime.core.node.workflow.NativeNodeContainer.performExecuteNode(NativeNodeContainer.java:595)
	at org.knime.core.node.exec.LocalNodeExecutionJob.mainExecute(LocalNodeExecutionJob.java:95)
	at org.knime.core.node.workflow.NodeExecutionJob.internalRun(NodeExecutionJob.java:201)
	at org.knime.core.node.workflow.NodeExecutionJob.run(NodeExecutionJob.java:117)
	at org.knime.core.util.ThreadUtils$RunnableWithContextImpl.runWithContext(ThreadUtils.java:367)
	at org.knime.core.util.ThreadUtils$RunnableWithContext.run(ThreadUtils.java:221)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
	at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
	at org.knime.core.util.ThreadPool$MyFuture.run(ThreadPool.java:123)
	at org.knime.core.util.ThreadPool$Worker.run(ThreadPool.java:246)

I'm using KNIME 4.6.0 on a Windows 10 machine.
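From what I can tell, this kind of `IllegalAccessError` tends to appear when Spark runs on Java 17, where the JDK no longer exports the internal `sun.nio.ch` package to application code. A workaround I have seen suggested for similar Spark-on-Java-17 setups (I'm not sure whether it is the recommended fix for this node) is to add the corresponding JPMS option to `knime.ini`, on its own line below `-vmargs`:

```
--add-opens=java.base/sun.nio.ch=ALL-UNNAMED
```

Is that the right approach here, or is there an official fix for the Create Local Big Data Environment node?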

Thank you in advance.
