Execution Failure on Node Terminating Spark Session

Hello KNIME Support team.

I have created a workflow on KNIME Server that uses the Create Spark Context (Livy) node to create a Spark session, runs the analysis in a PySpark Script node, and then force-terminates the session with a Destroy Spark Context node when the analysis is finished.
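For context, outside of KNIME the same create-analyze-destroy lifecycle would look roughly like the sketch below. This is only an illustration: the data and aggregation are placeholders, and in the actual workflow the session creation and teardown are handled by the Livy and Destroy Spark Context nodes rather than by the script itself.

```python
# Standalone sketch of the create -> analyze -> destroy lifecycle that
# the workflow implements with KNIME nodes. Data and aggregation are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("livy-lifecycle-sketch").getOrCreate()
try:
    df = spark.createDataFrame(
        [("a", 1.0), ("a", 2.0), ("b", 3.0)], ["key", "value"]
    )
    result = df.groupBy("key").agg(F.avg("value").alias("avg_value"))
    result.show()
finally:
    # Equivalent of the Destroy Spark Context node: always release the
    # session, even if the analysis step fails.
    spark.stop()
```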

Regardless of how long or short the analysis runs, the session should terminate based on the quarter I specified; instead, the node fails as shown below.

This doesn't happen every time; in roughly one out of three runs, the session-termination node fails. The error message looks like this:

Execute failed: java.net.SocketTimeoutException: Read timed out
java.lang.RuntimeException: java.net.SocketTimeoutException: Read timed out

The Destroy Spark Context node is connected only to the Livy node. What could be the problem? I would be grateful for any answer.
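For reference, my understanding is that destroying the session ultimately comes down to a DELETE against Livy's REST API (DELETE /sessions/{id}), so the SocketTimeoutException means Livy did not answer that HTTP request within the client's read timeout. Below is a minimal sketch of that call with a longer explicit read timeout and a simple retry; the host, port, and session id are placeholders, and this is an illustration of the protocol, not how the KNIME node is configured.

```python
# Sketch of Livy session teardown at the REST level: DELETE /sessions/{id}.
# A read timeout here corresponds to the SocketTimeoutException above,
# e.g. when Livy is busy tearing down the YARN application. Host, port,
# and session id are placeholders.
import requests

LIVY_URL = "http://livy-host:8998"  # placeholder Livy endpoint
session_id = 42                     # placeholder session id

for attempt in range(3):
    try:
        resp = requests.delete(
            f"{LIVY_URL}/sessions/{session_id}",
            timeout=(10, 120),  # (connect, read) timeouts in seconds
        )
        resp.raise_for_status()
        break  # session deleted successfully
    except requests.exceptions.Timeout:
        continue  # Livy was slow to respond; try again
```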

Hello @JaeHwanChoi ,
Could you please provide the workflow with some dummy data, along with the log file, so that we can reproduce the issue and understand the problem better?

Thanks,
Sanket

