LZ4 Class could not be loaded

Hello everyone,

I hope all of you are doing well!

I am struggling with the following situation, and maybe one of you knows how to deal with it.

On our platform, we have a service user who starts KNIME workflows with a batch file. Sometimes multiple workflows are running at the same time.

Unfortunately, some of our executions failed with the following error message. The affected workflow contains a Python Script node.

2024-04-09 05:11:46.450+0200 [STDERR] Info: Loading class org.bytedeco.javacpp.presets.javacpp
Warning: Could not load class org.bytedeco.javacpp.presets.javacpp: java.lang.UnsatisfiedLinkError: java.io.FileNotFoundException: C:\Users\KNIME_RUNNER\.javacpp\cache\windows-x86_64\jnijavacpp.dll (The process cannot access the file because it is being used by another process)
Info: Loading class org.bytedeco.lz4.global.lz4
Warning: Could not load class org.bytedeco.lz4.global.lz4: java.lang.UnsatisfiedLinkError: java.io.FileNotFoundException: C:\Users\KNIME_RUNNER\.javacpp\cache\windows-x86_64\jnilz4.dll (The process cannot access the file because it is being used by another process)
2024-04-09 05:11:49.949+0200 [STDERR] ERROR KNIME-Worker-4-Python Script (Labs) 3:10683:10686 Node Execute failed: An exception occured while running the Python kernel. See log for details.
org.knime.python2.kernel.PythonIOException: An exception occured while running the Python kernel. See log for details.
at org.knime.python3.scripting.Python3KernelBackend.putDataTable(Python3KernelBackend.java:418)
at org.knime.python2.kernel.PythonKernel.putDataTable(PythonKernel.java:303)
at org.knime.python2.ports.DataTableInputPort.execute(DataTableInputPort.java:116)
at org.knime.python3.scripting.nodes.AbstractPythonScriptingNodeModel.execute(AbstractPythonScriptingNodeModel.java:196)
at org.knime.core.node.NodeModel.executeModel(NodeModel.java:549)
at org.knime.core.node.Node.invokeFullyNodeModelExecute(Node.java:1267)
at org.knime.core.node.Node.execute(Node.java:1041)
at org.knime.core.node.workflow.NativeNodeContainer.performExecuteNode(NativeNodeContainer.java:595)
at org.knime.core.node.exec.LocalNodeExecutionJob.mainExecute(LocalNodeExecutionJob.java:95)
at org.knime.core.node.workflow.NodeExecutionJob.internalRun(NodeExecutionJob.java:201)
at org.knime.core.node.workflow.NodeExecutionJob.run(NodeExecutionJob.java:117)
at org.knime.core.util.ThreadUtils$RunnableWithContextImpl.runWithContext(ThreadUtils.java:367)
at org.knime.core.util.ThreadUtils$RunnableWithContext.run(ThreadUtils.java:221)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at org.knime.core.util.ThreadPool$MyFuture.run(ThreadPool.java:123)
at org.knime.core.util.ThreadPool$Worker.run(ThreadPool.java:246)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.bytedeco.lz4.global.lz4
at org.knime.core.columnar.arrow.compress.Lz4FrameCompressionCodec.doCompress(Lz4FrameCompressionCodec.java:114)
at org.knime.core.columnar.arrow.compress.Lz4FrameCompressionCodec.compress(Lz4FrameCompressionCodec.java:98)
at org.knime.core.columnar.arrow.ArrowReaderWriterUtils.compressAllBuffers(ArrowReaderWriterUtils.java:127)
at org.knime.core.columnar.arrow.ArrowBatchWriter.createRecordBatch(ArrowBatchWriter.java:399)
at org.knime.core.columnar.arrow.ArrowBatchWriter.writeVectors(ArrowBatchWriter.java:381)
at org.knime.core.columnar.arrow.ArrowBatchWriter.write(ArrowBatchWriter.java:197)
at org.knime.core.columnar.data.dictencoding.DictEncodedBatchWriter.write(DictEncodedBatchWriter.java:204)
at org.knime.core.data.columnar.table.WrappedBatchStore$WrappedBatchWriter.write(WrappedBatchStore.java:123)
at org.knime.core.columnar.cursor.ColumnarWriteCursorFactory$ColumnarWriteCursor.writeCurrentBatch(ColumnarWriteCursorFactory.java:187)
at org.knime.core.columnar.cursor.ColumnarWriteCursorFactory$ColumnarWriteCursor.flush(ColumnarWriteCursorFactory.java:178)
at org.knime.core.data.columnar.table.ColumnarRowWriteCursor.flush(ColumnarRowWriteCursor.java:137)
at org.knime.core.data.columnar.table.ColumnarRowWriteTable.finish(ColumnarRowWriteTable.java:177)
at org.knime.python3.arrow.PythonArrowDataSourceFactory.copyTable(PythonArrowDataSourceFactory.java:198)
at org.knime.python3.arrow.PythonArrowDataSourceFactory.copyTableToArrowStore(PythonArrowDataSourceFactory.java:180)
at org.knime.python3.arrow.PythonArrowDataSourceFactory.extractStoreCopyTableIfNecessary(PythonArrowDataSourceFactory.java:171)
at org.knime.python3.arrow.PythonArrowDataSourceFactory.createSource(PythonArrowDataSourceFactory.java:121)
at org.knime.python3.scripting.Python3KernelBackend$PutDataTableTask.call(Python3KernelBackend.java:676)
at org.knime.python3.scripting.Python3KernelBackend$PutDataTableTask.call(Python3KernelBackend.java:1)
at org.knime.core.util.ThreadUtils$CallableWithContextImpl.callWithContext(ThreadUtils.java:383)
at org.knime.core.util.ThreadUtils$CallableWithContext.call(ThreadUtils.java:269)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Suppressed: java.lang.IllegalStateException: Memory was leaked by query. Memory leaked: (32)
Allocator(ArrowColumnStore) 0/32/5348/9223372036854775807 (res/actual/peak/limit)
at org.apache.arrow.memory.BaseAllocator.close(BaseAllocator.java:437)
at org.knime.core.columnar.arrow.AbstractArrowBatchReadable.close(AbstractArrowBatchReadable.java:100)
at org.knime.core.columnar.arrow.ArrowBatchStore.close(ArrowBatchStore.java:113)
at org.knime.core.columnar.data.dictencoding.DictEncodedBatchWritableReadable.close(DictEncodedBatchWritableReadable.java:108)
at org.knime.core.data.columnar.table.WrappedBatchStore.close(WrappedBatchStore.java:222)
at org.knime.core.data.columnar.table.DefaultColumnarBatchStore.close(DefaultColumnarBatchStore.java:349)
at org.knime.core.data.columnar.table.ColumnarRowWriteTable.close(ColumnarRowWriteTable.java:218)
at org.knime.python3.arrow.PythonArrowDataSourceFactory.copyTable(PythonArrowDataSourceFactory.java:200)
… 11 more

Maybe one of you has a creative solution for this error.

Best wishes,
Finn

Hi Finn @FinnR,

Unfortunately, we are not aware of any workaround if you want to run multiple KNIME Analytics Platform instances at once in a scenario using Python Script nodes on a Windows machine. A Unix-like OS, on the other hand, should not have this issue.

Best regards
Steffen


Hi @steffen_KNIME,

Thank you very much for your response. We will consider switching the OS of our ETL servers from Windows to Linux.

I also found a .lock file in the following folder: C:\Users\KNIME_RUNNER\.javacpp\cache

Do you think I should delete this file, or should I keep my hands off it? I will also check whether the error occurs only when two Python Script nodes are running at the same time.
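In the meantime, one idea I want to test (purely an assumption on my side — I have not verified that KNIME's Python integration honors it) is giving every batch run its own JavaCPP cache directory via JavaCPP's org.bytedeco.javacpp.cachedir system property, so the parallel instances stop fighting over jnilz4.dll. A minimal sketch of how I would launch each run (all paths here are placeholders, not our real setup, and the echo only prints the command instead of executing it):

```shell
#!/bin/sh
# Sketch only: give each concurrently running KNIME batch job its own
# JavaCPP cache directory so parallel instances do not contend for the
# same jnilz4.dll. KNIME_EXE, WORKFLOW_DIR and CACHE_ROOT are placeholders.
KNIME_EXE="knime"                      # placeholder for the knime executable
WORKFLOW_DIR="/path/to/workflow"       # placeholder workflow location
CACHE_ROOT="/tmp/javacpp-cache"        # per-run cache directories live here

# Derive a unique cache dir from the shell's PID so runs never collide.
CACHE_DIR="$CACHE_ROOT/$$"
mkdir -p "$CACHE_DIR"

# Only print the command here; drop the 'echo' to actually launch KNIME.
echo "$KNIME_EXE" -nosplash \
  -application org.knime.product.KNIME_BATCH_APPLICATION \
  -workflowDir="$WORKFLOW_DIR" \
  -vmargs "-Dorg.bytedeco.javacpp.cachedir=$CACHE_DIR"
```

Whether the Python Script node's JavaCPP loader actually picks this property up through -vmargs is exactly what I would need to test.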

Best regards
Finn

Dear @FinnR,

I do not think the existence of the .lock file affects the issue.

Let us know if you have any further updates!

Thanks
Steffen