High Sierra and node output zip files


I have a new issue since yesterday, this time with the XLS Reader. I should mention that I upgraded from Sierra to High Sierra yesterday.

I have already reinstalled 3.4.1 from scratch, but the problem remains.

In the attachment, you will see what I get.

I can create a workflow, read an Excel file, and see what was loaded.

After I save and reopen the workflow, the node output port data is no longer accessible.

This is a big issue for us, since we cannot go back to Sierra to check how it behaved before; we are pretty sure this never happened on Sierra.

Please have a look.

Thanks in advance

Kind regards

I've just tried your steps on one of my Macs with macOS 10.13 + KNIME 3.4.1 and cannot reproduce these issues. The workflow loads just fine, and the data.zip file can be extracted without any problems.

Sorry, I cannot give any input on the actual issue, but macOS 10.13 is probably not the culprit.

-- Philipp



Just to make sure: you read the PDF I attached and performed the save and reopen again? WITH AN EXCEL file?

This is a nightmare, trust me. I really don't know where to dig :(

Thanks anyway


Hi again,

One thing to mention: I see the very same behavior I described in the PDF on one iMac AND on one MacBook Pro. Both are now on High Sierra. Therefore I don't believe it's something like a bad setting I could have made somewhere.

I upgraded to High Sierra, then found the problem. Then I fully reinstalled KNIME 3.4.1; the problem was still there. Then I upgraded the MacBook Pro, and it's the same.

What can I do now?




Yes, I've used an Excel file, as you described in the attached PDF, just to double-check, and didn't see any issues.

However, I'll be glad to give it a go if you want to supply a sample workflow.

Have you installed any third-party applications that might interfere?


I am so desperate that I am working at night :)

I found something. 

With small files all is OK.

It has nothing to do with the XLS node. It can go wrong with any node, such as a text file reader node.

To illustrate: create one new workflow reading one text file with 40 rows and one with 30,000 rows, and reading one XLS file with 40 rows and one with 30,000 rows.

Make this run.

Then save it.

Reopen it and try to check what data was saved for each node. Only the nodes that processed small quantities will show what they recorded into the .zip. The others will give you "Loading port content" in the middle of a white screen.

Please have another look that way.


Whoops! You're entirely right! I tried a 100k-line CSV file and can now fully see and confirm your issue on every point!

This looks quite nasty! I can't be of any more help at the moment, but I hope there'll be an easy fix.

Hi qqilihq,

You have made my day; I would never have understood why this fails for me and works for you :)

This bug makes me believe that KNIME is now very risky on High Sierra, since we have no guarantee that correct data is passed from one node to the next...

Until the KNIME specialists have a look, I'm moving back to the Microsoft world.

Special thanks to you


> Until the KNIME specialists have a look, I'm moving back to the Microsoft world.

I guess I qualify as a KNIME specialist but not necessarily as a Mac pro. Can someone share the knime.log file after reproducing the error?

I've just sent some detailed info to you, Bernd. To keep this thread complete, here's just the exception:

DEBUG Column Filter        0:2        Execute failed: Exception while accessing file: "/Users/blablabla/macOS_Loading_Test/File Reader (#1)/port_1/data.zip": invalid block type
java.lang.RuntimeException: Exception while accessing file: "/Users/blablabla/macOS_Loading_Test/File Reader (#1)/port_1/data.zip": invalid block type
	at org.knime.core.data.container.ContainerTable.ensureBufferOpen(ContainerTable.java:294)
	at org.knime.core.data.container.ContainerTable.size(ContainerTable.java:145)
	at org.knime.core.node.BufferedDataTable.size(BufferedDataTable.java:383)
	at org.knime.core.data.container.RearrangeColumnsTable.size(RearrangeColumnsTable.java:605)
	at org.knime.core.node.BufferedDataTable.size(BufferedDataTable.java:383)
	at org.knime.core.node.NodeModel.executeModel(NodeModel.java:652)
	at org.knime.core.node.Node.invokeFullyNodeModelExecute(Node.java:1128)
	at org.knime.core.node.Node.execute(Node.java:915)
	at org.knime.core.node.workflow.NativeNodeContainer.performExecuteNode(NativeNodeContainer.java:561)
	at org.knime.core.node.exec.LocalNodeExecutionJob.mainExecute(LocalNodeExecutionJob.java:95)
	at org.knime.core.node.workflow.NodeExecutionJob.internalRun(NodeExecutionJob.java:179)
	at org.knime.core.node.workflow.NodeExecutionJob.run(NodeExecutionJob.java:110)
	at org.knime.core.util.ThreadUtils$RunnableWithContextImpl.runWithContext(ThreadUtils.java:328)
	at org.knime.core.util.ThreadUtils$RunnableWithContext.run(ThreadUtils.java:204)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.knime.core.util.ThreadPool$MyFuture.run(ThreadPool.java:123)
	at org.knime.core.util.ThreadPool$Worker.run(ThreadPool.java:246)
Caused by: java.util.zip.ZipException: invalid block type
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:164)
	at java.util.zip.ZipInputStream.read(ZipInputStream.java:194)
	at org.knime.core.util.FileUtil.copy(FileUtil.java:266)
	at org.knime.core.data.container.CopyOnAccessTask.createBuffer(CopyOnAccessTask.java:191)
	at org.knime.core.data.container.CopyOnAccessTask.createBuffer(CopyOnAccessTask.java:157)
	at org.knime.core.data.container.ContainerTable.ensureBufferOpen(ContainerTable.java:292)
	... 17 more

A quick summary of what we've learned so far: it's a problem when writing those zip files. This particular zip file is really a container of three files: two XML files (metadata) and one binary file (actual data). These parts are written with different compression levels, which is supported by the zip standard/API. This seems to be broken on the new macOS version.

We have isolated the problem in a small standalone Java program and will submit a bug report to Apple.
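For illustration, here is a minimal sketch of the write pattern described above (the entry names and sizes are made up for the example, not KNIME's actual internals): a `ZipOutputStream` writes one entry at the default compression level, switches levels via `setLevel` before the next entry, and the archive is read back immediately:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.Deflater;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipLevelDemo {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bos)) {
            // Small metadata entry, written with the default level.
            zip.putNextEntry(new ZipEntry("spec.xml"));
            zip.write("<spec/>".getBytes("UTF-8"));
            zip.closeEntry();

            // Switch the compression level mid-stream for the bulk data
            // entry -- the mixed-level pattern described above.
            zip.setLevel(Deflater.BEST_SPEED);
            zip.putNextEntry(new ZipEntry("data.bin"));
            zip.write(new byte[200_000]);
            zip.closeEntry();
        }

        // Read the archive back; on an affected High Sierra machine this
        // is where a ZipException ("invalid block type") would surface.
        long total = 0;
        try (ZipInputStream in = new ZipInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            byte[] buf = new byte[8192];
            while (in.getNextEntry() != null) {
                int n;
                while ((n = in.read(buf)) > 0) {
                    total += n;
                }
            }
        }
        System.out.println("read " + total + " bytes");
    }
}
```

On an unaffected system this round-trips cleanly; on the broken platform, the read-back fails with an exception like the one in the stack trace above.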


Thanks a lot for the update. Hoping to get the fix soon :)

Kind regards



I have exactly the same problem with "Table Writer" and "Table Reader". If I write a table of about 25 MB and try to read it afterwards, I get confusing errors like:

Execute failed: invalid code lengths set

Execute failed: invalid distance too far back

Execute failed: invalid block type

The worst part is that those files cannot be read with another KNIME version on Windows. It means the written files are effectively damaged, which is a big problem!



Yes indeed. But it's clearly a bug in High Sierra. We have created a minimal Java application that reproduces the problem on High Sierra but works fine on every other operating system, including Sierra. We have opened a bug report at Apple but haven't received any response so far.


I also encounter the problems that Ralph mentions. Sorry to say this, but to be on the safe side we should stop using KNIME on High Sierra... in case of unsignaled inconsistencies elsewhere.

Maybe High Sierra users should be told about this so they don't run into problems, and Sierra users should be told NOT TO UPGRADE NOW...

On my side, I am downgrading one of my Macs back to Sierra, but it's a lot of work since we must erase the whole HD before installing Sierra, so all programs have to be installed again (not only KNIME).

Would it be possible to get an estimated date from Apple for a fix?

Thanks in advance



We'll add a warning to KNIME's welcome screen that there are issues on the newest macOS.

We can't comment on when Apple will release a fix. It seems others have run into the same issue, too: https://stackoverflow.com/questions/46539453/tomcat-with-compression-enabled-causes-error-on-os-x-high-sierra (we believe it's the same issue: zip compression in Java programs).

We are one step further, it's a bug in the zlib library that is shipped with High Sierra: https://github.com/madler/zlib/issues/305
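The trigger can also be reproduced below the zip layer, directly on the `Deflater`/`Inflater` pair. This is a hedged sketch (class name and payload sizes are invented for the example): changing the compression level between inputs makes the next `deflate()` call go through zlib's `deflateParams()` internally, which is the code path the High Sierra zlib build reportedly mishandled.

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class DeflateLevelSwitch {
    public static void main(String[] args) throws DataFormatException {
        byte[] part1 = new byte[100_000];   // first payload: all zeros
        byte[] part2 = new byte[100_000];   // second payload: all 'x'
        Arrays.fill(part2, (byte) 'x');

        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];

        // Compress the first payload at the default level.
        Deflater def = new Deflater(Deflater.DEFAULT_COMPRESSION);
        def.setInput(part1);
        while (!def.needsInput()) {
            compressed.write(buf, 0, def.deflate(buf));
        }

        // Change the level mid-stream; the next deflate() call drives
        // zlib's deflateParams() internally.
        def.setLevel(Deflater.BEST_SPEED);
        def.setInput(part2);
        def.finish();
        while (!def.finished()) {
            compressed.write(buf, 0, def.deflate(buf));
        }
        def.end();

        // Decompress again; corruption would show up here as a
        // DataFormatException ("invalid block type" and friends).
        Inflater inf = new Inflater();
        inf.setInput(compressed.toByteArray());
        long total = 0;
        while (!inf.finished()) {
            int n = inf.inflate(buf);
            if (n == 0 && inf.needsInput()) {
                break;
            }
            total += n;
        }
        inf.end();
        System.out.println("inflated " + total + " bytes");
    }
}
```

On a healthy zlib this prints the full uncompressed size; with the broken library, the inflate step is where the stream would turn out to be corrupt.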

This was remedied with this commit. Can someone verify?

Apparently that didn't do it. Can someone provide a sequence of zlib calls that reproduces the issue? Possibly from the example program in the issue on GitHub?

The problem has been found, and it is in the JDK. Here is the fix, written by xuemingshen.