Is there any way to get KNIME to read very large files or will there always be a limit?

I am trying to read a very large SDF file and get the following error: Execute failed: Required array length 2147483639 + 311 is too large.

Hi,
Which node are you using to read the file?
Kind regards,
Alexander

Hi, thanks for the response! I am using the SDF Reader node.

Hi @zhuma,
I think this will not easily be possible. 2147483639 appears to be a Java-specific safe bound when allocating arrays: it is Integer.MAX_VALUE - 8, since Java arrays are indexed with int (so at most 2^31 - 1 elements) and some JDK code reserves a few words of headroom below that. See: Why does Java StringBuilder use 2147483639 as safe bound? - Stack Overflow. Working around that in the KNIME code would be pretty tedious, I imagine. Could you split your file into multiple parts and process them iteratively?
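To illustrate the splitting idea, here is a minimal Python sketch (the names split_sdf and molecules.sdf and the chunk size are hypothetical; it assumes the standard SDF convention that every record ends with a line containing only $$$$):

```python
def split_sdf(path, records_per_chunk=100_000):
    """Split a large SDF file into numbered chunk files.

    Assumes each record ends with a '$$$$' delimiter line.
    """
    chunk_idx = 0
    records_in_chunk = 0
    out = None
    with open(path, "r", encoding="utf-8", errors="replace") as src:
        for line in src:
            if out is None:  # lazily open the next chunk file
                out = open(f"{path}.part{chunk_idx:04d}.sdf",
                           "w", encoding="utf-8")
            out.write(line)
            if line.strip() == "$$$$":  # end of one molecule record
                records_in_chunk += 1
                if records_in_chunk >= records_per_chunk:
                    out.close()  # chunk is full, roll over to the next
                    out = None
                    records_in_chunk = 0
                    chunk_idx += 1
    if out is not None:
        out.close()

split_sdf("molecules.sdf")  # writes molecules.sdf.part0000.sdf, ...
```

Each part can then be read with its own SDF Reader pass, e.g. by looping over the generated files in KNIME and concatenating the results.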
Kind regards,
Alexander

Thanks! Actually, I think there was an error in my files; after I fixed it, my large files could somehow be read after all.

Actually, do you know where I can change this:

ERROR SDF Reader 3:56 Execute failed: The partition of the temp file "C:\Users" is too low on disc space (98MB available but at least 100MB are required). You can tweak the limit by changing the “org.knime.container.minspace.temp” java property.

Hi @zhuma,
Generally, KNIME’s temp directory is set via the Java property -Djava.io.tmpdir=<your-desired-temp-dir>. I am not sure why this node uses a different one. Note that org.knime.container.minspace.temp is not a directory: judging by the error message, it is the minimum amount of free space (in MB) that KNIME requires on the temp partition. To set either property, you can modify the knime.ini file in the KNIME installation folder by adding a line at the end of the file:

-Dorg.knime.container.minspace.temp=<minimum-free-space-in-MB>

Maybe you can reroute -Djava.io.tmpdir as well, if you have a partition with more free space than C:.
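For example, a hypothetical knime.ini fragment (D:\knime-temp and the 50MB threshold are placeholder values; this assumes the property takes a size in MB, as the error message suggests):

```
-Djava.io.tmpdir=D:\knime-temp
-Dorg.knime.container.minspace.temp=50
```

Lowering the threshold only silences the free-space check, so pointing the temp directory at a partition with more room is the safer fix.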
Kind regards,
Alexander
