I am trying to read a very large SDF file and get the following error: Execute failed: Required array length 2147483639 + 311 is too large.
Which node are you using to read the file?
Hi, thanks for the response! The SDF Reader node.
I think this will not easily be possible. 2147483639 appears to be a Java-specific safe upper bound for array sizes (see: Why does Java StringBuilder use 2147483639 as safe bound? - Stack Overflow). Working around that in the KNIME code would be pretty tedious, I imagine. Could you split up your file into multiple parts and process them iteratively?
Thanks! It turns out there was actually an error in my files; once I fixed that, my large files could be read after all.
Actually, do you know where I can change this:
ERROR SDF Reader 3:56 Execute failed: The partition of the temp file "C:\Users" is too low on disc space (98MB available but at least 100MB are required). You can tweak the limit by changing the “org.knime.container.minspace.temp” java property.
Generally, KNIME’s temp directory is set via the Java property
-Djava.io.tmpdir=<your-desired-temp-dir>. I am not sure why this node uses a different one. To set the property, you can modify the knime.ini file in the KNIME installation folder and add a line at the end of the file.
Maybe you can also point -Djava.io.tmpdir somewhere else, if you have a partition with more free space than C:.
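For example, the line appended to knime.ini might look like this, where the target directory is a hypothetical path on a partition with more free space:

```
-Djava.io.tmpdir=D:\knime-temp
```

The error message also mentions the `org.knime.container.minspace.temp` property as a way to tweak the 100MB minimum; it can be set the same way with a `-D` line, though changing the temp directory to a larger partition is usually the safer fix.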