I want to analyze the KNIME embedded recommendations file (located at knime-workspace/.metadata/knime/community_recommendations.json), but I don't know why it is so slow even though I have only two nodes… I also tried the JSON to Table node; it is slow as well. BTW, I use the latest KNIME version, v3.6.0.v201807100937. The original file is too big to upload.
WARN KNIMEApplication$4 Potential deadlock in AWT Event Queue detected. Full thread dump will follow as debug output.
WARN KNIMEApplication$4 Potential deadlock in AWT Event Queue detected. Full thread dump will follow as debug output.
WARN KNIMEApplication$3 Potential deadlock in SWT Display thread detected. Full thread dump will follow as debug output.
ERROR NodeContainerEditPart The dialog pane for node 'JSON Path 0:2' has thrown a 'OutOfMemoryError'. That is most likely an implementation error.
WARN KNIMEApplication$3 Potential deadlock in SWT Display thread detected. Full thread dump will follow as debug output.
ERROR JSON Path 0:2 Error loading model settings
My default setting is 2048m. Increasing the memory would probably help, but I don't think it is the root cause: the JSON file's size is only ~30MB, and the error message says "That is most likely an implementation error."
I would first try increasing the memory, as the workflow works well for smaller files. Before running out of memory, KNIME can become very slow.
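As a sketch of how that change might look (assuming a default desktop installation; the exact path to knime.ini depends on your platform), the maximum heap is controlled by the `-Xmx` VM argument in knime.ini:

```ini
; knime.ini (excerpt) -- lines after -vmargs are passed to the JVM
-vmargs
; raise the maximum heap from the 2048m default to 3GB
-Xmx3072m
```

Restart KNIME after editing the file for the new limit to take effect.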
KNIME and its extensions can consume enough memory that reading a 30MB file and executing JSONPath on it causes an OOM. (A 30MB file is at least 60MB as a String in Java, and because of additional data structures it can easily take 240MB after parsing. JSONPath probably adds further data structures for searching. Increasing the max memory to 3GB would probably be enough, but you might want/need more memory for the rest of the analysis.)
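The arithmetic behind those numbers can be sketched as follows (assumptions: Java 8 without compact strings, as bundled with KNIME 3.6, stores each char in 2 bytes, so an ASCII file roughly doubles in memory as a String; the 4x parse-tree overhead is a rough rule of thumb, not a measured figure):

```java
// Back-of-the-envelope estimate of in-memory size for a JSON file in KNIME.
public class JsonMemoryEstimate {
    public static void main(String[] args) {
        long fileBytes = 30L * 1024 * 1024;  // ~30MB JSON file on disk
        long asString  = fileBytes * 2;      // UTF-16 chars: 2 bytes each -> ~60MB
        long parsed    = asString * 4;       // assumed ~4x overhead for the parsed tree
        System.out.println("As String: ~" + (asString / (1024 * 1024)) + " MB");
        System.out.println("Parsed:    ~" + (parsed   / (1024 * 1024)) + " MB");
    }
}
```

Running it prints roughly `As String: ~60 MB` and `Parsed: ~240 MB`, matching the estimate above; the real footprint depends on the JSON structure and the parser.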
Thanks for your explanation, aborg :blush:
Honestly, I was surprised at first by the memory usage of KNIME nodes when handling table data (even larger than this), so I still think there is a potential performance problem in how the JSON file is handled.