I am reading a JSON file and then using JSON Path, but it takes forever to open the configuration window; I literally have to wait 10 minutes between adding each path. The file is 500 MB in size, is this too big? I have also set the Java heap space to 12 GB. Any other ideas for this?
Just to see if I got it right: the JSON Reader is working as expected, and the JSON Path node is the one that has issues opening its configuration window? And then after adding one path you have to wait another 10 minutes to add the next one? Does it freeze, or what exactly? If you try with a smaller JSON file, how do the JSON nodes behave?
What KNIME version are you using?
Hi ipazin, sorry for the delay in getting back to you. Yes, that is correct: the issue is when opening the configuration window of the JSON Path node. It takes forever to open up, and then I have to wait an extremely long time before being able to enter anything. At times it also freezes, yes. I am using 4.4.0.
I am now remoting into a desktop at work which is quite a bit more powerful (Ryzen 9, 32 GB RAM). So far it seems to be working a lot better.
I have tried this on Alteryx and it is running in about 9 minutes. I really don’t want to use Alteryx though.
It seems the JSON Path node requires more memory when dealing with bigger JSON files, so a more powerful machine with more memory assigned should help. Additionally, I have opened a ticket (internal reference: AP-17326) to see if this can be addressed. If/when there is any news, someone will update this topic.
One more idea you can try is to limit the rows in the JSON Reader node and do the actual configuration of the JSON Path node with a smaller dataset. Once the configuration is done, you can “unlimit” the rows in the JSON Reader and run JSON Path without opening its configuration window.
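The same configure-on-a-sample idea can be sketched outside KNIME in plain Python (a hypothetical helper, not a KNIME API): develop the extraction against a small slice of the records, then rerun it on the full data.

```python
import json

def load_sample(text, limit=None):
    """Parse a JSON array of records and optionally keep only the first
    `limit` of them, mimicking the JSON Reader's "Limit Rows" option."""
    records = json.loads(text)
    return records if limit is None else records[:limit]

data = '[{"id": 1}, {"id": 2}, {"id": 3}]'
print(len(load_sample(data, limit=2)))  # 2 -> prototype against this slice
print(len(load_sample(data)))           # 3 -> then "unlimit" for the real run
```

The point is that the expensive part (designing the extraction) only ever touches the small slice; the full-size pass happens once, unattended.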
Maybe a silly question, but how do I limit the rows being read by the JSON node?
Hi @Page0727 ,
no worries, in the “JSON Reader” Node, you’ll find a tab “Limit Rows”:
Thanks Lucas for getting back to me. My JSON Reader does not show that option, my guess is that it is a different version? I would hope I would have spotted an option like that, although nowadays who knows! Lol.
The new JSON Reader with the Limit Rows tab (and some other enhancements) is available from KNIME version 4.4.0. But you said you are on 4.4.0? Maybe the desktop at work needs an update, because from the screenshot it looks like the older JSON Reader (deprecated) node.
OK. I will look into the updates for the desktop. This is strange as all other deprecated nodes have said so in older workflows I have built. I appreciate the time you have spent in helping me resolve this.
So for whatever reason I now have the latest JSON Reader node with the row limit option. I am limiting the number of rows, but still finding the process to take a long time.
Unfortunately I am now getting the following error at the JSON Path node:
Execute failed: java.lang.OutOfMemoryError: Java heap space. The KNIME heap space is set to 28000m.
Is this with a limited number of rows? How much memory does the machine have? It's not recommended to set the heap too close to the maximum.
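For reference, the heap is set in the `knime.ini` file in the KNIME installation folder via the JVM `-Xmx` flag; on a 32 GB machine, a value that leaves headroom for the operating system and KNIME's own off-heap usage would look something like this (the exact number is just an example):

```
-Xmx24g
```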
The machine has 32gb of memory.
The row limit isn't going to have an effect on this; it shows as only one row in the preview window. I did manage to get this working in the end, although it took about 4 hrs to run, and uploaded the result into a database. I did notice a few strange things about the data, though: there were millions of blank rows and about 37,000 populated rows, which would be correct. Unfortunately I cannot provide any of this data.
I am using the workflow below:
Glad you found a way. Regarding the blank rows, it's hard to tell without the workflow itself, but maybe you can deal with them inside the Ungroup node, as it has skip options.
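A minimal plain-Python sketch of the same skip-blank-rows idea (an illustration, not the Ungroup node's actual implementation):

```python
# After flattening, keep only rows that actually carry a value; blank or
# missing entries are dropped, mirroring the Ungroup node's skip options.
rows = ["a", None, "", "b", None, "c"]
populated = [r for r in rows if r not in (None, "")]
print(populated)  # ['a', 'b', 'c']
```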
This is a great idea. Sometimes the simplest way is the best.
Thanks for your help.
I had a very similar problem to the one explained; let me copy a post I've written in another conversation.
My experience trying to just ungroup a JSON file:
JSON Reader + Ungroup → doesn't work, even though I don't have to select any info; I just want everything
JSON Reader + JSON Path + Ungroup → works
If the JSON Path is `$..*` it works, but gives lots of strange results after the correct ones
If the JSON Path is `$.*` it works well
From the beginning I was aware that I just wanted to take all the info from the JSON file and ungroup it, so super simple.
This has also solved the memory issues (probably it has drastically reduced the size of the files).
If you find any explanation for it I'm really interested; I spent about 10 hours to find this simple change.
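The behaviour described above is consistent with JSONPath semantics: `$.*` selects only the direct children of the root, while `$..*` does a recursive descent and returns every nested value, which explains both the extra "strange" results and the much larger output. A rough plain-Python sketch of the two selectors (an illustration, not KNIME's actual JSONPath engine):

```python
def direct_children(node):
    """Rough analogue of the JSONPath '$.*' selector: top-level values only."""
    if isinstance(node, dict):
        return list(node.values())
    if isinstance(node, list):
        return list(node)
    return []

def all_descendants(node):
    """Rough analogue of '$..*': every nested value, visited recursively."""
    out = []
    for child in direct_children(node):
        out.append(child)
        out.extend(all_descendants(child))
    return out

doc = {"orders": [{"id": 1, "items": ["a", "b"]},
                  {"id": 2, "items": ["c"]}]}
print(len(direct_children(doc)))  # 1  -> just the "orders" list
print(len(all_descendants(doc)))  # 10 -> every nested value, however deep
```

On a large document the recursive variant multiplies the row count dramatically, which also fits the memory problems described earlier in the thread.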
This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.