Does the ‘Table to JSON’ node limit JSON length?

I have a CSV file which has 3 columns.
Using the ‘Chunk Loop Start’ node, I’m reading 400 rows at a time. But when I view the JSON output, it ends abruptly. I’m also attaching the ‘Table to JSON’ settings I’m using.

So is ‘Table to JSON’ limited by length?

Hi @nikhil_benny, I would say this is more likely due to how the node renders/displays the output than to any actual limit on the length of the JSON produced.

In KNIME 5.2.2, I used a data generator to produce 2 million rows of data, then passed this to Table to JSON. I had to increase the amount of memory available to KNIME, as on the first run I ran out of Java heap space (you’ll know if that happens because an error message pops up!), but having done that, it ran through.
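For reference, the heap limit is controlled by the `-Xmx` line in the `knime.ini` file in your KNIME installation directory. The value below mirrors the 20G mentioned later in this thread; adjust it to what your system can spare:

```
-Xmx20g
```

KNIME needs a restart after editing `knime.ini` for the new limit to take effect.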

I didn’t see mine cut off the output, though. I’d think the three dots at the end of your screenshot are the renderer saying “and there’s more” rather than the output being abruptly cut off.

Of course, to prove this there is no way I could view 2 million rows of JSON output in a KNIME table directly, so I wrote the JSON to a file using Strings to Binary Objects followed by Binary Objects to Files.

I was then able to view the output using VS Code:

So, if you have any doubts, you may wish to test it in a similar way.
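If you’d rather sanity-check an exported JSON file outside KNIME, a short script works too. This is a minimal sketch, not KNIME-specific: it assumes the node produced a JSON array of row objects (the column names here are made up for illustration). Since `json.load` fails on truncated input, a clean parse plus the expected row count is a reasonable completeness check:

```python
import json
import os
import tempfile

# Hypothetical stand-in for the Table to JSON output: an array of
# row objects. The actual shape depends on the node's settings.
rows = [{"col1": i, "col2": f"name_{i}", "col3": i * 2} for i in range(400)]

path = os.path.join(tempfile.gettempdir(), "table_to_json_check.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(rows, f)

# json.load raises on truncated/invalid JSON, so a successful parse
# plus a matching row count confirms the file is complete.
with open(path, encoding="utf-8") as f:
    parsed = json.load(f)

print(len(parsed))  # prints 400
```

In your case you would point `path` at the file written by Binary Objects to Files and compare the count against the number of rows in the source table.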

Edit: I increased the Java heap space for KNIME to 20G (on a 32GB system) and had it generate 5 million rows. This produced a 1.1GB JSON file containing all 5 million rows 🙂


Thanks a lot for taking the time to reply. I was facing errors while using the JSON for an API call, namely an invalid data format, so I assumed this was the cause. To verify, I converted the JSON back to a table and it showed the correct number of entries. Thanks again.

