Storing JSON in Database

Hello,

We are doing a lot of API calls, API scrapes, etc., and get massive JSON files back in different complex formats. Parsing them involves a lot of manual work. Today we:

  1. Collect them
  2. Store them on disk
  3. Extract what we need and save it into MySQL
  4. Later go back to the JSON if we need more

Some of them are super complex and we don’t need all of it (now). We are now debating whether to just store them in a DB, because the file operations are slow.

What is the best way to do this?

Thank you

Hi @nxfxcom,

Based on my experience, if you are dealing with large JSON files, it would be wise to consider MongoDB or DynamoDB instead of saving the whole thing into table cells. That way, if you only need part of the JSON, you don’t have to pull everything back and parse it locally; you can query for just what you require. To work with MongoDB in KNIME, you could use the KNIME MongoDB Integration extension, and for DynamoDB, we have the KNIME Amazon DynamoDB Nodes.
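To illustrate the idea outside of KNIME, here is a minimal sketch using pymongo, assuming a local MongoDB instance; the database, collection, and field names are just placeholders. You insert the raw response once, and later retrieve only the part you need via a projection:

```python
from pymongo import MongoClient

# Assumes a local MongoDB instance; db/collection/field names are placeholders.
client = MongoClient("mongodb://localhost:27017")
collection = client["scrapes"]["api_responses"]

# Store the full JSON response once, straight from the API call.
doc_id = collection.insert_one({
    "source": "example-api",
    "payload": {"results": [{"name": "item-1", "details": {"huge": "..."}}]},
}).inserted_id

# Later, fetch only the fields you actually need (a projection),
# instead of pulling the whole document back and parsing it locally.
names_only = collection.find_one(
    {"_id": doc_id},
    {"payload.results.name": 1, "_id": 0},
)
print(names_only)
```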

Alternatively, PostgreSQL with the JSONB type is also an efficient way of storing JSON.
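As a rough sketch of that approach with psycopg2 (connection details, table, and column names are just examples), you could store each response in a JSONB column and then use PostgreSQL’s JSON operators to pull out individual values server-side:

```python
import psycopg2
from psycopg2.extras import Json

# Connection string and table/column names are placeholders.
conn = psycopg2.connect("dbname=scrapes user=postgres")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS api_responses (
        id      serial PRIMARY KEY,
        source  text,
        payload jsonb
    )
""")

# Store the raw JSON once; Json() adapts a Python dict to JSONB.
payload = {"results": [{"name": "item-1", "details": {"huge": "..."}}]}
cur.execute(
    "INSERT INTO api_responses (source, payload) VALUES (%s, %s)",
    ("example-api", Json(payload)),
)
conn.commit()

# Later, extract just the piece you need with JSONB operators
# (-> keeps JSON, ->> returns text) instead of reloading the whole file.
cur.execute(
    "SELECT payload->'results'->0->>'name' FROM api_responses WHERE source = %s",
    ("example-api",),
)
print(cur.fetchone()[0])  # item-1
```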

