I have to bring a collection with 70 million+ entries into KNIME from a MongoDB database. We have been trying the MongoDB Reader node connected to a JSON to Table node, but the MongoDB Reader keeps failing to read from the database. We have tried filtering the query and adding multiple such nodes, still with no luck. Is there any limit on the amount of data that can be brought in using these nodes? Is there a better way to bring a collection with this much data into KNIME efficiently?
I would sincerely appreciate your assistance with this!
Thanks and regards
My understanding of the issue is that you are trying to use a “MongoDB Reader” node to bring in 70M+ records through a “JSON to Table” node into KNIME for further processing. However, as you report, it is failing to read from the DB. If this is correct, please confirm and I will research the issue further.
Additionally, please provide the following for additional information/troubleshooting:
- What version(s) of KNIME Analytics Platform (AP), KNIME Server (KS), and – if applicable – KNIME Executor are being used?
- Is it failing to read from the database at all, or is it getting partway through the read and then choking? If the latter, how far does it get before it fails?
- Are you able to provide DEBUG logs for the time period covering this issue?
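In the meantime, since you mentioned filtering the query across multiple reader nodes: one pattern that often helps with very large collections is to partition the query into non-overlapping ranges on an indexed field, so each MongoDB Reader node only streams a bounded slice rather than all 70M+ documents. Below is a minimal sketch of how such filter documents could be generated; the field name `seq`, the bounds, and the chunk count are illustrative assumptions, not taken from your setup.

```python
# Hypothetical sketch: build non-overlapping $gte/$lt range filters over an
# indexed numeric field, one filter per MongoDB Reader node. Field name "seq"
# and the range bounds are assumptions for illustration only.

def range_filters(field, lo, hi, chunks):
    """Return `chunks` contiguous filter documents covering [lo, hi)."""
    step = (hi - lo) / chunks
    filters = []
    for i in range(chunks):
        start = lo + round(i * step)
        # Last chunk closes exactly at `hi` to avoid rounding gaps.
        end = hi if i == chunks - 1 else lo + round((i + 1) * step)
        filters.append({field: {"$gte": start, "$lt": end}})
    return filters

# Example: four filters covering sequence numbers 0..70,000,000 — each one
# would be pasted into the Query field of a separate MongoDB Reader node.
for f in range_filters("seq", 0, 70_000_000, 4):
    print(f)
```

Each filter is a plain MongoDB query document, so the same idea works whether the slices run in parallel branches or are looped over and concatenated downstream.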