Hi, I created a simple KNIME workflow that takes data from Postgres and pushes it to Snowflake.
There is currently no staging layer, like S3.
I’m just writing the records to an empty table in Snowflake, as shown below. Please ignore the error in the last node. For 10M rows, it’s only 5% done after 30 minutes, so at this rate it will take ages!
Hi,
unfortunately, the JDBC driver is very slow when it comes to writing massive amounts of data. The fastest way to load data into Snowflake is the COPY command. You could first load the data into S3, Azure Blob Storage, or Google Cloud Storage, depending on where your Snowflake instance is running, and then point the COPY command at the written file path. The COPY command can be executed using the DB SQL Executor node.
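As a rough sketch, assuming the exported file already sits in an S3 bucket, a DB SQL Executor node could run something like the statements below. The stage name, bucket path, table name, and file format are placeholders you would replace with your own values:

```sql
-- One-time setup: create an external stage pointing at the S3 location.
-- (Placeholder names and credentials; adjust to your environment.)
CREATE OR REPLACE STAGE my_s3_stage
  URL = 's3://my-bucket/exports/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

-- Bulk-load the staged file into the target table.
COPY INTO my_schema.my_table
  FROM @my_s3_stage/postgres_export.csv
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```

Since COPY loads the file in parallel inside Snowflake, this is much faster than inserting rows one batch at a time over JDBC.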
We plan to provide a dedicated Snowflake Connector in the future, which will come with support for the DB Loader node and use the COPY command under the hood.
Bye
Tobias