DB Writer to Snowflake

Hi, I created a simple KNIME workflow that takes data from Postgres and pushes it to Snowflake.
There is currently no staging layer, like S3.
I’m just writing the records to an empty table in Snowflake, as shown below. Please ignore the error in the last node. For 10M rows, it is only 5% done after 30 minutes, so it takes ages!

What should I do to increase the speed of writing?
Is it better to create a table with DB Table Creator first and then write the data?

@ksenkaya You might want to take a look at this example of how to transfer data between databases by streaming the content.


Hi,
unfortunately, the JDBC driver is very slow when it comes to writing massive amounts of data. The fastest way to load data into Snowflake is to use the COPY command. You could first load the data into S3, Azure Blob Storage, or Google Cloud Storage, depending on where your Snowflake instance is running, and then point the COPY command to the written file path. The COPY command can be executed using the DB SQL Executor node.
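For example, the statement you would run in the DB SQL Executor node could look like the sketch below. This assumes the data has already been written as CSV files to an external stage; the stage name `my_s3_stage`, the path, and the table name are placeholders for your own setup:

```sql
-- Hypothetical example: bulk-load staged CSV files into the target table.
-- @my_s3_stage, the exports/ path, and my_schema.my_table are placeholders.
COPY INTO my_schema.my_table
  FROM @my_s3_stage/exports/
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1)
  ON_ERROR = 'ABORT_STATEMENT';
```

Since the DB SQL Executor node simply passes the statement to Snowflake, any valid COPY options (file format, error handling, etc.) can be used.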
We plan to provide a dedicated Snowflake Connector in the future, which will come with support for the DB Loader node that uses the COPY command.
Bye
Tobias



Hello @ksenkaya and others who might be interested,

with KNIME version 4.4.0 there is a dedicated Snowflake Connector node which should offer faster data transfer. Check here for more info:

Br,
Ivan
