Snowflake Data Loading via Database Writer

Hello,

I’m trying to load some big data tables into Snowflake via KNIME using the Database Writer node. I adjusted the batch size, but it is still a lot slower than uploading the data via the Snowflake client over the same connection (15 min per 1 MM rows vs 2 min). I guess this is because the Database Writer uses an INSERT to load the data instead of a COPY.

Has anyone found a workaround for this?

thanks!
Lucian

Hi Lucian,
you could implement the COPY command yourself: use a supported file writer (e.g. CSV, Parquet, or ORC) to write the data to a local file, then use the Remote File Handling framework to upload the file to a file system the Snowflake database can read from. Finally, you can use the Database SQL Executor node to execute the COPY INTO command.
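
As a rough sketch (not tied to any particular setup), the statement executed by the Database SQL Executor node could look like this; the stage, table, and file names below are placeholders, and the file format options depend on how the file was written:

```sql
-- Hypothetical COPY INTO for a CSV file that was uploaded to a stage;
-- adjust the stage, table, file name, and format options to your setup.
COPY INTO MY_TARGET_TABLE
  FROM @MY_STAGE/my_export.csv
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```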
Bye
Tobias


@tobias.koetter - would you happen to know of a workflow example that demonstrates this orchestration through KNIME? Thanks!

Is it possible to use the DB Loader node for mass loading (millions of rows) into Snowflake? I couldn’t set up the driver to make it work. Did anyone manage to pull it off?

thanks!
Lucian

@lucian.cristian - the most performant way I found was to write the data to a Snowflake staging area in one node and then copy/put it into the target table. This requires some SQL coding, but if you run through the process in Snowflake it gives you sample SQL, which you’ll need to tweak slightly to work in KNIME. Hope this helps!
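
For reference, a minimal sketch of that staging approach in SQL might look like the following (all table names and the local path are placeholders; PUT targets an internal stage, so the upload step would differ if you use an external stage such as S3):

```sql
-- Hypothetical sketch: upload a local file to the table's internal stage,
-- then copy it into the table. Names and the local path are placeholders.
PUT file:///tmp/my_export.csv @%MY_TARGET_TABLE;

COPY INTO MY_TARGET_TABLE
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  PURGE = TRUE;  -- optionally remove the staged file after a successful load
```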
