Problem with SQL types when using Database Writer in a loop

I am trying to upload a series of CSV files to a MySQL database, where each CSV file is stored in a separate table. So far I have built a simple loop using "List Files", "Table Row to Variable Loop Start", "CSV Reader", "URL to File Path (Variable)" (to generate the SQL table name from the CSV file name), "Database Writer", and "Variable Loop End".

As the loop is designed to process many files, I want to rely on the automatic SQL Types functionality in the Database Writer node to create SQL table columns whose types match the KNIME column types.

This works fine, except when the same column name appears in more than one file with different types. For example:

  • column 'MYDATA1' in file1.csv contains integer values
  • column 'MYDATA1' in file2.csv contains string values

In this case, Database Writer processes file1 correctly, but fails on file2.

WARN      DatabaseWriterConnection     Error while adding row #1 (R0), reason: Incorrect integer value: 'thisIsAStringValue' for column 'MYDATA1' at row 1

Although the KNIME column type for 'MYDATA1' in file2 is correctly assigned as String, the Database Writer node creates an integer SQL column. It appears to be "remembering" the SQL type for column 'MYDATA1' from the previous iteration on file1.

Is there a way to force Database Writer to flush the SQL types and regenerate them correctly on each iteration? Or is this perhaps a bug?

No, it's not a bug. The Database Writer auto-configures itself from the current set of input columns and assigns each column a SQL type; see the SQL Types tab in the Database Writer dialog. The only options I see are to fix the column types before executing the loop, or to set the SQL type via a flow variable, for example one generated with the Java Edit Variable node.
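Outside KNIME, the per-iteration behaviour the questioner expects can be sketched in plain Python: infer each column's SQL type from the current file only, so a column name that held integers in one file and strings in the next gets a fresh type on every iteration. This is a simplified illustration, not KNIME's actual type mapping; the two-type rule (INTEGER vs. VARCHAR) and the helper names are assumptions for the sketch.

```python
import csv
import io

def infer_sql_type(values):
    # Simplified mapping: INTEGER if every value parses as an int,
    # otherwise VARCHAR(255). Real mappings handle more types.
    for v in values:
        try:
            int(v)
        except ValueError:
            return "VARCHAR(255)"
    return "INTEGER"

def create_table_sql(table_name, csv_text):
    # Build a CREATE TABLE statement from one CSV file, inferring
    # types from this file alone -- nothing carries over between calls.
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    cols = []
    for i, name in enumerate(header):
        column_values = [row[i] for row in data]
        cols.append(f"{name} {infer_sql_type(column_values)}")
    return f"CREATE TABLE {table_name} ({', '.join(cols)})"

file1 = "MYDATA1\n1\n2\n"                  # integer values
file2 = "MYDATA1\nthisIsAStringValue\n"    # string values

print(create_table_sql("file1", file1))  # MYDATA1 becomes INTEGER
print(create_table_sql("file2", file2))  # MYDATA1 becomes VARCHAR(255)
```

Because `create_table_sql` keeps no state between calls, the same column name is free to change type from one file to the next, which is exactly the per-iteration regeneration the loop needs.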