Database writer error

Dear KNIMErs,

I ran into a Database Writer error which seems like an arbitrary limitation:

	Execute failed: A table can only have one timestamp column.

Why, what's wrong with having many timestamps? :-(

Thanks
E

I'm pretty sure this message comes from the database and not from our node. What database are you using?

Thor,

I'm on SQL Server, which handles multiple datetimes just fine... I think the problem is KNIME defaulting to "timestamp" instead of "datetime" types... any easy fix proposals? :)

Thanks
E

You can set the SQL type of every column in the dialog.

I know (and it appears to work, BTW), but KNIME using "timestamp" as the factory default doesn't appear to be optimal, given certain DBs' constraints. :-) Apparently some MySQL versions have this limitation too...
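
For reference, here's a minimal repro of the SQL Server constraint in plain T-SQL (table and column names made up). As far as I can tell, timestamp there isn't a date/time type at all but a deprecated synonym for rowversion, an automatically generated 8-byte value, of which a table may declare at most one:

    -- "timestamp" in SQL Server is a synonym for rowversion, not a date/time type
    CREATE TABLE dbo.demo (
        created timestamp,
        updated timestamp  -- fails: a table can only have one timestamp column
    );

Declaring the columns as datetime or datetime2 instead works as expected.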

-E

I agree with @Ergonomist. You can set a column's type in the dialog, but that isn't always possible: consider processing multiple tables in a loop, or not knowing the columns in advance for some reason.
It'd be great to have a node that transforms timestamp columns to the datetime type, but I haven't found one yet.

Well, I’ve just learnt that this issue has led me into serious trouble.

My task is to transfer several tables from a PostgreSQL database to an on-premises Microsoft SQL Server. I have a configuration table which contains the schemas and names of all the tables to be transferred. My workflow loads the configuration table and iterates over its rows. For each row it reads the source table and writes the data to a temporary table, from which the data is later copied into the target table.

If the source table contains more than one column of DateTime type, the insert into the temporary table fails with the error mentioned above. I’m not able to configure SQL Types in the Database Writer node, since the structure of the tables the writer is expected to process varies. There’s no workaround for this, because it is the Database Writer node itself that casts DateTime columns to the timestamp type, so there’s nothing I can do with the data to prevent the node from failing. Moreover, I can’t configure the Database Writer node’s SQL Types dynamically using flow variables, because the configuration only allows setting the type of a column referenced by its name.
This is why I can’t configure SQL Types for multiple tables processed in a loop.
I think the only way is to assemble and execute the DDL for the temporary table myself, based on the structure of the source table, roughly as sketched below.
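The type mapping below is deliberately incomplete and my_table is a placeholder, so treat this as a sketch of the idea rather than a finished implementation. Run against the PostgreSQL source, it builds a CREATE TABLE statement for SQL Server that uses datetime2 where the writer would have used timestamp:

    -- builds SQL Server DDL from PostgreSQL catalog metadata
    SELECT 'CREATE TABLE tmp_' || table_name || ' ('
           || string_agg(
                column_name || ' ' ||
                CASE data_type
                  WHEN 'timestamp without time zone' THEN 'datetime2'
                  WHEN 'timestamp with time zone'    THEN 'datetimeoffset'
                  WHEN 'character varying'           THEN 'nvarchar(' || coalesce(character_maximum_length::text, 'max') || ')'
                  WHEN 'integer'                     THEN 'int'
                  ELSE data_type                     -- extend the mapping as needed
                END,
                ', ' ORDER BY ordinal_position)
           || ');' AS ddl
    FROM information_schema.columns
    WHERE table_schema = 'public' AND table_name = 'my_table'
    GROUP BY table_name;

The generated statement can then be executed on SQL Server (e.g. with a Database SQL Executor node) before the writer runs, so the writer only appends to an existing table instead of creating one.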
Note: I decided to employ the intermediate insert into a temporary table for performance reasons. It sped up the transfer of the data (4M records or even more) from a number of hours to tens of minutes.
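For completeness, the final copy from the temporary table into the target is then a single set-based statement on the SQL Server side (the names are placeholders):

    -- copy everything from the staging (temporary) table into the target
    -- the TABLOCK hint can enable minimal logging under the simple or
    -- bulk-logged recovery model, which helps at this data volume
    INSERT INTO dbo.target_table WITH (TABLOCK)
    SELECT * FROM dbo.staging_table;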

This is a really unfortunate “feature”. We have been trying to use KNIME to move several tables from Oracle to SQL Server in Azure in an automated fashion, but this choice of “timestamp” as the data type for datetime fields makes this impossible. Using the same method to move from Oracle to Oracle works really well.

If you change the timestamp field to binary(8) on SQL Server, it works perfectly.
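
That fits with timestamp being a synonym for rowversion: the values are just eight opaque bytes, so a plain binary(8) column holds them without triggering the one-per-table restriction. A small sketch (names made up):

    -- plain binary(8): same width as rowversion, but no auto-update
    -- and no limit of one per table
    CREATE TABLE dbo.demo_versions (
        id int NOT NULL,
        v1 binary(8) NULL,
        v2 binary(8) NULL  -- a second one is fine
    );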
