When the node guesses data types from the data input, it uses plain
varchar for string columns, which raises an error (at least on Teradata connections), since this data type requires the size to be specified.
It would be better if it defaulted to
varchar(255), as in the old DB nodes. I could then trim it to the desired size, and at least users wouldn't get a confusing error.
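For illustration, the difference in the generated DDL is roughly the following (the table and column names are hypothetical; the exact statement the node emits may differ):

```sql
-- What the node currently generates; Teradata rejects this,
-- because VARCHAR requires an explicit length:
CREATE TABLE my_table (name VARCHAR);

-- What a safer default would generate; the 255 can later be
-- trimmed to the size actually needed:
CREATE TABLE my_table (name VARCHAR(255));
```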
The exception is indeed confusing, but it is the exception created by the JDBC driver.
We already have an open ticket for this issue, to provide only the valid types for the given database type. I will link your request there.
Until it is resolved, you can provide your own custom type mapping. You have obviously already switched on dynamic type mapping. In the “Dynamic Type Settings” tab you can set the type for individual columns, or map KNIME types to the SQL type you want. You can set varchar(255) there.
Thank you, Mareike.
Just to clarify - with the Dynamic Types option ON, the data types are not editable.
What I have to do is switch it on to import the column metadata as-is, and then switch it off and edit manually.
We have just released KNIME 3.7 with a lot of changes in the new database framework, including a fix for the problem you mentioned here. Please try out the latest version and let us know what you think.
This issue is resolved nicely in 3.7 - I can configure the database connection node with all the conversions, and all downstream nodes just inherit it.
However, there now seems to be a minor inconvenience: I have to execute the upstream nodes to be able to configure the downstream ones, otherwise I get a “Database connection not available” error. This could be a problem if an upstream node takes a long time to execute, but so far that has been rare.
The “Database connection not available” message appears for all nodes that require an input specification, e.g. nodes with a column selection drop-down box, where we cannot avoid it.