I am trying to load data into a Google BigQuery table. I can successfully connect to my source data on a Microsoft SQL Server and also successfully connect to my Google BigQuery table. Everything runs fine until I get to the DB Loader, where I get the following error:
Execute failed: Consumption path could not be found from Local Date Time through DATE (INT32) to DATETIME.
I am new to KNIME, so any insight into this error would be helpful.
The error indicates that a mapping from the KNIME type to the BigQuery type could not be found for a specific column. It seems you have a Local Date Time column in your KNIME table and a DATETIME column in your BigQuery table. Is that correct?
There is an Output Type Mapping tab in every connector node. Have you modified it or left it at the default mapping? Also, have you tried the DB Insert node?
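For reference, BigQuery's DATETIME stores a wall-clock value without a time zone, while TIMESTAMP stores an absolute instant; KNIME's Local Date Time corresponds to the former and Zoned Date Time to the latter. A minimal sketch of a matching target table (dataset, table, and column names are made up):

    -- hypothetical target table
    CREATE TABLE my_dataset.my_table (
      id INT64,
      event_dt DATETIME,   -- pairs with KNIME Local Date Time (no time zone)
      event_ts TIMESTAMP   -- pairs with KNIME Zoned Date Time (absolute instant)
    );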
Ivan, thank you for the insight. I was able to work around the problem using Date&Time converter nodes.
Another BigQuery question. I am able to successfully load data from my local desktop. However, when I move the workflow to an application server, I get the following message when I try to load using the DB Loader node: Execute failed: ("ExitCodeException")
I can connect to Google BigQuery just fine; I am only getting a load error. When I try DB Insert I get the following error: Execute failed: Error while adding row #(Row1), reason: [Simba]JDBC Null pointer exception.
Ideally I would like to use updates and inserts to load and update data after the initial bulk load.
Hm… What is the application server? A KNIME Server, or a more powerful machine where you are running KNIME Analytics Platform? Did you install the driver the same way as on your local machine?
Just a normal server, similar to a remote desktop. I was able to get the database to write using the DB Loader after doing some updates to KNIME. Do you know if KNIME supports using DB Insert and DB Update with BigQuery? I tried using DB Update and got this:
Update 0:56 Execute failed: Error while adding row #(Row0), reason: [Simba]BigQueryJDBCDriver Error executing query job. Message: Unparseable query parameter `` in type TYPE_TIMESTAMP, Invalid timestamp: 'null' value: 'null'
I would really like to use updates and inserts if possible. Here is a screenshot of my DB Update configuration:
It does. From your error message, it seems you are not allowed to insert null values into a TYPE_TIMESTAMP column. Also keep in mind that Google BigQuery has restrictions on DML (insert, update, delete) statements. Check here for more:
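For illustration, DB Insert and DB Update issue standard SQL DML statements, roughly like the sketch below (project, dataset, and column names are made up); each such statement runs as a query job and counts against BigQuery's DML limits:

    -- hypothetical statement of the kind DB Update generates
    UPDATE `my_project.my_dataset.my_table`
    SET status = 'processed'
    WHERE load_date < '2019-01-01';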
Is there a setting somewhere in KNIME to allow NULL values to be passed? Like I said, the DB Loader node succeeds, and my BigQuery database is set up to allow NULL values for all fields.
Hello dpaoly,
this is unfortunately a bug in the JDBC driver from Simba that is provided by Google. The driver does not support NULL values in prepared statements, and unfortunately this bug has existed for quite some time. The DB Loader uses a different API than JDBC, which is why it supports NULL values.
So the only thing I can suggest here is to ask Google support to add NULL support to the JDBC driver they provide for BigQuery.
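To make the failure concrete, here is a sketch of the kind of parameterized statement DB Update prepares (table and column names are made up); binding a missing KNIME cell as a NULL parameter is the step the Simba driver cannot handle:

    -- hypothetical prepared statement; the '?' are JDBC parameter placeholders
    UPDATE `my_project.my_dataset.my_table`
    SET event_ts = ?   -- binding NULL here yields "Unparseable query parameter ... in type TYPE_TIMESTAMP"
    WHERE id = ?;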
Bye
Tobias
One more question: is it possible to populate datetime fields with a constant date? It looks like the only options for Date&Time are nothing, next value, previous value, and remove row.
Another related question: if fields exist in my source database but don't exist in my destination database, is it possible to still load the data?
Currently not with the Missing Value node. There is already an open ticket for this, to which I'll add a +1 (internal reference: AP-8160). As a workaround you can use the Rule Engine node and its MISSING operator to fill in a fixed date; see the sketch below. (In the case of multiple columns you have to use a separate node for each column, and your column(s) will be converted to string, so you have to use String to Date&Time to convert them back.) To do it all in one node you can use Column Expressions.
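For example, with the replace-column option enabled, the two rules could look like this ($my_date$ and the fallback date are placeholders; the output column comes out as String, hence the String to Date&Time step afterwards):

    MISSING $my_date$ => "1900-01-01T00:00:00"
    TRUE => $my_date$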
If you filter out those fields (columns), then yes. Otherwise you have to expand your destination table prior to loading (to the best of my database operations knowledge).
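If you do need the extra columns, recent BigQuery releases also let you extend the destination table with standard SQL before loading; a hedged sketch (table and column names are made up):

    -- hypothetical: add the missing column to the destination table first
    ALTER TABLE `my_project.my_dataset.my_table`
    ADD COLUMN IF NOT EXISTS extra_col STRING;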
Please feel free to open a new topic for any question not related to Google BigQuery.