DBInsert Passing Null Records

I have a process that has been in place for months: it uses DB Insert to load records into an Oracle global temporary table, which are then inserted into a production table. Once DB Insert has run, it shows a "1" insert status for each record, which can be seen in the "Input Data with Insert Status" output.

However, if I query the temp table, it's empty, which has never happened before. As a result, null records are being passed to the production table, and that insert fails. The node is configured to cache tables in memory, so I tried "write tables to disc", but the same thing happens. It recognizes the schema and table and returns no error, so why isn't it loading the temp table?

Hi @kevinnay,

This is strange given that you said it was previously working, but I’ll ask some questions anyway as something may have been overlooked.

Is the Global Temporary Table (GTT) defined with “on commit PRESERVE rows”?

Alternatively, are the DB Insert node and the DB Query/DB Reader (or equivalent) within the same database transaction?

e.g.

In the following screenshots, the DB SQL Executor has been configured to create a GTT but it has not been explicitly set to preserve rows on commit.

CREATE GLOBAL TEMPORARY TABLE temp1(
    id INT,
    description VARCHAR2(100)
)

If the GTT is not set to preserve rows… this will work (rows can be read by DB Reader)

But this will not work (DB Reader produces an Empty table)

If, however it has been set to preserve rows, the DB Reader is able to read the rows written to the table:

CREATE GLOBAL TEMPORARY TABLE temp1(
    id INT,
    description VARCHAR2(100)
)
ON COMMIT PRESERVE ROWS
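The difference is easy to demonstrate directly in a SQL client (a sketch, using the `temp1` example table above; note that rows in a GTT are only ever visible to the session that inserted them):

```sql
-- With the default, ON COMMIT DELETE ROWS, the rows are purged at commit:
INSERT INTO temp1 (id, description) VALUES (1, 'first row');
COMMIT;
SELECT COUNT(*) FROM temp1;   -- returns 0

-- With ON COMMIT PRESERVE ROWS, the same sequence keeps the rows
-- for the lifetime of the session:
INSERT INTO temp1 (id, description) VALUES (1, 'first row');
COMMIT;
SELECT COUNT(*) FROM temp1;   -- returns 1 (until the session ends)
```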


Yes, the temp table is set with ON COMMIT PRESERVE ROWS, which is why it worked previously. However, the DBAs recently migrated the user logins to new login profiles, and it was unclear whether the roles had changed; since everything else appears to work, that doesn't seem to be the issue. What I have experienced from time to time is that changes people make to nodes creep in when KNIME is updated. I had a similar issue with Excel Writer, which I believe still doesn't work to this day.

My question is this: if DB Insert is holding those records in memory, then it's incorrectly displaying an insert status, which should indicate the records are already in the temp table. It's followed by a DB SQL Executor that tries to pull the contents of the table and fails because there are no records.

This is the table format:

CREATE GLOBAL TEMPORARY TABLE schema.tablename (
    RID NUMERIC GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
    col0 datatype,
    col1 datatype,
    col2 datatype,
    col3 datatype
) ON COMMIT PRESERVE ROWS;

ALTER TABLE schema.tablename ADD ( CONSTRAINT X_PK PRIMARY KEY (RID) ENABLE VALIDATE);

CREATE INDEX I_tablename ON tablename (col0, col1, col2, etc);

DB Insert picks up the schema and table name from a flow variable, and the node runs to completion without error. However, it never loads the temp table. I suppose I should try changing it to a regular table, but it shouldn't behave any differently, and it was working fine previously.

I ran a few more tests, and DB Insert is simply not inserting. No idea what's happening: the global temp table is there with no records, yet DB Insert shows it completed with a "1" insert status for each row, so it's anyone's guess what's going on.

This is the debug output. DB Insert thinks it succeeded in loading data into the temp table, but it never does: running the query manually returns zero records, even though the temp table persists. The log says the records are committed, but nothing is there, so I'm trying to figure out how to troubleshoot something like this that was previously working.

DEBUG DB Insert 3:155 Acquiring connection.
DEBUG DB Insert 3:155 All the statements have already been closed.
DEBUG DB Insert 3:155 The connection has been relinquished.
DEBUG DB Insert 3:155 The managed connection has been closed.
DEBUG DB Insert 3:155 The transaction managing connection has been closed.
DEBUG DB Insert 3:155 Configure succeeded. (DB Insert)
DEBUG DB Insert 3:155 DB Insert 3:155 has new state: CONFIGURED
DEBUG DB SQL Executor 3:145 Configure succeeded. (DB SQL Executor)
DEBUG DB Insert 3:155 DB SQL Executor 3:145 has new state: CONFIGURED
DEBUG WorkflowEditor Workflow event triggered: WorkflowEvent [type=NODE_SETTINGS_CHANGED;node=3:155;old=null;new=null;timestamp=Jan 24, 2024, 1:14:50 PM]
DEBUG ExecuteAction Creating execution job for 1 node(s)…
DEBUG NodeContainer DB Insert 3:155 has new state: CONFIGURED_MARKEDFOREXEC
DEBUG NodeContainer DB Insert 3:155 has new state: CONFIGURED_QUEUED
DEBUG NodeContainer Project_v15 3 has new state: EXECUTING
DEBUG DB Insert 3:155 DB Insert 3:155 doBeforePreExecution
DEBUG DB Insert 3:155 DB Insert 3:155 has new state: PREEXECUTE
DEBUG DB Insert 3:155 Adding handler 7bcfdfd5-d2fa-4472-a6c4-22427bb4539c (DB Insert 3:155: ) - 47 in total
DEBUG DB Insert 3:155 DB Insert 3:155 doBeforeExecution
DEBUG DB Insert 3:155 DB Insert 3:155 has new state: EXECUTING
DEBUG DB Insert 3:155 DB Insert 3:155 Start execute
DEBUG DB Insert 3:155 KNIME Buffer cache statistics:
DEBUG DB Insert 3:155 11 tables currently held in cache
DEBUG DB Insert 3:155 35 distinct tables cached
DEBUG DB Insert 3:155 2 tables invalidated successfully
DEBUG DB Insert 3:155 23 tables dropped by garbage collector
DEBUG DB Insert 3:155 41 cache hits (hard-referenced)
DEBUG DB Insert 3:155 123 cache hits (softly referenced)
DEBUG DB Insert 3:155 2 cache hits (weakly referenced)
DEBUG DB Insert 3:155 1 cache misses
DEBUG DB Insert 3:155 Using Table Backend “BufferedTableBackend”.

DEBUG DB Insert 3:155 Acquiring connection.
DEBUG DB Insert 3:155 Create SQL statement as prepareStatement: INSERT INTO “SCHEMA”.“TEMP_TABLE” (“GA_ENROLLMENT_ID”, “GA_PROMOCODE”, “GA_REFERRER”, “GA_SOURCE”, “GA_MEDIUM”, “GA_EVENT”, “GA_DATE_HOUR”, “GA_CHANNEL”, “GA_SESSIONS”, “STATUS”, “ZIPCODE”, “CREATIONDATE”, “ADDRESSLINE1”, “ADDRESSLINE2”, “STATE”, “CITY”, “ZIP5”, “ENROLLMENTLANGUAGE”, “PLAN”, “FIRST_NAME”, “LAST_NAME”, “EXTERNAL_ACCOUNT”, “USACFORM”, “ADL”, “PREVIOUSENROLLMENTNUMBER”, “QUALIFYDATE”, “UNQUALIFYDATE”, “PROMO_TYPE”, “PROMO_REPORT_TYPE”, “GEOLOCATION”, “FIRST_CALL_DATE”, “ANNIVERSARY_DATE”, “IPADDRESS”, “NOCOMMISSION”, “FUNDINGTYPE”, “APPLICATIONTOKEN”, “ROOT_DOMAIN”, “HTTP_STATUS_CODE”, “HTTP_DESCRIPTION”) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
DEBUG DB Insert 3:155 Using table format org.knime.core.data.container.DefaultTableStoreFormat
DEBUG DB Insert 3:155 Committing the small-scale transaction…
DEBUG DB Insert 3:155 The small-scale transaction has been successfully committed.

DEBUG DB Insert 3:155 Closing the small-scale transaction…
DEBUG DB Insert 3:155 Enabling auto-commit mode…
DEBUG DB Insert 3:155 The small-scale transaction has been successfully closed.
DEBUG DB Insert 3:155 The small-scale transaction has been removed.
DEBUG DB Insert 3:155 All the statements have already been closed.
DEBUG DB Insert 3:155 The connection has been relinquished.
DEBUG DB Insert 3:155 The managed connection has been closed.
DEBUG DB Insert 3:155 The transaction managing connection has been closed.
DEBUG DB Insert 3:155 Using Table Backend “BufferedTableBackend”.
INFO DB Insert 3:155 DB Insert 3:155 End execute (2 mins, 58 secs)
DEBUG DB Insert 3:155 DB Insert 3:155 doBeforePostExecution
DEBUG DB Insert 3:155 DB Insert 3:155 has new state: POSTEXECUTE
DEBUG DB Insert 3:155 DB Insert 3:155 doAfterExecute - success
DEBUG DB Insert 3:155 DB Insert 3:155 has new state: EXECUTED

DEBUG DB SQL Executor 3:145 Configure succeeded. (DB SQL Executor)
DEBUG DB Insert 3:155 Project_v15 3 has new state: IDLE
DEBUG DB Insert 3:155 Buffer file (C:\Users\ID\AppData\Local\Temp\knime_Project_12675\knime_container_20240124_6869753549474515344.bin.snappy) is 2.98MB in size
DEBUG NodeContainerEditPart DB Insert 3:155 (EXECUTED)
DEBUG NodeContainerEditPart DB SQL Executor 3:145 (CONFIGURED)
DEBUG ExecuteAction Creating execution job for 1 node(s)…
DEBUG NodeContainer DB SQL Executor 3:145 has new state: CONFIGURED_MARKEDFOREXEC
DEBUG NodeContainer DB SQL Executor 3:145 has new state: CONFIGURED_QUEUED
DEBUG NodeContainer Project_v15 3 has new state: EXECUTING
DEBUG DB SQL Executor 3:145 DB SQL Executor 3:145 doBeforePreExecution
DEBUG DB SQL Executor 3:145 DB SQL Executor 3:145 has new state: PREEXECUTE
DEBUG DB SQL Executor 3:145 Adding handler 4e17839c-29a0-48f0-ab75-2be87d118ee6 (DB SQL Executor 3:145: ) - 48 in total
DEBUG DB SQL Executor 3:145 DB SQL Executor 3:145 doBeforeExecution
DEBUG DB SQL Executor 3:145 DB SQL Executor 3:145 has new state: EXECUTING
DEBUG DB SQL Executor 3:145 DB SQL Executor 3:145 Start execute
DEBUG DB SQL Executor 3:145 Acquiring connection.
DEBUG DB SQL Executor 3:145 All the statements have already been closed.
DEBUG DB SQL Executor 3:145 The connection has been relinquished.
DEBUG DB SQL Executor 3:145 The managed connection has been closed.
DEBUG DB SQL Executor 3:145 The transaction managing connection has been closed.
DEBUG DB SQL Executor 3:145 reset

ERROR DB SQL Executor 3:145 Execute failed: ORA-00942: table or view does not exist

Sorry @kevinnay , the forum didn’t notify me of your additional posts because I wasn’t tagged or directly replied to. I’ve turned up my tracking setting now for this thread.

I see you mentioned that maybe you should try to write to a regular table and see if that works. That was my thought too. Have you tried that?

A second thing to try: write a valid SQL INSERT statement that inserts a single row into the GTT using a DB SQL Executor node, e.g.

INSERT INTO temptable (col1, col2) VALUES ('x', 'y')

Possibly do a second one that writes to an actual table. See what happens.

By the way, at the end of that log there is an ORA-00942 error.

What is the table that doesn’t exist? Is that an expected message?

Also, is your schema really called “SCHEMA” and is your temp table called “TEMP_TABLE” or have you modified that in the log before posting here?

Sorry if I’m appearing to ask some dumb questions but knowing all the details is important to me when troubleshooting something like this, especially remotely. :wink:
Last question… (for now :slightly_smiling_face:) Can you write to the GTT outside of KNIME (e.g. in SQL Developer)? Do you have that facility? Does it work?

Oh no… Another question, lol… Is the actual name of the temp table all in capitals in the database? Did you always have quoted identifiers switched on for this connection in KNIME?

Hi, I just substituted schema and table_name as placeholders. I can manually insert into that temp table without a problem. The "table does not exist" error comes from DB SQL Executor, which handles some DML statements, including inserting into a production table from the GTT, which it's not finding.

I swapped in a regular table to see if that would work, although a temp table is ideal because you don't have to delete it, and KNIME doesn't make conditional DML easy. It didn't accept private temporary table syntax, so I just went with a regular table. That didn't work either, so the issue seems to be DB Insert, though I have no idea how to troubleshoot a compiled node that stops working.

I'd skip DB Insert, but I need to join the temp data with a Joiner, and this seems to be the best option unless I can craft something to pass to DB SQL Executor, which doesn't have a data input port. Bottom line: I shouldn't have to change anything, since this process ran like clockwork for months, but I can't explain why DB Insert isn't actually writing to the temp table.

It's logged in as me. The only other thing I can think of is that the DBMS is rejecting the client even though it's connecting under my profile, but if it were failing I would expect to see an error. It just seems buggy to me.

@kevinnay if you give db insert the name of a table that doesn’t exist, do you get an Oracle error message?

Yes, it won't allow execution without a valid table. The temp table and the regular table are both there, and I can access them through a SQL client, but DB Insert is lying to me when it says it's inserting records, because it isn't.

This is what debug is telling me:

DEBUG DB Insert 3:134 Using table format org.knime.core.data.container.DefaultTableStoreFormat

Is there something going on with temp files or something else that might lead DB Insert to think it did something when in fact it didn't? If it can't insert, it should fail. I've already tested the insert manually and it works fine, so it's not the SQL.

This is the rest of the debug output:

DEBUG DB Insert 3:134 DB Insert 3:134 doBeforePreExecution
DEBUG DB Insert 3:134 DB Insert 3:134 has new state: PREEXECUTE
DEBUG DB Insert 3:134 Adding handler 28c0b494-0837-4327-9872-d88befe9fb23 (DB Insert 3:134: ) - 47 in total
DEBUG DB Insert 3:134 DB Insert 3:134 doBeforeExecution
DEBUG DB Insert 3:134 DB Insert 3:134 has new state: EXECUTING
DEBUG DB Insert 3:134 DB Insert 3:134 Start execute
DEBUG DB Insert 3:134 Using Table Backend “BufferedTableBackend”.
DEBUG DB Insert 3:134 Acquiring connection.
DEBUG DB Insert 3:134 Create SQL statement as prepareStatement: INSERT INTO “SCHEMA”.“TEMP_TABLE” (“GA_ENROLLMENT_ID”, “GA_PROMOCODE”, “GA_REFERRER”, “GA_SOURCE”, “GA_MEDIUM”, “GA_EVENT”, “GA_DATE_HOUR”, “GA_CHANNEL”, “GA_SESSIONS”, “STATUS”, “ZIPCODE”, “CREATIONDATE”, “ADDRESSLINE1”, “ADDRESSLINE2”, “STATE”, “CITY”, “ZIP5”, “ENROLLMENTLANGUAGE”, “PLAN”, “FIRST_NAME”, “LAST_NAME”, “EXTERNAL_ACCOUNT”, “USACFORM”, “ADL”, “PREVIOUSENROLLMENTNUMBER”, “QUALIFYDATE”, “UNQUALIFYDATE”, “PROMO_TYPE”, “PROMO_REPORT_TYPE”, “GEOLOCATION”, “FIRST_CALL_DATE”, “ANNIVERSARY_DATE”, “IPADDRESS”, “NOCOMMISSION”, “FUNDINGTYPE”, “APPLICATIONTOKEN”, “ROOT_DOMAIN”, “HTTP_STATUS_CODE”, “HTTP_DESCRIPTION”) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
DEBUG DB Insert 3:134 Using table format org.knime.core.data.container.DefaultTableStoreFormat

@kevinnay In these situations I would look for actual Oracle error messages, i.e. an ORA-nnnnn code. Can you make DB Insert attempt to insert invalid data, such as putting NULL into a NOT NULL column? If you can get an Oracle error, you can tell that it is interacting with the database rather than just "pretending".
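One quick way to provoke a genuine Oracle error is a deliberately invalid statement (a sketch; the table and column names here are hypothetical placeholders):

```sql
-- Assuming not_null_col is declared NOT NULL, this should raise:
-- ORA-01400: cannot insert NULL into ("SCHEMA"."SOME_TABLE"."NOT_NULL_COL")
INSERT INTO some_table (not_null_col) VALUES (NULL);
```

If the node surfaces the ORA error, the statements really are reaching the database.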

DB SQL Executor returns ORA-01400 because its INSERT statement is receiving null values where it should have valid ones. DB Insert indicates it has successfully inserted records into the table that DB SQL Executor is pulling from, but all it's getting are nulls. Honestly, I can't tell which node is the culprit. Actually, that's not true: DB Insert never gets records into the temp table, because when I query it manually there are no records there.

If I run the SELECT portion of the INSERT query manually I get records back, so DB Insert is not inserting records into the temp table. DB SQL Executor can see all the flow variables and subsequently execute the queries without fail. It can also retrieve the metadata for the columns, so DB SQL Executor can see the table and return the columns in it.

The query logic itself wouldn’t prevent it from returning records so there has to be something going on with the nodes themselves. I even tried pulling a new instance of DB Insert and DB SQL Executor to see if by any chance that would reset something, but I got identical results. All my prior versions have the same configuration and all ran successfully.

The only thing I can figure is that KNIME updated and something was changed. Is there a node history I can look at? How about node versions? Can those be rolled back?

I just need to join the Joiner node's output with the temp table data, which wouldn't be an issue if I weren't using a Joiner in the first place. I don't see any other nodes for managing inserts, so that's a real show-stopper.

I tried something completely different: I pulled DB Insert into a new project and passed a random number to it, and it worked. How is that possible when everything is identical?

DEBUG DB Insert 4:150 Acquiring connection.
DEBUG DB Insert 4:150 Create SQL statement as prepareStatement: INSERT INTO “SCHEMA”.“DUMMY_TABLE” (“ID”) VALUES (?)
DEBUG DB Insert 4:150 Using table format org.knime.core.data.container.DefaultTableStoreFormat
DEBUG DB Insert 4:150 Committing the small-scale transaction…
DEBUG DB Insert 4:150 The small-scale transaction has been successfully committed.
DEBUG DB Insert 4:150 Closing the small-scale transaction…
DEBUG DB Insert 4:150 Enabling auto-commit mode…
DEBUG DB Insert 4:150 The small-scale transaction has been successfully closed.
DEBUG DB Insert 4:150 The small-scale transaction has been removed.
DEBUG DB Insert 4:150 All the statements have already been closed.
DEBUG DB Insert 4:150 The connection has been relinquished.
DEBUG DB Insert


Honestly I can’t tell which node is the culprit. DB Insert appears to be getting records into the non-GTT temp table, but DB SQL Executor is still getting nulls. Previously, DB Insert was not getting records into the temp table, so now it seems that DB SQL Executor is receiving null inputs.

This query is an insert-from-select statement, but the debug output shows it accepting null values, unless those ? marks are supposed to be input parameters. Where are the input values coming from?

DEBUG DB Insert 3:134 Create SQL statement as prepareStatement: INSERT INTO “SCHEMA”.“TEMP_TABLE” (“GA_ENROLLMENT_ID”, “GA_PROMOCODE”, “GA_REFERRER”, “GA_SOURCE”, “GA_MEDIUM”, “GA_EVENT”, “GA_DATE_HOUR”, “GA_CHANNEL”, “GA_SESSIONS”, “STATUS”, “ZIPCODE”, “CREATIONDATE”, “ADDRESSLINE1”, “ADDRESSLINE2”, “STATE”, “CITY”, “ZIP5”, “ENROLLMENTLANGUAGE”, “PLAN”, “FIRST_NAME”, “LAST_NAME”, “EXTERNAL_ACCOUNT”, “USACFORM”, “ADL”, “PREVIOUSENROLLMENTNUMBER”, “QUALIFYDATE”, “UNQUALIFYDATE”, “PROMO_TYPE”, “PROMO_REPORT_TYPE”, “GEOLOCATION”, “FIRST_CALL_DATE”, “ANNIVERSARY_DATE”, “IPADDRESS”, “NOCOMMISSION”, “FUNDINGTYPE”, “APPLICATIONTOKEN”, “ROOT_DOMAIN”, “HTTP_STATUS_CODE”, “HTTP_DESCRIPTION”) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)

I just extended the test project I set up to see whether I could insert into the same temp table and then insert-from-select with DB SQL Executor, and it worked. It's the exact same thing as in the other project, yet that one continues to receive nulls. Do I have to rebuild the entire project to get it to work? They're virtually identical, but only the newer one works.


This has been resolved.

There is an identity key in the production table, and our resident DBAs may have changed something, because the previous versions did not require an explicit insert of the identity value from the temp table. That's what was driving me nuts: the null error is the identity column receiving no value, because it was never part of the SELECT. Clearly something I have to take back to the folks who recently made changes to the environment.
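For reference, this matches how Oracle identity columns behave (a sketch; t1 and t2 are hypothetical tables, not the production schema):

```sql
-- GENERATED ALWAYS: the column must NOT appear in the insert list;
-- supplying a value raises ORA-32795.
CREATE TABLE t1 (rid NUMBER GENERATED ALWAYS AS IDENTITY, col0 VARCHAR2(10));
INSERT INTO t1 (col0) VALUES ('ok');           -- identity generated automatically
INSERT INTO t1 (rid, col0) VALUES (1, 'no');   -- ORA-32795

-- GENERATED BY DEFAULT: an INSERT ... SELECT that omits the column still
-- works, but selecting NULL into it raises ORA-01400 (identity columns
-- are implicitly NOT NULL), which matches the error seen here.
CREATE TABLE t2 (rid NUMBER GENERATED BY DEFAULT AS IDENTITY, col0 VARCHAR2(10));
INSERT INTO t2 (col0) SELECT col0 FROM t1;             -- identity generated
INSERT INTO t2 (rid, col0) SELECT NULL, col0 FROM t1;  -- ORA-01400
```

So if the DBAs changed the identity definition on the production table, an insert-from-select that never listed the identity column could start failing with ORA-01400 without any change to the workflow itself.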

Thanks for all the input.


Glad you got to the bottom of it @kevinnay.

It doesn't feel right that such a thing causes what is effectively a "silent failure" within the workflow, but without seeing the actual environment it's difficult to suggest how to avoid it in future, or whether there is any way it could be trapped.

Anyway, good you found the answer.