Database Connection Reader Caching is Slow

I have a Database Connector to an Oracle Database using ojdbc8.jar

I am creating a metanode flow that allows users to see the schemas, tables, and columns within the connected database.

My current workflow connects to the Oracle database using an Oracle thin connection string, the same string I use in Oracle SQL Developer, Toad for Oracle, etc.

The Database Connector then feeds the Database Reader. I want users to be able to read the results of SELECT * FROM ALL_TABLES, and another flow for SELECT * FROM ALL_TAB_COLS.
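
For reference, here is a rough sketch of the two queries behind those flows. The explicit column lists and ORDER BY are only an illustration; the actual flows use SELECT *.

    -- Sketch of the two metadata queries behind the Database Reader nodes.
    -- Column lists and ORDER BY are illustrative; the original flow uses SELECT *.
    SELECT OWNER, TABLE_NAME
    FROM   ALL_TABLES
    ORDER  BY OWNER, TABLE_NAME;

    SELECT OWNER, TABLE_NAME, COLUMN_NAME, DATA_TYPE
    FROM   ALL_TAB_COLS
    ORDER  BY OWNER, TABLE_NAME, COLUMN_ID;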

My issue is with the caching speed. Using the same query and the same connection string in Oracle SQL Developer, Toad, etc., the results are returned instantly and I can read them from the table.

Using the Database Reader, the cache only fetches about 10-20 rows per second. For databases that could have 10,000 or even 100,000 tables and columns, this method is grossly under-performing, and my query isn't even complex.

What I am looking for is a method or solution that lets the Database Reader, or some other tool where users can see the results, build the output quickly. Ideally a query like this should only take a few seconds on a high-speed network, not 20-30 minutes.

I have solved my issue.

When on an Oracle connection, KNIME attempts to convert the RAW data type to VARBINARY, and the LONG data type to LONGVARCHAR.

In ALL_TAB_COLS, KNIME will attempt to cast the following columns to these data types:
DATA_DEFAULT as LONGVARCHAR
LOW_VALUE as VARBINARY
HIGH_VALUE as VARBINARY
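
As a quick sanity check, you can confirm which ALL_TAB_COLS columns carry those Oracle types. This is just a sketch; the OWNER = 'SYS' filter is my assumption about where the dictionary view's own metadata lives.

    -- Confirm which columns of the ALL_TAB_COLS view are LONG or RAW.
    -- OWNER = 'SYS' is an assumption; the dictionary views are normally owned by SYS.
    SELECT COLUMN_NAME, DATA_TYPE
    FROM   ALL_TAB_COLS
    WHERE  OWNER = 'SYS'
    AND    TABLE_NAME = 'ALL_TAB_COLS'
    AND    DATA_TYPE IN ('LONG', 'RAW');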

If I SELECT all columns except the ones above, my download performance returns to normal.
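
Roughly, the working query looks like this. The column list is trimmed to the commonly used ones; adjust it to whatever your Oracle version exposes.

    -- Workaround: list the columns explicitly and leave out the slow ones.
    -- DATA_DEFAULT (LONG), LOW_VALUE (RAW) and HIGH_VALUE (RAW) are omitted on purpose.
    SELECT OWNER,
           TABLE_NAME,
           COLUMN_NAME,
           COLUMN_ID,
           DATA_TYPE,
           DATA_LENGTH,
           DATA_PRECISION,
           DATA_SCALE,
           NULLABLE
    FROM   ALL_TAB_COLS;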

The issue seems to be how KNIME attempts to convert RAW and LONG columns in an Oracle environment.

Furthermore, I tested this theory using the KNIME Labs DB Connector; that is how I discovered the crippling data types. When I attempted to compensate manually by configuring the Type Mappings for LONGVARCHAR -> String and VARBINARY -> String, my download still slowed to a crawl.

When I removed the columns DATA_DEFAULT, LOW_VALUE, and HIGH_VALUE, my download speed returned to normal.

I did attempt to manually cast them as VARCHAR(30), but Oracle refused, expecting their values as LONG. When I tried casting as LONG, INTEGER, RAW, STRING, or NUMBER, the queries failed to initialize.
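
For anyone trying the same thing, here is a sketch of why the direct casts fail and what Oracle does allow. The table and column names introduced here (tab_cols_copy, DATA_DEFAULT_CLOB) are made up for illustration.

    -- Direct CAST of the LONG column is rejected by Oracle:
    SELECT CAST(DATA_DEFAULT AS VARCHAR2(30)) FROM ALL_TAB_COLS;   -- fails: LONG is not allowed in CAST

    -- TO_LOB can convert a LONG, but only inside CREATE TABLE AS SELECT or
    -- INSERT ... SELECT, not in a plain query fed to the Database Reader:
    CREATE TABLE tab_cols_copy AS
    SELECT OWNER, TABLE_NAME, COLUMN_NAME,
           TO_LOB(DATA_DEFAULT) AS DATA_DEFAULT_CLOB
    FROM   ALL_TAB_COLS;

    -- The RAW columns can at least be rendered as hex strings if they are ever needed:
    SELECT RAWTOHEX(LOW_VALUE), RAWTOHEX(HIGH_VALUE) FROM ALL_TAB_COLS;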

My only solution at this time was to remove the offending columns from the query.

Hi there @Astrid,

just wondering, are you still experiencing the same issues?

Br,
Ivan