Database Table Creator Node

I’m trying to create a table and I keep getting this error:

ERROR Database Table Creator 0:16 Execute failed: PARSE ERROR: Encountered "varchar" at line 1, column 50.
Was expecting one of:
")" ...
"," ...

SQL Query CREATE TABLE snd.rpruett.newtable (sgh_segment varchar(255) , customers integer );
^

[Error Id: bea1b775-a75b-4c4a-8513-493f0b9348e7 on mapr-node9:31010]

Does anyone know what this means?

Thanks!
Ray

Hi @ray_pruett -

This sounds like a type mismatch between your table structure and the data you’re trying to populate it with - but it’s hard to tell without seeing the workflow or the data. Something about the data at column 50 is not playing nice. Can you provide any additional information?

(Also, if you have access to KNIME 4.0, you might consider using the new DB nodes, which give you additional control over variable types.)

Hi ScottF,

It’s only a two-column table with 12 rows.

The first column is a string (Varchar(255)) and the second is an integer column whose highest value is 3816.

The database I’m trying to create the table in is Apache Hive. I believe I have the appropriate JDBC driver, and I can create a table using SQL in the Database Reader node.

Ray

Hi Ray -

Can you post the data and a sample workflow (or at least a screenshot)? I can try to recreate it using KNIME’s Local Big Data implementation.

EDIT: Corey made a good point that the column 50 here probably references the 50th character in your varchar string. You likely have an unescaped special character in your data.
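To illustrate what an unescaped character could do (with a made-up value, not from the actual data): an unescaped single quote inside a string value ends the literal early and breaks the generated statement, which is why such quotes have to be doubled:

    -- Hypothetical example: the quote in O'Brien closes the string too early
    INSERT INTO mytable VALUES ('O'Brien', 42);
    -- Escaped correctly by doubling the quote
    INSERT INTO mytable VALUES ('O''Brien', 42);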

Here’s a screenshot of the workflow

Here’s a screenshot of the table I’m trying to create


As you can see, it’s not large. It was more to make sure I know how to create a table.

Not sure if it makes a difference, but I’m using Drill and MapR.

Hmmm. Nothing about that data looks unusual. I guess it’s possible there could be a hidden TAB character or some other weird whitespace issue that wouldn’t show up in a screenshot.

Have you tried using the DB nodes - instead of the Database nodes - to see if this strange error persists?

Hi Scott,

The only other node I’ve tried is the Database Writer node.

Is there another one you recommend? Can you share a picture of the node?

Thanks,

Ray

Hi @ray_pruett,

Is this line in your original post your SQL query?

SQL Query CREATE TABLE snd.rpruett.newtable (sgh_segment varchar(255) , customers integer );

The error makes me think it’s an issue with your SQL statement, but oddly it says column 50, which is the middle of a word in that query.

No, it is not an SQL query.

Ray

Hi @ray_pruett -

The new nodes I’m referring to are available in KNIME 4.0, and include the following, which you might want to try in your workflow (of course, there are many other DB nodes too):

These are all part of the new KNIME DB Framework.

I’m still on 3.7 in order to stay in sync with our server.

Thanks,

Ray

Hi ScottF,

I have not been able to upgrade to 4.0 due to our current server version.

I’ve created a very simple workflow which gives me the same errors. There are no special characters - at least none created by the code.

ERROR Database Writer 0:2 Execute failed: PARSE ERROR: Encountered "varchar" at line 1, column 44.
Was expecting one of:
")" ...
"," ...

SQL Query CREATE TABLE snd.rpruett.knime_test (fname varchar(255), lname varchar(255), age integer)
^

test to send.knwf (9.5 KB)

Hi @ray_pruett -

Let me check with one of our database developers.

Hello ray_pruett,
as far as I know, Apache Drill only supports table creation via a SELECT statement (CREATE TABLE AS SELECT). For details, see the Drill documentation. You can create such tables with the Database Connection Table Writer node.
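That also explains where the error occurs: Drill’s CREATE TABLE grammar accepts at most a list of column names (no types) before AS, so after a column name the parser expects "," or ")" - which is exactly why it stops on "varchar". A minimal sketch of a Drill CTAS statement (the workspace and source names below are placeholders, not from your workflow):

    -- Sketch only: dfs.tmp and the source file are assumed names
    CREATE TABLE dfs.tmp.newtable AS
    SELECT sgh_segment, customers
    FROM dfs.tmp.`source_data.csv`;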
If you want to create a table in Apache Hive with the Database Table Creator, you need to use the Hive driver, e.g. by connecting with the Hive Connector node. For more details on this, see the KNIME Big Data documentation.
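For reference, the DDL that Drill rejected is valid HiveQL once it is sent through the Hive driver - a sketch along these lines (note that Hive uses a two-part database.table name rather than Drill’s storage.workspace.table path; the database name below is an assumption):

    -- Sketch: same columns as the failing statement, issued via the Hive driver
    CREATE TABLE rpruett.knime_test (
      fname VARCHAR(255),
      lname VARCHAR(255),
      age   INT
    );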
Bye
Tobias


Thank you. I will try to use the Hive connector.


When using the DB Loader, is the HttpFS Connection node required?

Thanks,
Ray

Hi Ray,
yes, the DB Loader needs the HttpFS connection to upload the data to the HDFS file system prior to moving it into the Hive table. However, as far as I know, the MapR implementation of HttpFS is based on an old implementation from Cloudera, which in some cases has problems with our client implementation.
Bye
Tobias
