Excel, SQL, to HDFS Hive data import


Looks interesting. Maybe you could show us what kind of SQL code was used in the last node, i.e. whether it is some INSERT command.

Maybe you could provide an example with the local big data environment.
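(Just for reference, an INSERT-based load in Hive would look roughly like the sketch below; the schema, table, and column names are purely hypothetical, only meant to illustrate what such a command would mean.)

-- Hypothetical INSERT-based load (schema, table, and column names are made up):
INSERT INTO TABLE schema.target_table
SELECT col_a, col_b, col_c
FROM schema.staging_table;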


And inside the SQL Executor node you need to load the data:

LOAD DATA INPATH '/user/t897420/tmp/z_stg_network.orc' INTO TABLE schema.table;

Notice the INPATH is the same path your ORC Writer wrote to.
This much I already knew from using PuTTY, but how to build it out in KNIME... no clue. The documentation is literally either a) non-existent or b) just running people around in circles.
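Putting the pieces together, the full statement block in the SQL Executor node could look something like the sketch below. Only the LOAD DATA line comes from my actual setup; the CREATE TABLE part, the column names, and the types are assumptions added purely for illustration:

-- Hypothetical target table, created once (column names and types are assumptions):
CREATE TABLE IF NOT EXISTS schema.table (
  network_id STRING,
  load_date  STRING
)
STORED AS ORC;

-- Load the ORC file written by the ORC Writer node;
-- the INPATH must match the path configured in the ORC Writer:
LOAD DATA INPATH '/user/t897420/tmp/z_stg_network.orc' INTO TABLE schema.table;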

I hope this shaves HOURS off someone's research time.
Cheers


Thank you, I will try it with ORC. I have an example with an external table and Parquet files on the hub. The interesting thing about your approach is that it would load directly into a newly created structure.
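For comparison, the external-table pattern I mean looks roughly like this; the path, table, and column names are hypothetical, and this is only a sketch of the general idea, not the exact workflow from the hub:

-- Hypothetical external table over Parquet files already on HDFS;
-- dropping this table later leaves the underlying files untouched:
CREATE EXTERNAL TABLE IF NOT EXISTS schema.ext_table (
  network_id STRING,
  load_date  STRING
)
STORED AS PARQUET
LOCATION '/user/t897420/tmp/parquet_dir';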

I am curious about the preservation of longer strings via ORC. It had not occurred to me that there was a problem with the DB Loader (like in the other example).
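A rough way to check the string-length question would be to declare the column as plain STRING (no fixed length) and compare the maximum observed lengths after loading; the table and column names here are assumptions:

-- Hypothetical check: a plain STRING column has no declared length limit,
-- so the maximum observed length shows whether values were truncated on load:
SELECT MAX(LENGTH(long_text_col)) AS max_len
FROM schema.table;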

Anyway: thank you for providing the example.
