Load and Write Data into Hive Corporate DB

Hi KNIME Community,

Does anyone know how I can get data from other sources into a Hive table? I am able to create the table, but loading the data does not work. Does anyone have a simple way to load data into an already created table? Attached is a screenshot of my current issue.

@etorres182 with Hive and KNIME you will have to provide an upload folder in your HDFS file system. The DB Loader will then create a Parquet file there and load it into the target Hive table.
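A minimal HiveQL sketch of what that staging-and-load step looks like under the hood (the path, database, and table names here are hypothetical, just for illustration):

```sql
-- The loader first writes a Parquet file into the HDFS upload folder,
-- e.g. /user/knime/upload/staging.parquet (hypothetical path).

-- It then moves that file into the already-created Hive table:
LOAD DATA INPATH '/user/knime/upload/staging.parquet'
INTO TABLE my_db.my_table;  -- hypothetical database and table names
```

Note that LOAD DATA moves the file rather than copying it, so the staging file disappears from the upload folder afterwards.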


@mlauber71, thank you for that. Any chance you have an example of an HDFS node bringing in the Parquet file? I was not sure how that worked. Also, do you happen to know why the legacy HDFS node works but the new version does not?

I am receiving this error:
ERROR HDFS Connector 4:93 Execute failed: knox.bigred3.company.com:8443: Unexpected end of file from server


@etorres182 I have examples of using CSV, ORC, and Parquet files as the basis for external tables from HDFS and loading them into 'real' (managed, if you like) big data tables. This is basically what the DB Loader does under the hood.
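The external-table pattern described above can be sketched in HiveQL roughly as follows (all column, table, and path names are hypothetical placeholders):

```sql
-- External table pointing at Parquet files already sitting in HDFS;
-- dropping it later removes only the metadata, not the files.
CREATE EXTERNAL TABLE my_db.staging_table (
  id   INT,
  name STRING
)
STORED AS PARQUET
LOCATION '/user/knime/upload/';  -- hypothetical HDFS folder

-- Copy the rows into a managed ('real') Hive table.
INSERT INTO TABLE my_db.managed_table
SELECT id, name
FROM my_db.staging_table;
```

For CSV instead of Parquet you would swap STORED AS PARQUET for a ROW FORMAT DELIMITED clause; the overall external-then-insert flow stays the same.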

Also if you are interested in KNIME and Hive you might want to check out this workflow:

Concerning the error: could you provide us with a log file in debug mode from when this happens? With big data, KNIME, and Cloudera in an enterprise environment it is often a question of permissions.

Also, right before the upload a Cache node sometimes smooths things out.