Table Reader node

I’m trying to upload data into HDFS using a Table Reader > DB Loader.
All of my connections to Hive work fine.

My problem is that the Table Reader keeps all of the data types as STRING, and when you’re trying to load data into HDFS, STRING is not very flexible: by default it is capped at string(255).

When a data type in the Table Reader does not match the target table in HDFS, the load fails with an error:
ERROR DB Loader 4:43 Execute failed: (“NullPointerException”): null

Example: the Table Reader has Col1 as string(255), while the HDFS table has Col1 as varchar(1000).

You would first create an empty table with the right column types and then load the data into it.
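A minimal sketch of that “create first, then load” approach in HiveQL; the table name, columns, delimiter, and HDFS path here are all placeholders, not from the original post:

```sql
-- Hypothetical example: create the target table up front with explicit
-- column types, so nothing defaults to string(255).
CREATE TABLE IF NOT EXISTS my_table (
  col1 VARCHAR(1000),
  col2 INT
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- Then load the data into the existing table, e.g. from a file already
-- staged in HDFS (LOAD DATA moves the file into the table's directory):
LOAD DATA INPATH '/tmp/staging/my_data.csv' INTO TABLE my_table;
```

With the table created ahead of time, the DB Loader only has to match the existing column types instead of deriving them from the Table Reader’s STRING columns.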


An alternative would be to upload the data files to HDFS and create an external table on top of them.
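That alternative could look roughly like this in HiveQL, assuming the files have already been uploaded to an HDFS directory (the path, names, and delimiter are assumptions for illustration):

```sql
-- Hypothetical example: the data files already sit in HDFS under
-- /data/my_table; the external table just describes their layout.
CREATE EXTERNAL TABLE IF NOT EXISTS my_external_table (
  col1 VARCHAR(1000),
  col2 INT
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/my_table';

-- Note: dropping an external table removes only the metadata;
-- the files under /data/my_table stay in HDFS.
```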


This is the best way I’ve found to upload data into Hive on HDFS. My colleague created it for me, and I’m posting it here because it took hours and hours of searching and testing; hopefully this will save someone else some time.


This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.