DB Loader - Can I change the permissions of the temporarily created file?


I’m having trouble giving the hive user READ access to the temporary file created by “DB Loader” (which runs as my own user), so that it can run the subsequent “LOAD DATA” command on it and populate my table.

I’m using a Kerberized Cloudera environment, and in HDFS I have:

/tmp/ hdfs supergroup drwxrwxrwx+

If I specify /tmp/ as the TargetFolder of the “DB Loader” node, I get:

ERROR DB Loader 0:62 Execute failed: [Cloudera][HiveJDBCDriver](500051) ERROR processing query/statement. Error Code: 1, SQL state: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=READ, inode="/tmp/knime2db5315350654609561406.parquet":alex:supergroup:-rw-r-----

The temp file is created with -rw-r-----, but I need at least -rw-r--r-- because the hive user and I do not share any groups.

  • I know of the HDFS File Permission node, but I’m not sure it can help here
  • I also thought of Flow Variables inside “DB Loader”, but I’m equally unsure whether something like “targetFolder_Internals” could do the trick

Any help would be greatly appreciated.

Thank you,

Hi @aconstantin,

maybe a default ACL with the expected permissions on a new subdirectory of /tmp would help?
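A minimal sketch of that idea (the directory name /tmp/knime-staging is illustrative, not from the thread): create a dedicated staging directory and put a default ACL on it, so every file created inside it inherits a read entry for the hive user.

```shell
# Create a dedicated staging directory for DB Loader temp files
# (hypothetical path; point the node's TargetFolder at it).
hdfs dfs -mkdir /tmp/knime-staging

# "default:" ACL entries are not applied to the directory itself;
# they are inherited by files and subdirectories created inside it,
# giving hive read access to each new temp file.
hdfs dfs -setfacl -m default:user:hive:r-- /tmp/knime-staging

# Inspect the result; default entries are listed with a "default:" prefix.
hdfs dfs -getfacl /tmp/knime-staging
```

Note that hive also needs execute (traverse) permission on the parent directories, which /tmp already grants here (drwxrwxrwx).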



Hi @sascha.wolke,

Many thanks for your suggestion – that solved it (and I should have figured). It was down to:

hdfs dfs -setfacl -R -m default:user:hive:r-- /tmp
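To double-check a fix like this, one can inspect the ACL and confirm that a freshly created file actually inherits the entry (the test filename below is made up for illustration):

```shell
# The "default:user:hive:r--" entry should appear in the listing;
# default entries apply to new children, not to /tmp itself.
hdfs dfs -getfacl /tmp

# Sanity check: a new zero-length file should carry the inherited
# ACL entry (its permission string gains a trailing "+").
hdfs dfs -touchz /tmp/acltest
hdfs dfs -getfacl /tmp/acltest
hdfs dfs -rm /tmp/acltest
```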

All the Best,

