FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask

Hi Team, I am new to KNIME.

I used a coworker’s workflow to study. The first two nodes are: Hive Connector, DB Query Reader. My coworker’s workflow runs successfully. Then I changed the Hive Connector’s credential and sslTrustStore to mine, and it ran successfully too, but the DB Query Reader failed with

ERROR DB Query Reader 0:6804 Execute failed: Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask

Do you know what is wrong?

Thanks in advance! 🙂

@Annie510 welcome to the KNIME forum. Is it possible that your co-worker uses a different Tez execution queue? You might have to check the additional parameters.

Or would your admin have to set each user to be allowed to use the queue?

The entry could be something like:

tez.queue.name=my_queue_name

And you might ask your admin to take a look at additional logs from the big data server.
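If the cluster allows setting the queue per session, an alternative to the JDBC parameter is to set it at the start of the Hive session. A hedged sketch (the queue name is a placeholder — your admin would know the real one):

```sql
-- Run before your actual query in the same Hive session.
-- Directs subsequent Tez jobs to the named YARN queue;
-- only works if your user is permitted to submit to that queue.
SET tez.queue.name=my_queue_name;
```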


@mlauber71 Thank you very much for the explanation. I only started using KNIME a week ago and know nothing about it yet.

Do you know how to check the additional parameters, to see if my co-worker uses a different Tez execution queue?

I am not sure if we have a KNIME admin; I just downloaded KNIME online and started using it. I think I am allowed to use the queue, because all the workflows he provided run successfully on my computer.

Thanks again. I searched for some time and could not find an answer, so please don’t laugh at my silly questions.

@Annie510 the questions are not silly at all. In general, big data systems can be quite complicated, and to be honest the whole ecosystem, Cloudera included, is still, let’s say, evolving.

You might also want to check the additional settings (under the JDBC Parameters tab) in the node and see if you find settings like a queue that might be relevant.

We work with Hive and Impala nodes encapsulated in a metanode to make sure everyone uses the correct settings; if the settings change, they are automatically updated for all users. You would then have to handle authentication via Kerberos.

Maybe you can ask your co-worker and your admin to set something up.

Other than that, you might post screenshots of your settings if they do not contain any sensitive information. @MichaelRespondek might also be able to weigh in. You could also provide a full log file to get more clues.

Most settings are made on the side of the big data cluster, so you have to make sure that KNIME and the cluster work together. Sometimes, in large enterprise environments, special rights have to be granted per user, for example to be allowed to impersonate a user via KNIME Server automation or to access the HDFS file system.
