Hive to Spark problems

Hi all

I have two KNIME tables that I sent to Cloudera Hive via the Hive Loader node with no problem. I then connect a Hive Connector, select each of the two new Hive tables with a Database Table Selector, and plug in a Hive to Spark node. The first table works fine, but the second one fails:

ERROR Hive to Spark 0:188 Execute failed: Job failure: "no such table List(hightest); line 1 pos 14"

You can find the tables in the attachment.

KNIME 2.12 (but I reproduced the same behaviour with version 3.1.1)

Spark executor 1.3

Does anybody have an idea?

Best regards,


Spark to Hive gives the expected results but throws this error: ERROR Spark to Hive 0:232 [org.apache.hadoop.hdfs.KeyProviderCache, Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!]

Best regards,



It is better to write the query with the database prefix before a Hive to Spark node.
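As a sketch of the workaround, assuming the tables were loaded into a Hive database named `mydb` (a hypothetical name; substitute your own), the query fed to the Hive to Spark node would look like:

```sql
-- Without a prefix, Spark resolves the table against its default database
-- and fails with "no such table" when the table lives elsewhere:
SELECT * FROM hightest;

-- Qualifying the table name with its database avoids the problem
-- (mydb is a hypothetical database name here):
SELECT * FROM mydb.hightest;
```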

Hi Fabien,

thanks for the info. You are right: currently the Hive to Spark and Spark to Hive nodes use the default database if you do not add the database prefix to the table name. We will look into this.


