Wrong database timeout value - MapR driver

Hi,

Could you please review the following problem?
I am using the Big Data connectors to connect to Hive running on a MapR cluster,
so I am loading the MapR JDBC driver (hive-jdbc-1.2.0-mapr-1609.jar) through the KNIME Preferences page (see attached image).
First problem (not a big one): when I push "Add directory" to add the directory containing the MapR driver, nothing happens.
I need to push "Add directory" a second time; only then is the directory added to the list.
After that it works properly and I can use the driver to connect to Hive.

Second problem: after I close KNIME and open it again, it seems to use the wrong timeout when connecting to Hive, and I get the following error:
"WARN Hive Connector 0:376:276:360 Your database timeout (15 s) is set to a rather low value for Hive. If you experience timeouts increase the value in the preference page."
This happens even though the timeout is set to 14,400 seconds.
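For context, the JDBC standard exposes a global login timeout that database tools commonly map a timeout preference to. Whether KNIME uses exactly this API internally is an assumption on my part, but it illustrates where such a preference value would end up:

```java
import java.sql.DriverManager;

public class TimeoutCheck {
    public static void main(String[] args) {
        // JDBC's global login timeout in seconds. A database tool's
        // timeout preference is typically pushed down to this setting
        // (14,400 s is the value configured in this thread).
        DriverManager.setLoginTimeout(14400);
        System.out.println(DriverManager.getLoginTimeout());
    }
}
```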

Could you take a look?

Many thanks.

Moran

knime-preferances-databases.jpg

Hello,

it might take some time until the driver is listed once you hit the "Add directory" button, especially if you add a directory containing many jar files. This is caused by KNIME going through all the files to identify the JDBC driver.

You have set the correct database timeout. The node checks this before execution and shows the warning if the timeout is below 30 seconds. Have you reset and executed the node after changing the parameter?

Bye

Tobias

Hi Tobias,
Even though I reset the entire workflow, the timeout is still wrong after restarting KNIME.

I noticed the following error on restart:
"ERROR KNIMECorePlugin Error while starting workbench, some setting may not have been applied properly: class org.apache.hadoop.hive.shims.Hadoop20SShims$2 has interface org.apache.hadoop.mapreduce.TaskAttemptContext as super class"

So it probably failed to set the driver properties.
This is strange, because before restarting KNIME the driver seemed to work properly, with the correct timeout.

Any guidance?

Hello,

this seems to be a class loading problem. Have you added all the necessary jars to the directory? The MapR JDBC driver requires a number of companion jars, which you also need to add to the directory. The jars are listed here. You can also try the standalone jar, which, as far as I know, you can also download from the MapR homepage.

Bye

Tobias

Yes,
I have all the jars in place, and yet the error persists.

Please take a look at this thread:
http://stackoverflow.com/questions/29448222/found-interface-org-apache-hadoop-mapreduce-taskattemptcontext

java.lang.IncompatibleClassChangeError: class org.apache.hadoop.hive.shims.Hadoop20SShims$2 has interface org.apache.hadoop.mapreduce.TaskAttemptContext as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.knime.core.node.port.database.DatabaseDriverLoader.loadClass(DatabaseDriverLoader.java:392)
at org.knime.core.node.port.database.DatabaseDriverLoader.readZipFiles(DatabaseDriverLoader.java:285)
at org.knime.core.node.port.database.DatabaseDriverLoader.readDir(DatabaseDriverLoader.java:254)
at org.knime.core.node.port.database.DatabaseDriverLoader.loadDriver(DatabaseDriverLoader.java:233)
at org.knime.core.node.port.database.DatabaseDriverLoader.loadDriver(DatabaseDriverLoader.java:224)
at org.knime.workbench.core.KNIMECorePlugin.initDatabaseDriver(KNIMECorePlugin.java:275)
at org.knime.workbench.core.KNIMECorePlugin.start(KNIMECorePlugin.java:260)
at org.eclipse.osgi.internal.framework.BundleContextImpl$3.run(BundleContextImpl.java:774)
at org.eclipse.osgi.internal.framework.BundleContextImpl$3.run(BundleContextImpl.java:1)

Hi,

is it possible that you added the wrong jars? KNIME loads user-defined drivers with an isolated URL class loader to prevent class loading problems such as this one. Are you also using the correct name of the driver class? Some drivers ship both the Hive1 and Hive2 driver classes.
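For anyone hitting this: the Hive1 and Hive2 drivers conventionally use different class names. A small sketch to check which one is actually loadable on the current classpath (the two class names below are the usual upstream ones; verify against the contents of your MapR jar, since vendor builds can differ):

```java
public class CheckDriverClass {
    // Conventional driver class names; confirm against your jar.
    static final String[] CANDIDATES = {
        "org.apache.hive.jdbc.HiveDriver",        // HiveServer2 (Hive2)
        "org.apache.hadoop.hive.jdbc.HiveDriver"  // HiveServer1 (legacy Hive1)
    };

    public static void main(String[] args) {
        for (String name : CANDIDATES) {
            try {
                // Succeeds only if the class is on the classpath.
                Class.forName(name);
                System.out.println("found: " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("missing: " + name);
            }
        }
    }
}
```

Running this with the driver directory on the classpath shows which of the two names the JDBC settings should use.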

Bye

Tobias

Many many thanks, Tobias.
This solved the problem.

I had mistakenly added both the Hadoop1 (hadoop-0.20.2-dev-core.jar) and Hadoop2 (hadoop-common-2.7.0-mapr-1607.jar) jars to the folder.

However,
it seems that adding a ".bak" extension to the Hadoop1 jar did not prevent it from being loaded.
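Since KNIME apparently scans every file in the driver directory regardless of extension, moving the conflicting jar out of the directory is safer than renaming it. A sketch of that workaround, demonstrated with dummy files in a temporary directory (the jar names mirror the ones from this thread):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class MoveConflictingJar {
    public static void main(String[] args) throws IOException {
        // Demo setup: a stand-in for the registered driver directory
        // plus a separate directory for jars KNIME must not see.
        Path driverDir = Files.createTempDirectory("mapr-jdbc");
        Path unusedDir = Files.createTempDirectory("mapr-jdbc-unused");
        Path hadoop1 = Files.createFile(
            driverDir.resolve("hadoop-0.20.2-dev-core.jar"));        // Hadoop1 (conflicting)
        Files.createFile(
            driverDir.resolve("hadoop-common-2.7.0-mapr-1607.jar")); // Hadoop2 (keep)

        // Move, don't rename: a ".bak" suffix still leaves the file
        // inside the directory that gets scanned.
        Files.move(hadoop1, unusedDir.resolve(hadoop1.getFileName()));

        try (DirectoryStream<Path> ds = Files.newDirectoryStream(driverDir)) {
            for (Path p : ds) {
                System.out.println(p.getFileName());
            }
        }
    }
}
```

After the move, only the Hadoop2 jar remains in the scanned directory.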

A working list of jars for MapR on Hadoop2:

24.07.2016 15:15 62.050 commons-logging-1.1.3.jar
24.07.2016 15:15 2.189.117 guava-14.0.1.jar
25.07.2016 21:01 3.467.420 hadoop-common-2.7.0-mapr-1607.jar
24.07.2016 15:18 20.639.837 hive-exec-1.2.0-mapr-1607.jar
24.07.2016 15:19 92.643 hive-jdbc-1.2.0-mapr-1607.jar
24.07.2016 15:16 5.498.290 hive-metastore-1.2.0-mapr-1607.jar
24.07.2016 15:18 1.872.856 hive-service-1.2.0-mapr-1607.jar
24.07.2016 15:15 2.290 hive-shims-1.2.0-mapr-1607.jar
24.07.2016 15:15 719.304 httpclient-4.4.jar
24.07.2016 15:15 321.639 httpcore-4.4.jar
24.07.2016 15:16 313.686 libfb303-0.9.2.jar
24.07.2016 15:15 227.712 libthrift-0.9.2.jar
26.02.2015 17:43 481.535 log4j-1.2.16.jar
25.07.2016 21:45 32.127 slf4j-api-1.7.12.jar
25.07.2016 21:45 8.860 slf4j-log4j12-1.7.12.jar

Moran