I am using Hive along with the Database nodes. When I execute a Database Reader and a Database Connection Reader in parallel, I get the following error:
Execute failed: org.apache.thrift.TApplicationException: ExecuteStatement failed: out of sequence response
This also happens when I execute two Database Table Selector nodes in parallel, both connected to the same Hive connection.
In one of the posts I noticed a suggested solution: connect these nodes via flow variable ports so that they run sequentially, with one node queued for execution while the first one is running. But is there a way to run them in parallel, since they operate on different sets of Hive tables?
Earlier I thought this was only a problem for nodes running in a single workflow, but as I see it now, it fails even when a node in a different workflow uses the same Hive connection. It seems that only one node at a time can run on a given database connection.
Yes, this is a problem with the Hive driver, which allows only a single request at a time. KNIME caches the database connection across workflows based on the JDBC URL and user name, which is why the same connection is used across several KNIME workflows.
We are aware of this limitation and are working on a connection pool solution. As a workaround you can use different users to open several connections to Hive, or you can adapt the JDBC URL by explicitly setting default values for some parameters: since connections are cached based on user name and JDBC URL, each distinct URL gets its own connection.
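To illustrate the URL-variation workaround: because the cache key is the full JDBC URL string, appending a parameter that only restates a default value yields a connection that behaves identically but is cached separately. A minimal sketch, assuming a HiveServer2 at the placeholder address `hive-server:10000` (host, port, and the chosen parameters are examples, not values from this thread):

```
# URL used by workflow A (the original, cached connection):
jdbc:hive2://hive-server:10000/default

# URL for workflow B -- connects to the same server, but the explicitly
# spelled-out default session parameter makes the URL string differ,
# so KNIME caches it as a second, independent connection:
jdbc:hive2://hive-server:10000/default;ssl=false
```

Any parameter whose explicit value matches the server default will do; the point is only to make the URL strings unequal.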
Thanks for the suggestion. We had a meeting with Bjoern and he suggested using the Cloudera-provided database driver for Hive instead of the default JDBC driver configured in KNIME. This workaround is working just fine.
But still thank you very much.
@tobias.koetter My assumption is that this has been solved in the new releases of your DB nodes and we don’t have to use this workaround anymore — is that correct?
Yes, the problem was fixed with version 2.2 of the Big Data Connectors. This fix is independent of the new DB framework, which is still in preview, and it also works with the existing database framework.