Error running query with DB Query Reader node

Hi,

I am using the Create Databricks Environment node to connect KNIME to a database in Databricks. It works very well, but when I tried to run a second node (DB Query Reader with an SQL query) I got the following error: Execute failed: Error running query: org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized results of 675 tasks (10.0 GB) is bigger than spark.driver.maxResultSize (10.0 GB). I am using KNIME Analytics Platform v4.1.4.

Could you please help me with this error? The goal is to obtain a KNIME data table with the result of the database query.

Thank you in advance.

Hi Livia,
this is an exception in the Spark runtime. For further information, have a look at the Databricks Knowledge Base article on spark.driver.maxResultSize.
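For reference, the fix described there is to raise spark.driver.maxResultSize in the cluster's Spark configuration (in Databricks: cluster settings > Advanced Options > Spark). A minimal sketch; the 20g value is an assumption and must be larger than your serialized result size while still fitting within the driver's memory:

    spark.driver.maxResultSize 20g

Note that this property is read at driver startup, so the cluster has to be restarted for the change to take effect.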
Another thing that you can try, if you are using the DB to Spark node, is to increase the number of partitions via the partition settings in the node dialog; see the sketch below for what those settings correspond to.
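To illustrate, here is a minimal PySpark sketch of a partitioned JDBC read, which is the kind of read those node settings configure; the URL, table name, column, and bounds are placeholders, not values from your setup:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Reading with more, smaller partitions keeps the serialized result
    # of each task small. The four partitioning options below must be
    # set together; all values here are assumptions for illustration.
    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:databricks://...")  # placeholder URL
        .option("dbtable", "my_table")           # placeholder table
        .option("partitionColumn", "id")         # assumed numeric column
        .option("lowerBound", 0)
        .option("upperBound", 1000000)
        .option("numPartitions", 200)
        .load()
    )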
Bye
Tobias
