Setting the Create Databricks Environment Spark Version Correctly

I’m trying to set up the Create Databricks Environment node for use with the Databricks Spark context in KNIME v4.7.2 and KNIME Databricks Integration 4.7.2.v202303211209, which only supports up to Spark v3.2. Databricks Spark is currently at v3.3.2. Does the version difference matter? If it does, when will the extension be updated? Is it updated in KNIME v5.1?

I’ll answer my own question after some testing: yes, it does matter. KNIME 5.1 does have Spark v3.3 available. After I upgraded to KNIME v5.1 and set the Spark option to v3.3, I was able to connect to the Databricks Spark v3.3 cluster.

Hi @benpope -

Sorry for the delayed response - thanks very much for sharing the results of your testing!

Hi @benpope,

Thanks for sharing the results. In general, you need to match the minor version: selecting Spark 3.3 in the KNIME node should work with any Spark 3.3.x cluster on Databricks (such as 3.3.2).
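As an illustration of the matching rule above, here is a minimal sketch of major.minor compatibility checking. The `compatible` helper is hypothetical, not part of the KNIME or Databricks API:

```python
def compatible(node_spark: str, cluster_spark: str) -> bool:
    """Return True when the node's Spark version matches the cluster's
    major.minor version; the patch level (the x in 3.3.x) may differ."""
    node_major_minor = tuple(node_spark.split(".")[:2])
    cluster_major_minor = tuple(cluster_spark.split(".")[:2])
    return node_major_minor == cluster_major_minor

# A Spark 3.3 node setting works with a 3.3.2 cluster...
print(compatible("3.3", "3.3.2"))  # True
# ...but a 3.2 setting does not.
print(compatible("3.2", "3.3.2"))  # False
```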

