Connecting to Databricks

Hi everyone,

I’ve just set up KNIME Server Small on AWS, and I’m struggling to connect to Databricks. I’ve done it many times before in Analytics Platform, so I know the connection string, etc., all works.

I believe the Databricks driver is set up correctly (I used to get an error telling me it couldn’t be found, but I no longer do), yet now I get the following error:

…Your current server license does not allow you to execute the following nodes: “Create Databricks Environment”.

This made me think it’s an extension issue, so I’ve tried installing the various extensions but to no avail. The latest error I’m getting is as follows:

!MESSAGE Unknown option https://update.knime.com/community-contributions/trusted/{version_exe}

Any help would be much appreciated. I suspect it’s a fairly simple thing to fix, but I must have spent hours on this!

FYI the detailed steps I’ve done are below.

Thanks,

Dom


  1. Installed the driver to /srv/knime_server/config/client-profiles/executor/
  2. Updated the executor.epf file with the lines related to the database driver (the bottom 5 lines; the resulting folder layout is sketched at the end of this post):
\!/=
/instance/org.knime.workbench.core/logging.logfile.location=true
/instance/org.knime.workbench.core/database_timeout=7200
/instance/org.knime.python2/condaDirectoryPath=/home/knime/python/anaconda
/instance/org.knime.python2/defaultPythonOption=python3
/instance/org.knime.python2/python3CondaEnvironmentName=py3_knime
/instance/org.knime.python2/python3Path=/home/knime/python/anaconda/envs/py3_knime/bin/python3
/instance/org.knime.database/drivers/Databricks/database_type=databricks
/instance/org.knime.database/drivers/Databricks/driver_class=com.simba.spark.jdbc.Driver
/instance/org.knime.database/drivers/Databricks/paths/0=${profile:location}/SparkJDBC42.jar
/instance/org.knime.database/drivers/Databricks/url_template=jdbc\:spark\://<host>\:<port>/default
/instance/org.knime.database/drivers/Databricks/version=2.6.0
  3. Set the executor preferences by following the instructions in https://docs.knime.com/latest/server_admin_guide/index.html#introduction

3a. Running systemctl edit knime-executor.service (the resulting drop-in file is sketched at the end of this post)

3b. Adjusting the environment variable to Environment='KNIME_EXECUTOR_PROFILES=-profileLocation http://127.0.0.1:8080/<WebPortal Context ROOT, most likely "knime">/rest/v4/profiles/contents -profileList executor'

  4. When restarting and running the WebPortal I got the following error, which I assumed meant it was an extension issue:

Your current server license does not allow you to execute the following nodes: “Create Databricks Environment”.

  5. Using the AWS Marketplace guide https://docs.knime.com/2020-12/aws_marketplace_server_guide/index.html#install-extensions and a recent forum post https://forum.knime.com/t/unable-to-install-extensions-on-aws-server-small/34838 I’ve tried to install the extensions.

5a. I know I need to install the following ones:
- org.knime.features.bigdata.databricks.feature.group
- org.knime.features.bigdata.commons.feature.group
- org.knime.features.bigdata.commons.windows.feature.group
[FYI the second two are prerequisites for the Databricks extension to work]

5b. I stopped the KNIME Server using sudo systemctl stop knime-server

5c. On attempting to install the first extension through sudo -u knime /opt/knime/knime-4.4.1/knime -application org.eclipse.equinox.p2.director -nosplash -consolelog -r https://update.knime.org/analytics-platform/{version_exe}, https://update.knime.com/community-contributions/trusted/{version_exe} -i org.knime.features.bigdata.commons.feature.group -d /opt/knime/knime-4.4.1 I get the following error:

CompileCommand: exclude javax/swing/text/GlyphView.getBreakSpot
Installation failed.
Unknown option https://update.knime.com/community-contributions/trusted/{version_exe}. Use -help for a list of known options.
!SESSION 2021-10-03 18:46:25.662 -----------------------------------------------
eclipse.buildId=unknown
java.version=11.0.10
java.vendor=AdoptOpenJDK
BootLoader constants: OS=linux, ARCH=x86_64, WS=gtk, NL=en
Framework arguments:  -application org.eclipse.equinox.p2.director -r https://update.knime.org/analytics-platform/{version_exe}, https://update.knime.com/community-contributions/trusted/{version_exe} -i org.knime.features.bigdata.commons.feature.group -d /opt/knime/knime-4.4.1
Command-line arguments:  -os linux -ws gtk -arch x86_64 -application org.eclipse.equinox.p2.director -consolelog -r https://update.knime.org/analytics-platform/{version_exe}, https://update.knime.com/community-contributions/trusted/{version_exe} -i org.knime.features.bigdata.commons.feature.group -d /opt/knime/knime-4.4.1

!ENTRY org.eclipse.equinox.p2.core 4 0 2021-10-03 18:46:27.099
!MESSAGE Unknown option https://update.knime.com/community-contributions/trusted/{version_exe}. Use -help for a list of known options.
There were errors. See log file: /opt/knime/knime-4.4.1/configuration/1633286785807.log
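
For completeness, here is roughly what steps 1-3 look like on my machine. Treat it as a sketch: the /srv/knime_server path, the drop-in file name and the "knime" context root are what I understand a default install uses, so adjust them if your setup differs.

# Steps 1-2: the driver jar and the preferences file sit together in the profile
# folder, so ${profile:location} in executor.epf resolves to this directory:
ls /srv/knime_server/config/client-profiles/executor/
# executor.epf  SparkJDBC42.jar

# Step 3: "systemctl edit knime-executor.service" opens a drop-in such as
# /etc/systemd/system/knime-executor.service.d/override.conf, into which the
# environment line from step 3b goes:
#   [Service]
#   Environment='KNIME_EXECUTOR_PROFILES=-profileLocation http://127.0.0.1:8080/knime/rest/v4/profiles/contents -profileList executor'
# Afterwards I restarted the executor so it picks up the new environment:
sudo systemctl restart knime-executor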

Hi Dom,

As you can see in the license matrix, the big data integration is a KNIME Server Large-only feature. Is that the license type you are currently using? It would be required to use the integration on KNIME Server.

Besides this, your setup looks valid. You got the error message in step 5c because the {version_exe} placeholder has to be replaced with the major version you are using, in your case 4.4, and the two repository URLs have to be passed to -r as a single comma-separated argument without a space in between. The correct command would look like this:
sudo -u knime /opt/knime/knime-4.4.1/knime -application org.eclipse.equinox.p2.director -nosplash -consolelog -r https://update.knime.org/analytics-platform/4.4,https://update.knime.com/community-contributions/trusted/4.4 -i org.knime.features.bigdata.commons.feature.group -d /opt/knime/knime-4.4.1
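
As a side note, -i also accepts a comma-separated list of features, so once the license question is sorted out you could install the remaining features from step 5a in one run. For example, for the commons and Databricks features (same repositories and feature IDs as above, so treat it as a sketch):
sudo -u knime /opt/knime/knime-4.4.1/knime -application org.eclipse.equinox.p2.director -nosplash -consolelog -r https://update.knime.org/analytics-platform/4.4,https://update.knime.com/community-contributions/trusted/4.4 -i org.knime.features.bigdata.commons.feature.group,org.knime.features.bigdata.databricks.feature.group -d /opt/knime/knime-4.4.1
Don't forget to start the server again afterwards with sudo systemctl start knime-server (the counterpart to your step 5b).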

But please check first whether you are already on a KNIME Server Large license; the error message suggests that this is not the case.

Cheers,
Michael


Hi Michael, I did not spot that I needed KNIME Server Large to connect to Databricks; I assumed that because I could do it in Analytics Platform it would work here too. Thanks for clarifying.

I’m afraid, however, that KNIME Server Large is way out of my price range so I suspect I may have to use other software for this use case.

Thanks anyway,
Dom
