Databricks connection fails

Hi! I had a nicely working Databricks Community Edition connection, but now nothing works. I have tried both the old and the new nodes, and the result is this:

Configurations and results:

List Remote Files (legacy):
ERROR List Remote Files (legacy) 10:45 Execute failed: No file or directory exists on path /FileStore/tables/.

Databricks File System Connector:
ERROR Databricks File System Connector 10:112 Execute failed: Server error: 401

Create Databricks Environment:
ERROR Create Databricks Environment 10:113 Execute failed: /FileStore/tables

I have tried all of them with different directories, such as / and /FileStore, with the same results.
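For reference, would this be the right way to confirm the folder on the Databricks side? A minimal notebook cell I could run in the workspace (nothing KNIME-specific, just the standard dbutils call on the default /FileStore/tables path):

```python
# Run inside a Databricks notebook attached to the cluster.
# Lists the DBFS folder that the KNIME nodes are pointing at.
display(dbutils.fs.ls("/FileStore/tables"))
```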

Also, some of the instructions say that we need to install the new Simba Spark ODBC driver, but there is no jar file in that zip package.

Any ideas?
Thanks,
Lasse

Hello,

I have tested a basic connection to our internal Databricks and am able to successfully browse folders/files with the List Files/Folders nodes.

Please try installing the new Databricks File System Connector node and the List Files/Folders node, then attempt to connect again.

I also see a 401 error in your messages, which would indicate that the login is incorrect. Can you try logging in to the admin portal for your community Databricks environment and confirm that your credentials are correct?
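One way to take KNIME out of the equation is to call the DBFS REST API directly with the same credentials. If that call also returns 401, the problem is the login itself rather than the nodes. This is only a rough sketch: the host, credentials, and path below are placeholders you would need to replace, and if Community Edition does not let you generate a token, basic auth with your e-mail and password may be the relevant option instead:

```python
import requests

# Placeholder values -- replace with your own workspace URL and credentials.
HOST = "https://community.cloud.databricks.com"
TOKEN = "<personal-access-token>"

resp = requests.get(
    f"{HOST}/api/2.0/dbfs/list",
    params={"path": "/FileStore/tables"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    # For Community Edition, try basic auth instead of the token header:
    # auth=("you@example.com", "<password>"),
    timeout=30,
)
print(resp.status_code)  # 401 here means the credentials are being rejected
print(resp.text)
```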

Thanks,
Zack

Hi @lassease,

can you verify that you can log in to the homepage of the Community Edition with your credentials?

You need to download the JDBC drivers, not the ODBC drivers.
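If you want to verify the downloaded driver outside of KNIME, one option is a quick JDBC connection test from Python. This is only a sketch: it assumes the jaydebeapi package (plus a local JVM) is available, that SparkJDBC42.jar has been extracted from the JDBC driver zip, and that the connection URL is copied from the cluster's JDBC/ODBC tab in Databricks; the exact URL options and driver class name depend on the driver version:

```python
import jaydebeapi

# Placeholder URL -- copy the real one from the cluster's "JDBC/ODBC" tab.
JDBC_URL = (
    "jdbc:spark://<workspace-host>:443/default;transportMode=http;ssl=1;"
    "httpPath=<cluster-http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>"
)

conn = jaydebeapi.connect(
    "com.simba.spark.jdbc.Driver",    # driver class inside SparkJDBC42.jar
    JDBC_URL,
    jars="/path/to/SparkJDBC42.jar",  # extracted from the JDBC driver zip
)
cur = conn.cursor()
cur.execute("SELECT 1")  # trivial query just to prove the connection works
print(cur.fetchall())
cur.close()
conn.close()
```

If that works outside KNIME, the remaining step would be registering the same jar as a JDBC driver in the KNIME database preferences so the Databricks nodes can pick it up.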

Cheers,
Sascha
