Can we connect to Databricks notebooks?

I have some notebooks in Databricks where my SQL lives. I am wondering whether we can connect to these notebooks, so that when I change some SQL, my workflow also syncs with these changes.

Hello @cuimaple ,
I’m not sure what exactly you are trying to do. However, we have a component that uses the Databricks Jobs API to execute notebook jobs: DatabricksNotebooks – KNIME Community Hub
Maybe you can use this as inspiration on how to connect to the Databricks API from within KNIME. Maybe you can use the export API to get the SQL.
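For example, fetching a notebook's source via the workspace export API could look like the sketch below. The `host`, `token`, and `path` values are placeholders you would need to fill in with your workspace URL, a personal access token, and the notebook path; the API returns the source Base64 encoded in a `content` field:

```python
import base64
import json
import urllib.parse
import urllib.request

def decode_export_payload(payload):
    # The export API returns the notebook source Base64 encoded in "content"
    return base64.b64decode(payload["content"]).decode("utf-8")

def export_notebook_source(host, token, path):
    # GET /api/2.0/workspace/export with format=SOURCE returns the raw source
    query = urllib.parse.urlencode({"path": path, "format": "SOURCE"})
    req = urllib.request.Request(
        f"{host}/api/2.0/workspace/export?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return decode_export_payload(json.load(resp))
```

You could run the same logic inside a KNIME Python node, or rebuild the request with KNIME's REST nodes.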
Bye
Tobias


Hi,
I am trying to extract the SQL code stored in each notebook, then execute it in KNIME.

I have tried to use the export API, but only got some code that I am not familiar with.

Hello @cuimaple ,

sorry, I wasn’t aware that the result is Base64 encoded. Also, the SQL and text blocks are all returned as a single string. That is why I have created an example workflow that uses a component which extracts the individual blocks from a notebook and flags each one as a comment or not. Check it out here: DatabricksNotebookExtractSQL – KNIME Community Hub
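The extraction can be sketched roughly like this. It assumes the SOURCE export of a SQL notebook uses `-- COMMAND ----------` as the cell separator and prefixes markdown/text lines with `-- MAGIC`; both markers are assumptions about the export format, so check them against your own notebooks:

```python
import base64

# Cell separator and markdown prefix in SQL SOURCE exports (assumptions)
COMMAND_SEP = "-- COMMAND ----------"
MAGIC_PREFIX = "-- MAGIC"

def split_notebook_blocks(source_b64):
    """Decode a Base64 SOURCE export and split it into blocks,
    flagging markdown/text cells as comments."""
    source = base64.b64decode(source_b64).decode("utf-8")
    blocks = []
    for raw in source.split(COMMAND_SEP):
        lines = [
            line for line in raw.splitlines()
            if line.strip() and not line.startswith("-- Databricks notebook source")
        ]
        if not lines:
            continue
        # A block made up entirely of MAGIC lines is markdown, not SQL
        is_comment = all(line.startswith(MAGIC_PREFIX) for line in lines)
        blocks.append({"sql": "\n".join(lines), "is_comment": is_comment})
    return blocks
```

The same split-and-flag idea is what the linked component does; this standalone version is just to make the logic visible.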

Bye
Tobias


Hi,
That worked. Following your hints, I also successfully decoded the Base64 with a Python node afterwards. Your example for the Databricks API helped me a lot. Thanks!


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.