I have some notebooks in Databricks where my SQL lives. I am wondering whether we can connect to these notebooks, so that when I change some SQL, my workflow also syncs with the changes.

Hello @cuimaple ,
I’m not sure what exactly you are trying to do. However, we have a component that uses the Databricks Jobs API to execute notebook jobs: DatabricksNotebooks – KNIME Community Hub
Maybe you can use it as inspiration on how to connect to the Databricks API from within KNIME. Maybe you can use the Workspace Export API to get the SQL.
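For anyone following along, here is a minimal sketch of calling the Databricks Workspace Export API from Python (the same API the component wraps). The host, token, and notebook path below are placeholders you would replace with your own; note that the returned notebook source arrives Base64 encoded.

```python
import base64
import json
import urllib.parse
import urllib.request

def decode_export_content(payload: dict) -> str:
    """The Export API returns JSON like {"content": "<Base64>"}; decode it to text."""
    return base64.b64decode(payload["content"]).decode("utf-8")

def export_notebook_source(host: str, token: str, path: str) -> str:
    """Fetch a notebook's source via GET /api/2.0/workspace/export?format=SOURCE."""
    url = (f"{host}/api/2.0/workspace/export"
           f"?path={urllib.parse.quote(path)}&format=SOURCE")
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return decode_export_content(json.load(resp))

# Placeholder values -- substitute your workspace URL, personal access token,
# and the workspace path of the notebook holding your SQL:
# source = export_notebook_source(
#     "https://<your-workspace>.cloud.databricks.com",
#     "<personal-access-token>",
#     "/Users/<you>/<notebook>",
# )
```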
Bye
Tobias
Hi,
I am trying to extract the SQL code stored in each notebook and then execute it in KNIME.
Hello @cuimaple ,
sorry, I wasn’t aware that the result is Base64 encoded. Also, the SQL and text boxes are all returned as a single string. That is why I have created an example workflow with a component that extracts the individual blocks from a notebook and flags each one as a comment or not. Check it out here: DatabricksNotebookExtractSQL – KNIME Community Hub
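In pure Python, the splitting the component does can be sketched roughly like this. It assumes the SOURCE-format export of a SQL notebook, where cells are separated by `-- COMMAND ----------` lines and text cells appear as `-- MAGIC` comment lines; the exact markers may differ for other notebook languages.

```python
# Assumed cell separator in a SOURCE-format export of a SQL notebook.
COMMAND_SEP = "-- COMMAND ----------"

def split_notebook_blocks(source: str) -> list:
    """Split decoded notebook source into cells, flagging pure -- MAGIC cells
    (markdown/text boxes) as comments rather than executable SQL."""
    blocks = []
    for raw in source.split(COMMAND_SEP):
        cell = raw.strip()
        if not cell:
            continue
        is_comment = all(line.startswith("-- MAGIC")
                         for line in cell.splitlines())
        blocks.append({"sql": cell, "is_comment": is_comment})
    return blocks
```

The non-comment blocks could then be fed one by one into a DB SQL Executor node.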
Bye
Tobias
Hi,
That’s all right. Following your hints, I also successfully decoded the Base64 with a Python node afterwards. Your example for the Databricks API helped me a lot. Thanks!
