Connection to Azure Gen2 data lake with SAS key instead of Access Key

Hello. I am trying to use the Azure Blob Store Connection node to read data stored in Azure Data Lake Storage Gen2. For a different, pre-Gen2 data lake, I was able to connect successfully by configuring the node with a Storage Account and an Access Key. For the new Gen2 data lake, however, I do not have an Access Key; instead, I have a SAS (shared access signature) token. The SAS token is not a Base64 value because it contains characters such as &, :, and %. When I tried to use it as the Access Key, I got an error.
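For illustration, a SAS token is a URL query string rather than a Base64 value, which is why pasting it into an Access Key field fails. A minimal Python sketch (with a made-up, non-working token) shows the key=value structure and the percent-encoding that produces those & and % characters:

```python
from urllib.parse import parse_qs

# Hypothetical, non-working SAS token: just a URL query string of key=value pairs.
sas_token = ("sv=2020-08-04&ss=b&srt=sco&sp=rl"
             "&se=2021-12-31T23%3A59%3A59Z&sig=FAKE%2Bsignature%3D")

params = parse_qs(sas_token)
print(sorted(params))   # ['se', 'sig', 'sp', 'srt', 'ss', 'sv']
print(params["se"][0])  # percent-encoding decoded: 2021-12-31T23:59:59Z
```

The `sig` field is the only Base64-like part; the token as a whole is plain query-string syntax, so a Base64 decoder rejects it.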

The “Credentials” option in the configuration of the Azure Blob Store Connection node is activated only if a “credentials” node of some type precedes it. I assumed that I could use a “credentials” node to specify the storage account or endpoint (the ABC part in https://ABC.blob.core.windows.net) and the SAS token, then connect that node to the Azure Blob Store Connection node, but I have not been successful. I am not sure which Credentials node to use.

Previously, I used the Azure Blob Store File Picker to locate the specific file with the data that I needed. Will that also work with Gen2 files?

Thanks in advance for your help!

Hi,
Currently that node does not support SAS tokens, but we already have a ticket in the backlog for it. Its ID is AP-7478. I have added a comment there to push it a bit!
Kind regards,
Alexander

Alexander, how can I check the status of AP-7478? Our company is also interested in SAS token authorization.

Hi,
Unfortunately, there is no public access to our internal bug tracker. The ticket number will turn up in the changelog, though.
Kind regards,
Alexander

Hi @JMH829 and @timoschenko.j,
Actually, the feature is available for our new Azure Blob Storage Connector node in KNIME 4.3! As Azure Data Lake Storage Gen2 is built on Azure Blob Storage, that node should work for you.
Kind regards,
Alexander

Hi @AlexanderFillbrunn I was able to use the new Azure Blob Storage Connector using KNIME 4.3.0! I did the following:

Microsoft Authentication node:
Configured with the authentication mode “Shared access signature (SAS) authentication (Azure Storage only)”, with the Blob service SAS URL specified as https://name_of_my_data_lake.blob.core.windows.net?sv={remainder of my SAS token}

connected to

Azure Blob Storage Connector node:
Configured in Settings with Working directory: /directoryname/subdirectory1/subdirectory2/ (the leading / and trailing / are important)

connected to

CSV Reader node (with a File System Connection port added on the left):
Configured to read a specific CSV file in subdirectory2, with additional settings specifying the layout of the CSV file
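For anyone who prefers to see the node chain in code form, here is a rough Python sketch of what the configuration amounts to. All names (account, token, file name) are hypothetical placeholders and will not authenticate against anything:

```python
# Hypothetical placeholders standing in for the real KNIME node settings.
account = "name_of_my_data_lake"           # the ABC in https://ABC.blob.core.windows.net
sas_token = "sv=2020-08-04&ss=b&sig=FAKE"  # hypothetical SAS token, starting at "sv="
working_dir = "/directoryname/subdirectory1/subdirectory2/"

# Microsoft Authentication node: the whole Blob service SAS URL goes in one field.
sas_url = f"https://{account}.blob.core.windows.net?{sas_token}"

# Azure Blob Storage Connector + CSV Reader: a relative file name resolves
# against the working directory (hence the leading and trailing slashes).
file_path = working_dir + "mydata.csv"

print(sas_url)    # https://name_of_my_data_lake.blob.core.windows.net?sv=2020-08-04&ss=b&sig=FAKE
print(file_path)  # /directoryname/subdirectory1/subdirectory2/mydata.csv
```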

Thanks to you and the KNIME developers who made this possible!

Hello,
I’m very happy to announce that we have just finished implementing the new Azure Data Lake Storage Gen2 Connector node. If you want to give it a try, download the nightly build and install the KNIME Azure Cloud Connectors extension. If we do not encounter any bigger problems, the node will be officially released with version 4.3.3 of the KNIME Analytics Platform.
We are eager to hear your feedback.
Bye
Tobias
