With Azure Blob Storage it's possible to generate a Shared Access Signature (SAS) with which you can grant a third party time-limited access to read (or write) a specific file in blob storage. You can also grant access to an entire container.

I blogged several years back about how to create a SAS token to allow upload of a blob, but things have moved on since then. Not only is there a brand new Blob Storage SDK, but there is also a new way to generate SAS tokens without the need to have the storage account key.

User Delegation SAS

The "standard" way to generate a SAS token is to use the storage account key. However, if you use "managed identities", which is something I recommend wherever possible as a security best practice, then your application doesn't have access to the storage account key. This means we need another way to generate shared access signatures.

This technique is called a "user delegation" SAS, and it allows you to sign the signature with Azure AD credentials instead of with the storage account key.

In this post I'll show the code to generate a user delegation SAS URI with the .NET Storage SDK. I'll also cover a few gotchas around the lifetime of those tokens, and how you can test this code running locally.

Generating a User Delegation SAS

The first step is connecting to storage using Azure AD credentials. The new Azure SDK makes this very easy with DefaultAzureCredential. This helper class basically tries a variety of techniques in order to source the credentials to access the storage account.

It first checks for environment variables, and if they are not present, it tries to use a managed identity (this is what you'd typically want to use in production if possible). But then it has a bunch of additional fallback options that are great for local development. It's able to use the credentials you logged into Visual Studio, Visual Studio Code or the Azure CLI with. So in most development environments, this should just work.

Here's how we use DefaultAzureCredential to create a BlobServiceClient:

var accountName = "mystorageaccount";
var blobEndpoint = $"https://{accountName}.blob.core.windows.net";
var credential = new DefaultAzureCredential();
var blobServiceClient = new BlobServiceClient(new Uri(blobEndpoint), credential);

Now, let's create a simple test file we can grant access to:

var containerClient = blobServiceClient.GetBlobContainerClient("mycontainer");
var blobClient = containerClient.GetBlobClient("secret/secret1.txt");
if (!await blobClient.ExistsAsync())
{
    using var ms = new MemoryStream(Encoding.UTF8.GetBytes("This is my secret blob"));
    await blobClient.UploadAsync(ms);
}

Now we need to generate the shared access signature. The first step is to create a user delegation key.

Note that the key can be valid for a maximum of 7 days. You get an error if you request a longer duration.

var userDelegationKey = await blobServiceClient
    .GetUserDelegationKeyAsync(DateTimeOffset.UtcNow,
                               DateTimeOffset.UtcNow.AddDays(7));

Now we have the user delegation key, we can use the BlobSasBuilder and BlobUriBuilder helpers to generate a URI that can be used to access the file. Here I'm asking for 7 days of access to this file. The lifetime of the SAS does not have to be the same as that of the user delegation key, but it cannot be longer. If you create a SAS URI with a longer lifetime than the user delegation key, you'll get a 403 error back.

var sasBuilder = new BlobSasBuilder()
{
    BlobContainerName = blobClient.BlobContainerName,
    BlobName = blobClient.Name,
    Resource = "b", // b for blob, c for container
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddDays(7),
};

sasBuilder.SetPermissions(BlobSasPermissions.Read |
                          BlobSasPermissions.Write);

var blobUriBuilder = new BlobUriBuilder(blobClient.Uri)
{
    Sas = sasBuilder.ToSasQueryParameters(userDelegationKey,
                                          blobServiceClient.AccountName)
};

var sasUri = blobUriBuilder.ToUri();

The SAS URI can then be used to download the file until either the SAS expires, or the user delegation key expires (whichever happens first).
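As mentioned at the start, you can also grant access to an entire container rather than a single blob. A minimal sketch of that variation, assuming the same userDelegationKey, containerClient and blobServiceClient as above (note the Resource of "c", no BlobName, and the container-specific permissions enum):

```csharp
// container-level SAS: Resource = "c" and no BlobName
var containerSasBuilder = new BlobSasBuilder()
{
    BlobContainerName = containerClient.Name,
    Resource = "c", // c for container
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddDays(1),
};
// container-level permissions use BlobContainerSasPermissions,
// which includes List for enumerating blobs in the container
containerSasBuilder.SetPermissions(BlobContainerSasPermissions.Read |
                                   BlobContainerSasPermissions.List);

var containerUriBuilder = new BlobUriBuilder(containerClient.Uri)
{
    Sas = containerSasBuilder.ToSasQueryParameters(userDelegationKey,
                                                   blobServiceClient.AccountName)
};
var containerSasUri = containerUriBuilder.ToUri();
```

The same lifetime rule applies: the container SAS must not outlive the user delegation key it is signed with.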

Here's some simple code you can use to check that the SAS URI you generated actually works:

var h = new HttpClient();
try
{
    var contentSas = await h.GetStringAsync(sasUri);
}
catch (HttpRequestException hrx)
{
    Console.WriteLine("FAILED TO DOWNLOAD FROM SAS: " + hrx.Message);
}

Testing locally

If you tried to follow along with the steps above, running locally and using DefaultAzureCredential, you may have found it doesn't work.

The first reason for this is that there is an additional step you need to do, which is to grant yourself either the "Storage Blob Data Contributor" or "Storage Blob Data Reader" role for the container you want to access. This might take you by surprise, as being the "owner" of the storage account is not sufficient.

You can easily test whether you have the required role with the following Azure CLI code. If you don't have the role, this check for whether a blob exists will fail:

$ACCOUNT_NAME = "mystorageaccount"
$CONTAINER_NAME = "mycontainer"

# use this to test if you have the correct permissions
az storage blob exists --account-name $ACCOUNT_NAME `
                        --container-name $CONTAINER_NAME `
                        --name blob1.txt --auth-mode login

Granting ourselves the role can be automated with the Azure CLI and is a useful thing to know how to do, as you'd need to do this to grant your managed identity this role as well.

First we need to get the Azure AD object ID for ourselves. I did this by looking myself up by email address:

$EMAIL_ADDRESS = '[email protected]'
$OBJECT_ID = az ad user list --query "[?mail=='$EMAIL_ADDRESS'].objectId" -o tsv

Next we'll need the identifier of our storage account, which we can get like this:

$STORAGE_ID = az storage account show -n $ACCOUNT_NAME --query id -o tsv

This returns a string containing the subscription id, resource group name and storage account name. For example: /subscriptions/110417df-78bc-4d9d-96cc-f115bf626cae/resourceGroups/myresgroup/providers/Microsoft.Storage/storageAccounts/mystorageaccount

Now we can use this to add ourselves to the "Storage Blob Data Contributor" role, scoped to this container only like this:

az role assignment create `
    --role "Storage Blob Data Contributor" `
    --assignee $OBJECT_ID `
    --scope "$STORAGE_ID/blobServices/default/containers/$CONTAINER_NAME"
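Role assignments can take a minute or two to propagate. If you want to confirm the assignment was created at the right scope before re-running your C# code, you can list the assignments for that principal (a sketch, reusing the variables from above):

```shell
# list the roles assigned to this principal at the container scope
az role assignment list `
    --assignee $OBJECT_ID `
    --scope "$STORAGE_ID/blobServices/default/containers/$CONTAINER_NAME" `
    --query "[].roleDefinitionName" -o tsv
```

You should see "Storage Blob Data Contributor" in the output once the assignment has been applied.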

There was one final gotcha that I ran into, which meant my C# code was still not working: DefaultAzureCredential was not selecting the correct Azure AD tenant id. Fortunately, it's possible to customize the Visual Studio tenant id, which finally allowed me to generate the user delegation SAS locally.

var azureCredentialOptions = new DefaultAzureCredentialOptions();
azureCredentialOptions.VisualStudioTenantId = "2300dcff-6371-45b0-a289-3a960041603a";
var credential = new DefaultAzureCredential(azureCredentialOptions);


Managed identities are a much more secure way for your cloud resources to access Storage Accounts, but they do make some tasks, like generating a SAS, a bit more complex. However, I've shown here how we can assign the necessary role to our local user (or managed identity), and write C# code to generate a user delegation key, allowing us to generate SAS tokens without ever needing to see the Storage Account Key. With this technique you are limited to SAS tokens with a maximum lifetime of 7 days, but generating very long-lived SAS tokens isn't good security practice anyway, so this limitation pushes you in the direction of more secure coding practices.

Want to learn more about the Azure CLI? Be sure to check out my Pluralsight course Azure CLI: Getting Started.


Comment by Garth

Hi Mark - thanks for the great post. I've been moving away from the account keys and over to generate SAS tokens via the user delegation key and found that I needed to also have the role Storage Blob Delegator to get the GetUserDelegationKeyAsync call to work. We're using the v12.10 SDK for blob storage.
Ignore that last bit, contributor role is all that's needed. Thanks!