I’ve been exploring the capabilities of the Azure CLI recently and today I’m going to look at working with blob storage. To catch up on previous instalments check out these articles:
- Create and Configure a VM with the Azure CLI
- Automate away your Azure Portal usage with the Azure CLI
- Using Queries with the Azure CLI
- Introducing the Azure CLI
Creating a Storage Account
The first thing we want to do is create a storage account. We need to choose a “sku” – whether we need geo-redundant storage or not. I’m just creating the cheaper LRS tier in this example. I’m also making a new resource group first to put the storage account in.
$resourceGroup="MyStorageResourceGroup"
$location="westeurope"
$storageAccount="mystorageaccount"

# create our resource group
az group create -n $resourceGroup -l $location

# create a storage account
az storage account create -n $storageAccount -g $resourceGroup -l $location --sku Standard_LRS
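One gotcha when scripting this: the storage account name must be globally unique across Azure, and can only contain 3-24 lowercase letters and digits. A quick bash sketch of a pre-flight check (the `isValidAccountName` helper is my own invention, not part of the CLI):

```shell
# storage account names: 3-24 characters, lowercase letters and digits only
isValidAccountName() {
  [[ "$1" =~ ^[a-z0-9]{3,24}$ ]]
}

isValidAccountName "mystorageaccount" && echo "name looks valid"
isValidAccountName "My-Storage" || echo "name rejected"
```

Note this only checks the format; whether the name is actually available can be confirmed with az storage account check-name.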
Next, we need to get the connection string, which is needed for all operations on blobs and containers:
$connectionString=az storage account show-connection-string -n $storageAccount -g $resourceGroup --query connectionString -o tsv
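The value that comes back is a semicolon-separated list of key=value pairs (protocol, account name, account key, endpoint suffix). If you ever need one field on its own in a script, a bash sketch like this works (the AccountKey below is a made-up placeholder, not a real key):

```shell
# a sample connection string - the AccountKey here is a fake placeholder
connectionString="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=cGxhY2Vob2xkZXI=;EndpointSuffix=core.windows.net"

# split on ';' and pull out the AccountName field
accountName=$(echo "$connectionString" | tr ';' '\n' | grep '^AccountName=' | cut -d'=' -f2)
echo "$accountName"   # mystorageaccount
```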
A convenient feature of the CLI is that you can set the connection string as an environment variable to save having to pass the
--connection-string parameter to every subsequent command.
Here’s how we do that in PowerShell:
$env:AZURE_STORAGE_CONNECTION_STRING = $connectionString
or if you’re in a bash shell:

export AZURE_STORAGE_CONNECTION_STRING=$(az storage account show-connection-string -n $storageAccount -g $resourceGroup --query connectionString -o tsv)
Creating Containers

Now we have a storage account, we can create some containers. The
--public-access flag allows us to set their privacy level. The default is
off for a private container, or you can set it to
blob for public access to blobs. There’s also a
container level which also allows people to list the contents of the container.
I’ll create a public and a private container:
az storage container create -n "public" --public-access blob
az storage container create -n "private" --public-access off
Uploading Blobs

Uploading a file into your container is easy with the
az storage blob upload command. You simply specify the name of the file to upload, the container to upload it into, and the name of the blob.
Here’s uploading a file into the public container and getting the URL from which it can be accessed:
# create a demo file
echo "Hello World" > example.txt

$blobName = "folder/public.txt"

# upload the demo file to the public container
az storage blob upload -c "public" -f "example.txt" -n $blobName

# get the URL of the blob
az storage blob url -c "public" -n $blobName -o tsv
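The URL returned follows a predictable pattern on the default endpoint: https, the account name, blob.core.windows.net, then the container and blob names. A small bash sketch composing it from the names used above:

```shell
storageAccount="mystorageaccount"
container="public"
blobName="folder/public.txt"

# blob URLs on the default endpoint are just account + container + blob name
blobUrl="https://${storageAccount}.blob.core.windows.net/${container}/${blobName}"
echo "$blobUrl"   # https://mystorageaccount.blob.core.windows.net/public/folder/public.txt
```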
If we upload a file to the private container, we’ll need to also generate a SAS token in order to download it via a URL. We do that with
az storage blob generate-sas, passing in an expiry date and the access permissions (in our case, we just need
r for read access).
$blobName = "folder/private.txt"

# upload the demo file to a private container
az storage blob upload -c "private" -f "example.txt" -n $blobName

# get the blob URL
$url = az storage blob url -c "private" -n $blobName -o tsv

# generate a read-only SAS token
$sas = az storage blob generate-sas -c "private" -n $blobName `
    --permissions r -o tsv `
    --expiry 2017-10-15T17:00Z

# launch a browser to access the file
Start-Process "$($url)?$($sas)"
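The SAS token itself is just a URL query string, with fields like sp (permissions), se (expiry) and sig (the signature), so the Start-Process call above simply appends it to the blob URL after a ?. A bash sketch with a fabricated token to show the shape of the final shareable link:

```shell
url="https://mystorageaccount.blob.core.windows.net/private/folder/private.txt"
# a fabricated SAS token - sp is permissions, se the expiry, sig the signature
sas="sp=r&se=2017-10-15T17%3A00Z&sr=b&sig=FAKESIGNATURE"

# the shareable link is simply the blob URL with the token as its query string
fullUrl="${url}?${sas}"
echo "$fullUrl"
```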
More Blob Operations
Of course, there’s much more you can do with blobs from the Azure CLI, and you can explore the full range of options with
az storage blob -h. You’ll see that we can easily download, delete, and snapshot blobs, as well as manage their metadata or even work with leases.
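For example, downloading and then deleting the blob we uploaded earlier might look like this (a sketch reusing the container and blob names from above; like the previous commands, it relies on the connection string environment variable being set):

```shell
# download the private blob to a local file
az storage blob download -c "private" -n "folder/private.txt" -f "downloaded.txt"

# list the blobs in the public container
az storage blob list -c "public" -o table

# delete the blob when we're done with it
az storage blob delete -c "private" -n "folder/private.txt"
```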
For ad-hoc storage tasks, Azure Storage Explorer is still a great tool, but if you need to upload or download blobs as part of a deployment or maintenance task, the CLI is a great way to automate that process and ensure it is reliable and repeatable.