I've recently been doing some performance testing of various ways of uploading large files to Azure Blob Storage, and I thought it would be good to use the AzCopy utility as a benchmark, as it has a reputation for being one of the fastest ways of copying blobs.

In this post I'll explain the steps I took to upload a large file into blob storage with AzCopy.

Installing AzCopy

You can download the latest version of AzCopy from the official download page, or if you have Chocolatey installed, you can use that to install AzCopy V10 (the latest version at the time of writing).

choco install azcopy10
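Once it's installed, a quick way to confirm AzCopy is on your path is to ask it for its version:

```shell
# confirm AzCopy is installed and available on the PATH
azcopy --version
```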

Logging in

AzCopy supports SAS tokens, so if you know how to generate those, you can use them to authorize operations. But it also has an Azure Active Directory authentication option, which I hadn't tried before, so I wanted to give it a go.
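As a sketch of the SAS approach, you append the token to the destination URL. The storage account, container and token values below are placeholders, not real names:

```shell
# upload using a SAS token appended to the destination URL
# <storage-account>, <container> and <sas-token> are placeholders
azcopy copy 'C:\Users\Mark\Documents\Very large file.mp4' `
    'https://<storage-account>.blob.core.windows.net/<container>/Very large file.mp4?<sas-token>'
```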

You simply need to run the azcopy login command and follow the instructions. I found that I needed to explicitly set the tenant ID associated with my storage account as well.

azcopy login --tenant-id 117d490b-c0d6-4ae7-9f05-4e3dab298486

Storage account permissions

You will of course need to ensure that your user has permission to write to the container, which involves assigning yourself a role such as "Storage Blob Data Contributor". You can either set this up in the portal, or automate it with the Azure CLI. You can set the scope to just the container (as shown below) or to the storage account as a whole.

az role assignment create `
    --role "Storage Blob Data Contributor" `
    --assignee <email> `
    --scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"
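If you want to double-check that the assignment took effect, the Azure CLI can list role assignments at the same scope (placeholders as above):

```shell
az role assignment list `
    --assignee <email> `
    --scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"
```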

Uploading the file

Once we are authenticated, it's as simple as calling the azcopy copy command, specifying the source file and the target destination.

azcopy copy 'C:\Users\Mark\Documents\Very large file.mp4' `
    'https://<storage-account>.blob.core.windows.net/<container>/Very large file.mp4'

It will produce a log file giving you detailed information about the upload, including all the calls it made, as well as statistics relating to performance.
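If the default log is too verbose, AzCopy's --log-level flag controls how much gets written (it accepts values such as INFO, WARNING and ERROR). For example:

```shell
# reduce log output to errors only for this transfer
azcopy copy 'C:\Users\Mark\Documents\Very large file.mp4' `
    'https://<storage-account>.blob.core.windows.net/<container>/Very large file.mp4' `
    --log-level ERROR
```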

Performance observations

The upload speed is of course dependent on your local network. However, I was impressed that AzCopy managed to go almost 10 times faster than some of the other upload techniques I was comparing it against. The way it achieves this is by breaking the file into blocks (it used 8MB block sizes in my example) and uploading them in parallel. Then it uses the Put Block List operation to combine all of those blocks into a single block blob.
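To get a feel for how much parallelism that allows, you can work out how many 8 MB blocks a given file splits into. This Bash sketch assumes a 4 GB file purely for illustration:

```shell
# number of 8 MiB blocks needed for a 4 GiB file (rounding up)
FILE_SIZE=$((4 * 1024 * 1024 * 1024))
BLOCK_SIZE=$((8 * 1024 * 1024))
BLOCK_COUNT=$(( (FILE_SIZE + BLOCK_SIZE - 1) / BLOCK_SIZE ))
echo "$BLOCK_COUNT"   # 512 blocks, uploaded in parallel then committed with Put Block List
```

AzCopy also exposes a --block-size-mb flag on the copy command if you want to tune the block size yourself.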

By the way, AzCopy can do much more than uploading a single file. You can upload multiple files, download files, and copy between storage accounts with it. It's definitely worth considering for any situation where you need to move large amounts of data into or out of Azure Blob Storage.
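For example, here's what a download and a server-side copy between accounts look like (all account and container names are placeholders; --recursive is needed when copying a whole container or directory):

```shell
# download a single blob to a local file
azcopy copy 'https://<storage-account>.blob.core.windows.net/<container>/Very large file.mp4' `
    'C:\Users\Mark\Downloads\Very large file.mp4'

# server-side copy of a whole container between storage accounts
azcopy copy 'https://<source-account>.blob.core.windows.net/<container>' `
    'https://<dest-account>.blob.core.windows.net/<container>' `
    --recursive
```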