
If you are using the static web hosting capability of Azure blob storage, then you'll probably want a way to automate uploading your files to the storage account. One good option is the AzCopy utility, but I don't always have that installed.

Fortunately, the Azure CLI now has the capability to bulk upload files with the az storage blob upload-batch command. You upload the files to a container called $web (make sure you surround the container name with single quotes in PowerShell so the dollar sign isn't treated as a variable), and you specify the folder containing the files to be uploaded. In my case, it's a Vue.js application that builds to a folder called dist:

# name of the storage account hosting the static website (placeholder value)
$storageAccountName = "mystorageaccount"
$sourcePath = "dist"
az storage blob upload-batch -d '$web' -s $sourcePath --account-name $storageAccountName

And that's all there is to it. Super simple and convenient.

Note that this won't delete any existing files. For my static website that doesn't really matter, although over time it does mean that the js and css folders will fill up with old files with random names emitted by webpack. One solution would be to add a metadata property to each uploaded file with a unique identifier (e.g. by adding the extra argument --metadata version=20210108), and then afterwards delete any files that don't have that metadata value, as in the sketch below.
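Here's a rough sketch of that clean-up approach in PowerShell, reusing the $sourcePath and $storageAccountName variables from above. It tags each uploaded blob with a version stamp derived from the current date and time, then deletes any blob in the $web container whose version metadata doesn't match. The variable names and version format are just for illustration.

# generate a unique version stamp for this deployment
$version = Get-Date -Format "yyyyMMddHHmmss"

# upload the files, tagging each blob with the version metadata
az storage blob upload-batch -d '$web' -s $sourcePath --account-name $storageAccountName --metadata version=$version

# list all blobs (including their metadata) and delete any left over from a previous deployment
$blobs = az storage blob list -c '$web' --account-name $storageAccountName --include m -o json | ConvertFrom-Json
foreach ($blob in $blobs) {
    if ($blob.metadata.version -ne $version) {
        az storage blob delete -c '$web' -n $blob.name --account-name $storageAccountName
    }
}

Bear in mind that deleting blobs immediately after a deployment means visitors who loaded the previous version of the page might get 404s for the old js and css files, so you may prefer to only clean up files older than a few deployments.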

Want to learn more about the Azure CLI? Be sure to check out my Pluralsight course Azure CLI: Getting Started.