In serverless architectures, it's quite common to use a file storage service like Azure Blob Storage or Amazon S3 instead of a traditional web server. One of the main attractions of this approach is cost: you pay only for the data you store and transfer, and there are no fixed monthly fees.
To get started with this approach, we need to create a storage account, copy our static web content into a container, and make sure that container is marked as public.
In this post, I’ll show how that can be done with a mixture of PowerShell and the AzCopy utility.
The first task is to create a storage account.
Step 1 - get connected and pick the subscription we are working with
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "MySubscription"
Step 2 - create a resource group in our preferred location
$resourceGroupName = "MyResourceGroup"
$location = "northeurope"
New-AzureRmResourceGroup -Name $resourceGroupName -Location $location
Step 3 - create a storage account and put it into our resource group
$storageAccountName = "mytempstorage" # has to be unique
New-AzureRmStorageAccount -ResourceGroupName $resourceGroupName -AccountName $storageAccountName -Location $location -Type "Standard_ZRS"
Step 4 - get hold of the storage key; we'll need it to call AzCopy
$storageKeys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
$key = $storageKeys[0].Value
Now we want to copy our static web content into a container in our storage account. There are PowerShell commands that let us do this file by file, but a much easier way is to use the AzCopy utility, which you'll need to download and install first.
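For completeness, the file-by-file approach might look something like this sketch (it reuses the variables defined elsewhere in this post and assumes the AzureRM storage cmdlets are installed):

```powershell
# Build a storage context from the account name and key we retrieved earlier
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $key

# Create the container if it doesn't already exist (suppress the error if it does)
New-AzureStorageContainer -Name $containerName -Context $context -ErrorAction SilentlyContinue

# Upload each file, preserving the folder structure in the blob name
Get-ChildItem -Path $websiteFolder -Recurse -File | ForEach-Object {
    $blobName = $_.FullName.Substring($websiteFolder.Length).Replace("\", "/")
    Set-AzureStorageBlobContent -File $_.FullName -Container $containerName -Blob $blobName -Context $context -Force
}
```

This works fine for a handful of files, but AzCopy is noticeably faster for larger sites.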
Next, we specify the source folder containing our static web content, the destination address in blob storage, and the access key for writing to that container. We need a few flags as well –
/S to recurse through subfolders,
/Y to suppress confirmation prompts, and
/SetContentType to set each blob's MIME type based on its file extension.
$azCopy = "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"
$websiteFolder = "D:\Code\MyApp\wwwroot\"
$containerName = "web"
& $azCopy /Source:$websiteFolder /Dest:https://$storageAccountName.blob.core.windows.net/$containerName/ /DestKey:$key /S /Y /SetContentType
You might think we’re done, but we still need to set the container's access level to “blob” so that its contents are publicly readable without the need for SAS tokens. We can do this with Set-AzureStorageContainerAcl, but that command operates on the “current” storage account, so first we need to call Set-AzureRmCurrentStorageAccount to specify which account that is.
Set-AzureRmCurrentStorageAccount -StorageAccountName $storageAccountName -ResourceGroupName $resourceGroupName
Set-AzureStorageContainerAcl -Name $containerName -Permission Blob
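As a quick sanity check, we can read the setting back and confirm the container's PublicAccess level is now Blob (this still relies on the current storage account set above):

```powershell
# Displays the container along with its PublicAccess level
Get-AzureStorageContainerAcl -Name $containerName
```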
Now we launch our website, and we should see it running in the browser, downloading its assets directly from our blob storage container:
Start-Process -FilePath "https://$storageAccountName.blob.core.windows.net/$containerName/index.html"
The next step you’d probably want to take is to point a custom domain at this container. Unfortunately, Azure Blob Storage doesn’t directly support this (at least not if we want to use HTTPS), but there are a couple of workarounds: one is to use Azure Functions Proxies, and the other is to put Azure CDN in front of the container. Both add a small extra cost, but it’s still a serverless “pay only for what you use” pricing model, so it should still work out more cost-effective than hosting with a traditional web server.
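To illustrate the Azure CDN route, a sketch using the AzureRM.Cdn cmdlets might look like this. The profile and endpoint names here are hypothetical, and you'd still need to map your custom domain to the endpoint afterwards in the portal or via PowerShell:

```powershell
# Create a CDN profile to hold our endpoint
New-AzureRmCdnProfile -ProfileName "MyCdnProfile" -ResourceGroupName $resourceGroupName -Location $location -Sku Standard_Verizon

# Create an endpoint whose origin is the blob storage account
New-AzureRmCdnEndpoint -EndpointName "mytempstorage-web" -ProfileName "MyCdnProfile" -ResourceGroupName $resourceGroupName -Location $location -OriginName "blobOrigin" -OriginHostName "$storageAccountName.blob.core.windows.net"
```

The site would then be reachable through the CDN endpoint's azureedge.net address, and a custom domain with HTTPS can be attached to that endpoint.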
Hopefully this tutorial gives you a way to get started automating the upload of your SPA to blob storage. There are plenty of alternative ways to achieve the same thing, but you may find this a quick and easy route to hosting your static web content in blob storage.