One of the great things about version 2 of the Azure Functions runtime is that it runs on .NET Core, which means it is cross-platform. This is great for anyone wanting to use the Azure Functions Core Tools on a non-Windows platform, but it also opens up the possibility of running your Azure Function App in a Docker container.

Why Docker?

Now you might be wondering - why would you even want to do this? After all, the "consumption plan" is a superb way to host your Function App, bringing you all the benefits of serverless - you don't need to provision any infrastructure yourself, you pay only while your functions are running, and you get automatic scale out.

None of this is true if you choose to host your Function App in a Docker container. However, it does open the door to hosting in a lot more environments than were previously possible. You can use it to host Function Apps on premises or in other cloud providers for example. Or maybe you're using something like AKS for all your other services, and would just like the consistency of everything being packaged as a Docker container.

Creating a Dockerfile

The easiest way to create a Dockerfile for an Azure Function app is to install the Azure Functions Core Tools (you will need v2), and run the func init --docker command.

This will prompt you for what worker runtime you want - the choices are currently dotnet, node or java - choose dotnet if you're writing your functions in C# or F#, and node if you're writing in JavaScript.
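If you already know which runtime you want, you can skip the prompt by passing it on the command line - a quick sketch (the app name here is just an example):

```shell
# scaffold a new Function App with a Dockerfile, skipping the interactive prompt
func init MyFunctionApp --docker --worker-runtime dotnet
cd MyFunctionApp
```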

This will create a new empty Function App, including a Dockerfile. The only real difference depends on which worker runtime you chose, as a different base image is used - for dotnet it's microsoft/azure-functions-dotnet-core2.0 and for node it's microsoft/azure-functions-node8.

Here's the Dockerfile it creates for a node Function App. It simply sets an environment variable and then copies everything into the /home/site/wwwroot folder.

FROM microsoft/azure-functions-node8:2.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot
COPY . /home/site/wwwroot
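Since this Dockerfile copies the entire directory, it's worth adding a .dockerignore alongside it so local-only files (especially local.settings.json, which can contain secrets) don't end up in the image. The exact entries are just a suggestion:

```
local.settings.json
.git
.vscode
```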

I took a very slightly different approach for my C# Function App, allowing me to do a docker build from the root directory of my project, and copying the contents of the release build of my Function App. Obviously it's just a matter of preference how you set this up:

FROM microsoft/azure-functions-dotnet-core2.0:2.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot
COPY ./bin/Release/netstandard2.0 /home/site/wwwroot
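Because this Dockerfile copies an existing release build rather than compiling inside the container, you need to produce that build first, e.g.:

```shell
# create the release build that the Dockerfile's COPY step expects
dotnet build -c Release
```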

Build and run locally

Building the image is very simple - just issue a docker build command, and give it a name and tag:

docker build -t myfuncapp:v1 . 

And then running your Docker container locally is again very straightforward. You might want to expose a port if you have HTTP-triggered functions, and set up environment variables for any connection strings your functions need:

docker run -e MyConnectionString=$connStr -p 8080:80 myfuncapp:v1 
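Once the container is up, you can exercise an HTTP-triggered function through the mapped port. The route below is hypothetical, so substitute your own function's route:

```shell
# call an HTTP-triggered function on the locally mapped port
# (replace api/todo with your function's actual route)
curl http://localhost:8080/api/todo
```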

Try it out in Azure Container Instances

Once you've created your Docker image, you can easily get it up and running in Azure, with Azure Container Instances and the Azure CLI (check out my Pluralsight courses on ACI and Azure CLI for tutorials on getting started with these).

I've created a very simple Azure Functions demo app, and created a container for it which is available here. It implements a simple TODO API, and uses Azure Functions proxies to proxy through to a static webpage (hosted externally in blob storage) that can be used to test the API. It stores its data in table storage, so it does need a storage account.

Here's some PowerShell that uses the Azure CLI to create a resource group and a storage account (to store the todo items), and then creates a container to run the Function App. The container needs two environment variables: one with the connection string of the storage account, and one to tell the proxy where to find the static web content. Currently I've put that content in a public blob container, but it won't necessarily be available forever, so if you want to try this out yourself, you may need to copy the website's static HTML, CSS and JavaScript content from my GitHub repository to a public blob and set the proxy destination to point there.

# Create a resource group to store everything in
az group create -n $resGroup -l $location

# create a storage account
az storage account create -n $storageAccount -g $resGroup -l $location --sku Standard_LRS

# get the connection string
$connStr=az storage account show-connection-string -n $storageAccount -g $resGroup --query connectionString -o tsv

# create a container
# point it at the storage account (AzureWebJobsStorage) and at static web content (WEB_HOST)
# ($webHost should hold the URL of your static web content)
az container create `
-n $containerName `
-g $resGroup `
--image markheath/serverlessfuncs:v2 `
--ip-address public `
--ports 80 `
--dns-name-label serverlessfuncsv2 `
-e AzureWebJobsStorage=$connStr WEB_HOST=$webHost

# check if the container has finished provisioning yet
az container show -n $containerName -g $resGroup --query provisioningState
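
# rather than re-running the show command by hand, you could poll until
# provisioning completes (a rough sketch, not part of the original script)
while ((az container show -n $containerName -g $resGroup --query provisioningState -o tsv) -ne "Succeeded") {
    Write-Host "Waiting for container to provision..."
    Start-Sleep -Seconds 10
}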

# get the container's fully qualified domain name
az container show -n $containerName -g $resGroup --query ipAddress.fqdn -o tsv

# check the container logs
az container logs -n $containerName -g $resGroup

# to clean up everything (ACI containers are paid per second, so you don't want to leave one running long-term)
az group delete -n $resGroup -y

If all goes well, here's what you should see. The Function App also has a scheduled function that deletes completed tasks every five minutes:


Want to learn more about how easy it is to get up and running with Azure Container Instances? Be sure to check out my Pluralsight course Azure Container Instances: Getting Started.