
One of the questions I frequently get asked by people who watch my Durable Functions Fundamentals Pluralsight course is whether you can use dependency injection with Durable Functions (as my demo app uses static methods). The answer is yes, and it's quite simple although there are a couple of considerations about logging that are worth pointing out.

In this post I'll give a quick overview of the main steps, and you can get more details on the official docs site if you'd like to dive further into the topic of dependency injection in Azure Functions.

UPDATE: I should point out that in this post I am using the in-process Azure Functions programming model on .NET Core 3.1, rather than the out-of-process .NET 5 model, because the out-of-process model does not currently support Durable Functions. Support is planned, but since the out-of-process model uses a different approach to setting up dependency injection, this tutorial does not apply to .NET 5 Azure Function apps.

Step 1 - Add NuGet references

First of all, if you've already created your Azure Function app, you need to add two NuGet references to your csproj file: Microsoft.Azure.Functions.Extensions and Microsoft.Extensions.DependencyInjection. Here's the full list of package references from my demo project, including the Durable Functions extension and Functions SDK that will already be present:

<PackageReference Include="Microsoft.Azure.Functions.Extensions" Version="1.1.0" />
<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.DurableTask" Version="2.5.0" />
<PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="5.0.1" />
<PackageReference Include="Microsoft.NET.Sdk.Functions" Version="3.0.13" />

Step 2 - Register Services at Startup

The next step is to create a Startup class derived from FunctionsStartup and override the Configure method. In here you can set up whatever dependencies you need with AddSingleton or AddTransient.

The example I show below also calls AddHttpClient to register IHttpClientFactory. And you could even register a custom logger provider here if you need that.

Note that we also need to add an assembly level attribute of FunctionsStartup that points to the custom Startup class.

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

[assembly: FunctionsStartup(typeof(DurableFunctionApp.Startup))]

namespace DurableFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddHttpClient(); // registers IHttpClientFactory
            builder.Services.AddSingleton<IGreeter>(_ => new Greeter());
        }
    }
}
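For reference, IGreeter here is just a demo dependency of my own; a minimal sketch of what it might look like is:

```csharp
// A hypothetical dependency used to illustrate DI - not part of any library
public interface IGreeter
{
    string Greet(string name);
}

public class Greeter : IGreeter
{
    public string Greet(string name) => $"Hello {name}!";
}
```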

Step 3 - Injecting Dependencies

Injecting dependencies is very simple. Instead of defining functions as static methods on a static class, just create a regular class with a constructor that takes the dependencies and stores them as class members. These can then be used in the functions themselves which are now instance methods.

In this example I'm using the injected IGreeter in an activity function. You can use dependencies in orchestrator functions as well, but remember that the strict rules of orchestrator functions must still be adhered to.

public class MyOrchestration
{
    private readonly IGreeter greeter;

    public MyOrchestration(IGreeter greeter)
    {
        this.greeter = greeter;
    }

    [FunctionName("SayHello")]
    public string SayHello([ActivityTrigger] string name, ILogger log)
    {
        log.LogInformation($"Saying hello to {name}.");
        return greeter.Greet(name);
    }
}
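To show how such an activity might be invoked, here's a hedged sketch of an orchestrator function calling SayHello (the "RunOrchestrator" name is my own choice, not from the demo):

```csharp
// Sketch only: an orchestrator that calls the SayHello activity.
// Orchestrator code must be deterministic, so the injected greeter
// is only ever used inside the activity function, never here.
[FunctionName("RunOrchestrator")]
public async Task<string> RunOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    return await context.CallActivityAsync<string>("SayHello", "World");
}
```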

And that's all there is to it, although I did say I'd mention a few gotchas with logging.

Gotchas - injecting loggers

When you set up dependency injection in an Azure Functions project, you might be tempted to attempt to inject an ILogger using the class constructor, rather than having an ILogger as a parameter on every function. If you do this you'll run into a few problems.

First, you can't inject the non-generic ILogger - that doesn't get registered by default. Instead you have to inject an ILogger&lt;T&gt; - so ILogger&lt;MyFunctions&gt; for example.

public class MyFunctions
{
    private readonly ILogger<MyFunctions> logger;
    private readonly IGreeter greeter;

    public MyFunctions(ILogger<MyFunctions> logger, IGreeter greeter)
    {
        this.logger = logger;
        this.greeter = greeter;
    }
}

Second, the logs that you write to that ILogger<T> will get filtered out unless you update your host.json file to include logs for your namespace. In this example we're turning logging on for the MyDurableFunctionApp namespace.

{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingExcludedTypes": "Request",
      "samplingSettings": {
        "isEnabled": true
      }
    },
    "logLevel": {
      "MyDurableFunctionApp": "Information"
    }
  }
}

And third, when you use ILogger in an orchestrator function, the best practice is to use CreateReplaySafeLogger on the IDurableOrchestrationContext. UPDATE - I initially thought that this doesn't work with an ILogger<T> but I was mistaken. The code snippet below shows how to create a replay safe logger from the injected logger.

private readonly ILogger<MyFunctions> injectedLogger; // set up in the constructor

[FunctionName("RunOrchestrator")]
public async Task<List<string>> RunOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    var outputs = new List<string>();
    var log = context.CreateReplaySafeLogger(injectedLogger);

    log.LogInformation("about to start orchestration...");
    // ...
}

It may be that there are some other ways round these issues, so do let me know in the comments.

Want to learn more about how easy it is to get up and running with Durable Functions? Be sure to check out my Pluralsight course Azure Durable Functions Fundamentals.


I've been a bit quiet here on my blog for the past few months, partly because I've had plenty of stuff outside of work keeping me busy, and partly because I've been working away at several updates to my Pluralsight courses.

The downside of creating courses about Azure is that it is an extremely fast-moving space. New features and services are constantly being added to the platform. The portal changes very frequently, so I've had plenty of demos that needed re-recording to show the updated UI.

One particularly nice thing about this round of updates is that some of the demos are shorter! It's nice to see the need for workarounds and complex pre-requisite setups being removed. I think the next few years of updates to cloud services need to be more focused on simplification and ease of use, rather than adding loads of additional features, so it's pleasing to see that happening with many of them.

Anyway, here's a quick rundown of what's changed in the six Pluralsight courses I've updated recently:

Durable Functions Fundamentals

I recorded the original version of my Durable Functions Fundamentals course just before version 2 of Azure Functions was released, and there were also a couple of small breaking changes to the Durable Functions extension itself. So this update was my largest, re-recording all the demos with the latest versions of Visual Studio, Azure Functions and the Durable Functions extension.

Durable Functions remains one of my favourite capabilities of Azure Functions, and is well worth considering if you are implementing any kind of long-running business workflows. It's great to see Durable Functions continue to improve, and it is now much easier to host in a containerized environment, so you can benefit from its capabilities even if you're not hosting in Azure.

Deploying and Managing Containers

My Microsoft Azure Developer: Deploying and Managing Containers course covers a basic introduction to Docker, and then surveys all the many ways you can run containers in Azure. Many of the demo recordings have been updated to reflect changes in tooling and base container image names.

Probably the biggest change is that Azure Service Fabric Mesh has been retired. In one sense this feels like a real shame, as I thought it was a great idea, providing an easy-to-use serverless containerized microservices hosting platform. However, I think that the idea behind Service Fabric Mesh lives on, and we are seeing the emergence of similar platforms based on Kubernetes instead, that will hopefully soon offer all the benefits and simplicity that Service Fabric Mesh promised.

I also updated the Azure Kubernetes Service demos to reflect the many changes in the portal. One really nice simplification is that now you can view and manage Kubernetes resources directly within the Azure Portal - removing some of the complexities of setting up the old dashboard experience.

Create Serverless Functions

My Microsoft Azure Developer: Create Serverless Functions course also had a fairly substantial update. I focused particularly on updating the Visual Studio and VS Code demos, as well as the places where the portal was shown. I also updated the module on containerizing Azure Functions as the base image names have changed.

Microservices Fundamentals and Building Microservices

I also updated two of my microservices courses, Microservices Fundamentals and Building Microservices. These courses are intended to teach the principles of microservices rather than showing implementation specifics, so there were fewer changes needed.

However, the reference demo application eShopOnContainers has been updated and improved somewhat since I first recorded the course. So I have updated all the demo recordings to show a newer version of eShopOnContainers (the exact version I use is my fork on GitHub).

It's nice to see that the updated eShopOnContainers is simpler to work with. The docker build command is simpler, running it with WSL 2 on Windows requires less setup effort, and the integration tests run out of the box in Visual Studio 2019 now thanks to the built-in container integration that automatically starts up the dependent containerized services like RabbitMQ. One slight change of note is that when you access the homepage you need to visit it via host.docker.local instead of localhost or the identity microservice won't accept the redirect URL when you log in.

Implement Azure Functions (AZ-204)

My Microsoft Azure Developer: Implement Azure Functions course is a very short and focused course intended to provide the background information about Azure Functions necessary for taking the AZ-204 certification (Developing Solutions for Microsoft Azure). A new objective was recently added to the exam which is to implement custom handlers. To be honest, I was a little surprised this is expected knowledge for the exam as custom handlers are something that I think the majority of Azure Functions developers will not need to make use of. But it is useful to at least know they exist and what scenarios they can help with, so I added a short module explaining that.

What's next?

There are a few other Pluralsight courses of mine that would benefit from an update, as well as some ideas I have for new courses, so there may be some more courses released later this year. I have also submitted a few talk ideas for upcoming conferences - it's great to see that these are coming back, and I was particularly pleased to see that the South Coast Summit is being held very close to where I live - it would be great to see you there if you can make it along.


In this post I want to show how you can use the cross-platform Azurite Azure Storage emulator running as a Docker container to develop Durable Functions locally.


You may know that for many years there has been an Azure Storage Emulator that can be used for local development of Azure Functions on Windows. This is great as it saves you from having to create a real Azure Storage account whenever you want to experiment with Azure Functions.

This is especially valuable if you are developing Durable Functions because they make use of a wide range of Azure Storage features including blobs, queues and table storage in order to implement task hubs.

More recently, Azurite, which is cross-platform and open source, has emerged as the successor to the old Windows-only storage emulator. The official documentation states:

Azurite is the future storage emulator platform. Azurite supersedes the Azure Storage Emulator. Azurite will continue to be updated to support the latest versions of Azure Storage APIs.

However, until recently Azurite lacked a few important features needed to fully replace the older storage emulator. In particular, it did not support Table Storage, making it unsuitable for use with Durable Functions.

The good news is that there is now preview support for Table storage in Azurite. You can keep an eye on this issue to find out when it goes GA. Fortunately it already appears to be good enough to support local development of Durable Functions (and apparently Logic Apps too).

Running Azurite as a Container

There are several ways you can install and run Azurite including a VS Code extension or just with a simple npm install -g azurite.

But I wanted to try it out as a container, so I started it with the following command. Note that I've included port 10002, which is the port used by Table storage, now supported by Azurite and necessary for Durable Functions.

docker run -d -p 10000:10000 -p 10001:10001 -p 10002:10002 mcr.microsoft.com/azure-storage/azurite

Note that in this example I'm not mounting a volume. I could do that by adding another parameter of -v c:/azurite:/data, which would allow the contents of my emulated storage account to persist independently of the lifetime of the container. But I find that often I'm using the storage emulator for a quick throwaway experiment, and so it's nice to be able to easily clean up once I'm done.
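Putting that together, a volume-mounted version of the command would look something like this (the host path c:/azurite is just an example - use whatever folder suits you):

```shell
# run Azurite with blob (10000), queue (10001) and table (10002) ports exposed,
# persisting emulator data to a host folder so it survives container restarts
docker run -d \
  -p 10000:10000 -p 10001:10001 -p 10002:10002 \
  -v c:/azurite:/data \
  mcr.microsoft.com/azure-storage/azurite
```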

Testing it out with Durable Functions

You can try all this out very easily if you have the Azure Functions Core Tools installed.

  1. In an empty folder run func init and select dotnet as the runtime.
  2. Then run func new and select the DurableFunctionsOrchestration template
  3. Make sure Azurite is running (you can use the docker run command shown above)
  4. Run the function app with func start
  5. Call the starter function URI to initiate a new workflow
  6. Call the statusQueryGetUri returned by the starter function to check on the workflow progress. You should see that the workflow has completed already.
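The steps above assume the function app is pointing at the emulator. The local.settings.json that func init generates should contain the development storage connection string, which works against Azurite on its default ports:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
```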


If you're developing Azure Functions on a Mac or Linux, then it's definitely worth checking out Azurite. And even if you're developing on Windows and have been happy with the existing Storage Emulator, it's probably time to consider switching over to Azurite, as it is approaching feature parity, and going forward it will be the officially supported and recommended emulator.

Want to learn more about how easy it is to get up and running with Durable Functions? Be sure to check out my Pluralsight course Azure Durable Functions Fundamentals.