I'm really pleased to announce that my latest Pluralsight course has been released. In Durable Functions Fundamentals I show you everything you need to get started with developing and debugging durable workflows locally, how to implement patterns such as fan-out fan-in and waiting for human interaction, and how to deploy and monitor your Durable Functions online.
To celebrate the launch here's a quick rundown of my top 10 reasons why you should use Durable Functions for your serverless workflows:
1. Express your workflows in code
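The heart of this item is that the whole workflow definition lives in one orchestrator function, rather than being scattered across queue bindings and triggers. As a conceptual sketch (plain Python with hypothetical activity names, not the actual Durable Functions SDK), a chained workflow reads top to bottom in one place:

```python
# Conceptual sketch of "workflow as code" (plain Python, not the real
# Durable Functions API). Each activity is an ordinary function, and the
# orchestrator chains them so the entire workflow is readable in one place.

def charge_card(order):          # hypothetical activity
    return {**order, "charged": True}

def reserve_stock(order):        # hypothetical activity
    return {**order, "reserved": True}

def send_confirmation(order):    # hypothetical activity
    return {**order, "confirmed": True}

def orchestrate_order(order):
    """The workflow definition: call each activity in sequence,
    passing the result of one step into the next."""
    order = charge_card(order)
    order = reserve_stock(order)
    return send_confirmation(order)
```

In real Durable Functions code the orchestrator awaits each activity call, but the shape is the same: the sequence of steps is explicit in one function.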
2. Retry activities
Another great feature of Durable Functions is the support for retrying activities with back-off. That used to be awkward to implement in regular Azure Functions, but with Durable Functions it's trivially easy to add retries to both individual activity functions and to sub-orchestrations, giving your workflows much greater resilience to transient errors.
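To make the idea concrete, here's a minimal sketch of retry with exponential back-off in plain Python (the parameter names are illustrative, not the real Durable Functions retry options):

```python
import time

def call_with_retry(activity, arg, max_attempts=4,
                    first_interval=0.01, backoff_coefficient=2.0):
    """Retry an activity with exponential back-off -- the behaviour
    Durable Functions gives you declaratively via retry options.
    (Illustrative sketch; names here are not the real API.)"""
    delay = first_interval
    for attempt in range(1, max_attempts + 1):
        try:
            return activity(arg)
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: let the failure propagate
            time.sleep(delay)
            delay *= backoff_coefficient  # wait longer before each retry

# Usage: a hypothetical activity that fails twice with a transient
# error and then succeeds on the third attempt.
attempts = {"n": 0}

def flaky_activity(x):
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient error")
    return x * 2
```

With Durable Functions you get this behaviour without writing the loop yourself: you supply the retry options alongside the activity call.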
3. Run activities in parallel
In a typical multi-step workflow there are probably some activities that can be performed in parallel, but without a framework like Durable Functions, implementing the "fan-in" part of a "fan-out fan-in" pattern is complex and risks introducing race conditions.
With Durable Functions, running activities (or whole sub-orchestrations) in parallel is easy to accomplish, and combined with the power of Azure Functions to scale out, brings the potential for dramatic speedup in the end-to-end time your workflows take to complete.
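The fan-out fan-in shape can be sketched with stdlib `asyncio` (a conceptual stand-in for awaiting a list of activity tasks, not the Durable Functions API itself):

```python
import asyncio

async def process_item(item):
    # Hypothetical activity: pretend each item takes a little while.
    await asyncio.sleep(0.01)
    return item * item

async def fan_out_fan_in(items):
    """Fan out one task per item, then fan in by awaiting them all.
    Durable Functions expresses the same pattern by collecting activity
    tasks in a list and awaiting them together."""
    tasks = [asyncio.create_task(process_item(i)) for i in items]
    return await asyncio.gather(*tasks)  # fan-in: wait for every result
```

The framework handles the hard part - knowing when the last parallel task has finished - so you never have to hand-roll the race-prone fan-in bookkeeping yourself.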
4. Timeout workflows
Sometimes in a workflow you're waiting for some kind of external event - maybe for a human to respond, or for an external system to send you a message - but you want to time out if you don't receive the event within a certain time period. Durable Functions makes this pattern straightforward to implement, allowing you to detect that an orchestration has got stuck and take some kind of mitigating action to get it moving again, or to alert a system administrator.
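The pattern is a race between the external event and a timer, whichever completes first. A minimal stdlib sketch of that idea (not the Durable Functions API, which uses durable timers and external-event tasks):

```python
import asyncio

async def wait_for_approval(event: asyncio.Event):
    # Stand-in for waiting on an external event (e.g. a human approval).
    await event.wait()
    return "approved"

async def approval_step(event, timeout_seconds):
    """Race the external event against a timeout. If the event never
    arrives, take a mitigating action instead of waiting forever."""
    try:
        return await asyncio.wait_for(wait_for_approval(event),
                                      timeout_seconds)
    except asyncio.TimeoutError:
        return "escalated"  # e.g. alert a system administrator
```

In Durable Functions the timer survives process restarts, so the "race" works even when the wait is measured in days rather than seconds.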
5. State management for free
Workflows inherently have state associated with them - you need to know where you've got to in the workflow in order to decide what the next step is whenever an activity completes. Durable Functions transparently manages workflow state for you, meaning you can implement complex workflows without needing your own database at all. Of course, if you do have a database, you may still want to update it during your workflows, to track the state of your business entities, but you don't need to manage the state for the orchestrations themselves.
6. Check on workflow progress with REST API
If you've ever built out a workflow with regular Azure Functions, you'll know it can be a real pain to work out where in the pipeline you currently are. This can be especially important for troubleshooting if a workflow has got stuck: how far through did it get before failing?
With Durable Functions you can use the query status API to find out whether an orchestration is still running. The query status API includes a showHistory flag to request the history of the workflow, allowing you to see exactly where it got to before it got stuck or failed.
Even better, there is now a SetCustomStatus API, allowing you to store an arbitrary JSON object at any point in your workflow representing its current status. This is a great tool for diagnosing why an orchestration is taking longer than expected to complete.
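The status query is part of the built-in HTTP API exposed by the Durable Functions extension. As a small sketch, here's how the query URL is shaped (the base URL and instance id are placeholders; real requests also carry a taskHub, connection and code query string supplied by the runtime):

```python
from urllib.parse import urlencode

def status_query_url(base_url, instance_id, show_history=False):
    """Build the URL for the Durable Functions instance-status query.
    The route matches the built-in HTTP API; base_url and instance_id
    are placeholders for your function app and orchestration instance."""
    url = f"{base_url}/runtime/webhooks/durabletask/instances/{instance_id}"
    params = {}
    if show_history:
        params["showHistory"] = "true"  # include the execution history
    return url + ("?" + urlencode(params) if params else "")
```

A GET to that URL returns the orchestration's runtime status (plus your custom status if you've set one), and with showHistory the full list of events the orchestration has processed.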
7. Cancel workflows
If you've built a workflow out of regular Azure Functions chained together with queue messages, then cancelling it is not going to be easy. But with Durable Functions, the REST API includes a cancellation method making it really straightforward to abandon an in-progress workflow.
8. Serverless pricing model
Just because your workflows run for days at a time doesn't mean you need to pay for days of compute. In fact, in many long-running workflows, most of the time is spent just waiting around. Because Durable Functions is built on top of Azure Functions, you get all the benefits of a serverless pricing model: you only pay for the time your functions are actually running, and your orchestrator function invocations will all be extremely quick as they simply wake up, decide what the next step in the workflow is, and go straight back to sleep.
9. Versioning made easier
One of the hardest problems of implementing workflows is how to deal with versioning. If I make a breaking change to the workflow, what happens to in-flight orchestrations when I perform an upgrade? Durable Functions doesn't have a magic bullet to solve the problem, but it does provide several workable strategies for dealing with this issue. Currently I'm leaning towards just making a V2 version of my orchestrator functions and retiring the V1 orchestrator later once all old workflows have finished, but you can pick the versioning strategy that works best for you.
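That side-by-side strategy can be sketched as a simple dispatch: in-flight instances keep running the version they started on, while new instances pick up the latest. (Plain Python with placeholder orchestrators, purely to illustrate the idea.)

```python
def orchestrator_v1(data):
    # Original workflow logic (placeholder).
    return {"version": 1, "result": data}

def orchestrator_v2(data):
    # Workflow logic after the breaking change (placeholder).
    return {"version": 2, "result": data}

ORCHESTRATORS = {1: orchestrator_v1, 2: orchestrator_v2}
CURRENT_VERSION = 2

def start_workflow(data, version=None):
    """Side-by-side versioning: instances that started on V1 continue
    to run V1, while new instances default to the current version."""
    v = version if version is not None else CURRENT_VERSION
    return ORCHESTRATORS[v](data)
```

Once the last V1 orchestration has drained, the V1 orchestrator (and its entry in the table) can be retired.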
10. Develop and test locally
Finally, it's possible to develop and test your Durable workflows locally. You can get the full local debugging experience of stepping through orchestrator and activity functions, as well as examining the contents of your "task hub" (which can also be local if you are using the Azure Storage Emulator) using Storage Explorer.
When you do publish your workflows to Azure, then the Application Insights integration gives you access to rich and powerful querying capabilities on your Function App telemetry and logs.
If you're currently building workflows out of a series of Azure Functions triggering each other, then Durable Functions is a no-brainer. It really is a game-changer that makes development and management of your serverless workflows much easier. Do give it a try and if you're a Pluralsight subscriber then my new Durable Functions Fundamentals course will teach you the key concepts and provide lots of examples of the sorts of workflows you can build with Durable Functions.