
Reactive Web Applications with Azure WebJobs and Farmer

This week Ryan shows how quickly you can get up and running using Azure WebJobs to trigger your code in response to other Azure services such as Blobs and Queues, even integrating with Event Grid in a few easy steps.


Update 3/11/21 - The WebJobs SDK has been updated to use the newer Azure Storage SDK, so some details of this post are slightly different: you now have to register queues and blobs separately, and CloudBlockBlob is replaced with BlobClient.

Modern web applications often need to integrate with a variety of cloud services.

This communication often takes place over a REST API.

Whilst this is fine in many scenarios, the request-response model has limitations.

1. In order to know that anything has changed, we need to ask.

If our system needs to react in near-realtime, this means we need to write code to continuously poll external systems.

2. There is no central coordination / resilience

Nothing ensures that an instance of your application will respond to an event if it happens to be down / unavailable.

We may also want to schedule background tasks that occur in the future or on a regular schedule.

Again, if you want to ensure that all tasks are completed, this can be challenging.

Triggers

A solution to this problem is to use reactive 'triggers', which execute your functions based on some external criteria.

These triggers are usually orchestrated by an external service. This ensures that

  1. Only one instance is triggered at a time.

  2. Failure to confirm that the trigger has been processed leads to it being attempted again (known as at-least-once delivery).

Azure provides a range of triggers, such as

  • Blobs - Get a connection to a file when it is added to Blob storage.

  • Queues - Subscribe to a topic and receive push messages.

  • Timers - Execute a task on a regular schedule.

Check out the complete list for more info.

There are two ways of attaching these triggers to functions, which look and act very similarly.

1. Azure Functions

A Function App is 'serverless' and runs on demand. This means you only pay for what you use, so in many scenarios it can be an ideal way to perform occasional background tasks.

2. WebJobs

A WebJob is very similar to a Function App; however, it runs in the context of an existing application, for example a web app hosted in App Service or even a console app in a VM.

To add them to your application you just add the WebJobs SDK package to your ASP.NET app, register the services at Startup and include the functions in your app source code.

WebJobs Examples

I will assume here you are starting with, and are familiar with, the SAFE stack V2 template. That said, the SDK guidance is broadly similar across any ASP.NET application.

Azure

The WebJobs SDK is backed by Azure Storage services. This means you will need to configure a Storage account under your Azure subscription, and make its connection string available to your application.

You should not paste this connection string into your application or any configuration files. It should be loaded at runtime from a secure location such as Azure KeyVault. For more information, see my previous blog, although in a moment I will show you an even easier way to set up KeyVault than that example!

You could set the storage account and app settings up by hand through the web portal, but the SAFE template comes with Farmer which makes this process extremely easy and repeatable.

  1. Update Farmer
dotnet paket update Farmer -g Build
  2. Open build.fsx in the root of your solution.

Hint: If the intellisense in this file isn't working, build the project using dotnet fake build, then edit the file in any way to make it spring to life.

  3. Find the Azure build target.

  4. Add a new expression above the others (webApp, arm etc.) which defines the storage account you wish to create:

    let storage = storageAccount {
        name "myuniquestoragename"
    }
  5. Add a couple of lines to the existing webApp expression, and update the default name to something unique:
    let app = webApp {
        //... other stuff
        name "myuniquewebappname"
        setting "StorageConnectionString" storage.Key
        use_keyvault
    }

The setting line tells Azure to make the connection string available in your web app's settings dictionary.

The use_keyvault line tells it to actually store the connection string in KeyVault, so you only see a secure link to it in plain text in the portal.

  6. Finally, add the storage account to the arm expression at the bottom:
    let deployment = arm {
        //... other stuff
        add_resource storage
    }
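
Under the hood, the Azure build target hands this deployment to Farmer. For reference, here is a minimal sketch of what that ultimately boils down to (the resource group name is just a placeholder - the template's target already does the equivalent for you):

    // A sketch only - the SAFE template's Azure target already does the equivalent of this.
    deployment
    |> Deploy.execute "my-resource-group" Deploy.NoParameters
    |> ignore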

That's it! Now, if you run the Azure target (providing you have the Azure CLI installed), your deployment should take place. Once it is complete, you can visit the Azure Portal to see that the new resources have appeared as expected.

If you see any errors, read them carefully - they will almost certainly relate to the resource names you have chosen: either they are already in use, or they contain invalid characters or are too long. Names ultimately form part of URLs and so are fairly restricted. Also, the web app name is used in the automatic KeyVault provisioning, and KeyVault names cannot contain hyphens.

dotnet fake build -t Azure

You can test most of these things out locally using the Azure Storage Explorer to browse files and either Azurite or the older Storage Emulator to emulate Azure - more details below.
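
If you choose Azurite, it is distributed as an npm package; something along these lines should get a local emulator running on the default storage ports (exact flags may differ between Azurite versions):

npm install -g azurite
azurite --location ./.azurite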

App Setup

  1. The first thing you will need to do is install the WebJobs SDK.
dotnet paket add Microsoft.Azure.WebJobs.Extensions -p Server
dotnet paket add Microsoft.Azure.WebJobs.Extensions.Storage -p Server
dotnet paket install

From Paket 6 you can omit -p Server as long as you're in the Server folder.

  2. In the Server module, find the application expression at the bottom, and add the following line:
let app =
    application {
       // ... other stuff
        host_config configureHost
    }

Now we need to add the configureHost function above the application expression:

open Microsoft.Extensions.Hosting
open Microsoft.Extensions.Configuration

let configureHost (hostBuilder : IHostBuilder) =
    hostBuilder
        .ConfigureAppConfiguration(fun configBuilder ->
            let cfg = configBuilder.Build()
            // Grab the storage connection from your app settings
            let storageConn = cfg.["StorageConnectionString"] 
            // This is required for WebJobs
            configBuilder.AddInMemoryCollection (dict ["AzureWebJobsStorage",storageConn]) |> ignore
        )
        .ConfigureWebJobs(
            fun builder ->
                builder // You may not need all of these, depending on the triggers you are using in your project
                    .AddAzureStorageCoreServices() 
                    .AddAzureStorage()
                    .AddTimers()
                    |> ignore
            )
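
One related point: anything you want injected into the trigger class shown in the next section (your own services, typed clients and so on) must be registered with this same host. As a rough sketch - GreetingService is a purely hypothetical example - you could chain an extra ConfigureServices call onto the same hostBuilder:

open Microsoft.Extensions.DependencyInjection

// A purely hypothetical service, used only to illustrate constructor injection.
type GreetingService() =
    member this.Greet (name : string) = sprintf "Hello, %s!" name

// Chain this onto the hostBuilder inside configureHost so the service can be
// injected into the WebJobs class via its constructor.
let addAppServices (hostBuilder : IHostBuilder) =
    hostBuilder.ConfigureServices(fun _ctx services ->
        services.AddSingleton<GreetingService>() |> ignore)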

Timer Trigger

The WebJobs SDK will scan your assembly for any functions which are decorated with Trigger attributes, and wire them up for you.

These can just be normal F# functions. There is a limitation with that approach, however.

Because these functions will be called outside of a web request, you have no HttpContext from which to resolve services and configuration settings etc.

The solution to this is to use a class and add your trigger functions as members. You can then inject any registered services etc that you need through the constructor.

In a new file, create a class to hold your functions. The exact names you choose for the class and functions don't matter - the wiring is done via the attributes you set.

namespace WebJobs

open System
open Microsoft.Azure.WebJobs
open FSharp.Control.Tasks.V2

type WebJobs ((*inject services / IConfiguration etc here*)) =

    member this.TimerFired ([<TimerTrigger "0 0/1 * * * *">] timer:TimerInfo) = task {
        printfn 
            "Timer fired at %O, next occurence is %O" 
            DateTime.Now 
            (timer.Schedule.GetNextOccurrence(DateTime.Now))
    }

The [<TimerTrigger>] attribute takes a six-field cron (NCRONTAB) expression which dictates the schedule. The example will trigger every minute, on the minute.

The TimerInfo is provided by the WebJobs SDK when it calls your method.
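
The same six-field NCRONTAB format (seconds first, unlike standard five-field cron) covers most schedules you are likely to need. For illustration, a hypothetical extra member with a five-minute schedule might look like this:

    // A few example NCRONTAB expressions ({second} {minute} {hour} {day} {month} {day-of-week}):
    //   "0 */5 * * * *"  -> every five minutes
    //   "0 0 * * * *"    -> every hour, on the hour
    //   "0 30 9 * * 1-5" -> 09:30 every weekday
    member this.EveryFiveMinutes ([<TimerTrigger "0 */5 * * * *">] timer : TimerInfo) = task {
        printfn "Five minute timer fired at %O" DateTime.Now
    }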

Local testing

Provided you have...

  1. Made sure one of the emulators mentioned earlier is running

  2. Set a "StorageConnectionString" setting with the value of "UseDevelopmentStorage=true" (see the blog I linked to earlier on config and secrets).

... then when you run your app locally you should see in the console that the method has been detected and registered, along with a printed list of the next few scheduled occurrences of the timer trigger.

dotnet fake build -t run

Blob Trigger

You may want to be notified when a new Blob has been added to your storage account, so that you can process it somehow.

  1. Add a container to your storage account to hold the blobs:
    let storage = storageAccount {
        //... other stuff
        add_private_container "blob-container"
    }

  2. Add the following method to your WebJobs class:

    // Add this open at the top of the file, alongside the existing ones.
    open Microsoft.Azure.Storage.Blob

    // This uses the container we added to our storage account using Farmer earlier.
    member this.BlobUploaded
        ([<BlobTrigger("blob-container")>] blob: CloudBlockBlob) = task {
            printfn "Blob created: %s" blob.Name
    }
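
The CloudBlockBlob parameter gives you full access to the blob, not just its name. As a hedged sketch, here is an alternative body for the same member that reads the contents (assuming the uploaded blob contains text):

    // An alternative body for the member above - assumes the blob contains text.
    member this.BlobUploaded
        ([<BlobTrigger("blob-container")>] blob: CloudBlockBlob) = task {
            let! contents = blob.DownloadTextAsync()
            printfn "Blob %s created with %d characters" blob.Name contents.Length
    }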

In order to test this, you will of course need to add a blob using the Storage Explorer - either to a container named "blob-container" that you have created in your emulator if running locally, or to the real storage account if you are working with an actual deployment.

Queue Trigger

An Azure Storage Queue can persist messages and attempt to re-deliver them until it receives an acknowledgement of success. This gives you the 'at least once' guarantee that is often required.

These messages can be posted from other applications, or commonly from Azure services themselves by subscribing a storage queue to an Event Grid topic.

As an example, you may want to make absolutely sure you receive a blob created notification. The blob trigger we just created is at-most-once, which means that in some circumstances you may not receive the event.

Let's improve our resilience by adding a Storage Queue and an Event Grid instance to the Azure build target we created earlier. Again, this is just a couple of lines of code.

Whilst Queue and Blob triggers work locally with the emulator, Event Grid does not. This means you can only test it 'for real' in Azure.

    let storage = storageAccount {
        //... other stuff
        add_queue "blob-created-events"
    }

    let events = eventGrid {
        topic_name "blob-created"
        source storage
        add_queue_subscriber storage "blob-created-events" [ SystemEvents.Storage.BlobCreated ]
    }

    let deployment = arm {
        //... other stuff
        add_resource storage
        add_resource events
    }

Now we can add a handler for the messages to our WebJobs class.

    member this.QueueMessageReceived ([<QueueTrigger("blob-created-events")>] blobCreatedEvent:string) = task {
        // You could use the FSharp JSON type provider to deserialise this event easily.
        printfn "Blob created queue message received: %s" blobCreatedEvent
    }
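
The message body is the Event Grid event serialised as JSON; in the BlobCreated schema the blob's address lives under data.url. As a rough sketch (assuming the message reaches your handler as that raw JSON), here is an expanded version of the handler using System.Text.Json - or use the type provider suggested in the comment above:

    // An expanded version of the handler above - assumes the message body is the
    // raw Event Grid JSON. Add 'open System.Text.Json' at the top of the file.
    member this.QueueMessageReceived ([<QueueTrigger("blob-created-events")>] blobCreatedEvent:string) = task {
        use doc = JsonDocument.Parse(blobCreatedEvent)
        let blobUrl = doc.RootElement.GetProperty("data").GetProperty("url").GetString()
        printfn "Blob created queue message received for %s" blobUrl
    }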

After a configured number of failures and retries, the message will be delivered to a 'poison' queue which is your final chance to log it or handle it in some other way.

This queue has the same name as your main queue with "-poison" appended, and it is set up for you automatically - you just need to handle the messages.

Add this to your WebJobs class:

    member this.PoisonMessageReceived ([<QueueTrigger("blob-created-events-poison")>] blobCreatedEvent:string) = task {
        printfn "Blob message handler failed: %s" blobCreatedEvent
    }
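
The number of attempts before a message is considered poison is configurable. I believe this is exposed through the Storage extension's QueuesOptions, so a hedged sketch would be to register an options configuration alongside configureHost in your Server module:

open Microsoft.Azure.WebJobs.Host.Queues
open Microsoft.Extensions.DependencyInjection

// Chain this onto the hostBuilder inside configureHost. After this many failed
// attempts a message is moved to the poison queue (the SDK default is 5).
let configureQueueRetries (hostBuilder : IHostBuilder) =
    hostBuilder.ConfigureServices(fun _ctx services ->
        services.Configure<QueuesOptions>(fun (options : QueuesOptions) ->
            options.MaxDequeueCount <- 3) |> ignore)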

Conclusion

I hope you found this brief tour of WebJobs to be useful. They are powerful tools which can help you build resilient and scalable web applications.

The official docs are pretty difficult to wade through and tend to focus on C#, but as you have seen, everything works perfectly well with F#.

Linking Azure resources and applications together with Event Grid, Storage Queues and WebJobs is easy when you have great tools like Farmer, which lets you simply describe your infrastructure topology and takes care of the heavy lifting of provisioning in a versionable, reliable and repeatable way.