Introduction
Azure Functions provides a new way to develop applications quickly and efficiently, without needing to manage the underlying infrastructure. Rather than writing applications which constantly poll for new events, we write code which is executed by a trigger. For example, when building an application which performs some background processing, we might traditionally create a console application which polls a queue for new messages. With Azure Functions, we instead simply write the code which handles a message; whenever a new message arrives, that code is reliably invoked with the message from the queue. Azure Functions isn't limited to processing events arriving from a queue - we can execute code whenever a blob is updated in Azure Storage, or on a fixed timer. It's even possible to configure our code to execute whenever an HTTP request arrives at a given endpoint.
The Azure Functions model also simplifies the process of connecting to other Azure services, and offers the ability to bind directly to objects which exist within those other services, such as rows in Table Storage. This makes Azure Functions an ideal candidate for the glue which is used to hold larger applications together.
Whilst the Functions runtime provides a wealth of features which simplify the development of Functions code, we also have a wide array of tools and libraries available to us in F# when we're working with Azure. In this post we'll take a look at how we can create Azure Functions which interact with Azure Storage through the Azure Storage Type Provider.
Creating a new Azure function application
Azure Functions in F#
The Azure Functions runtime provides support for multiple languages and a number of development techniques. In this post, we're going to focus on how we can develop Functions locally using F#. As you might hope (given that F# is a functional-first language), Azure Functions has first-class support for F#. In fact, we can create Azure Functions in F# using a couple of different techniques:
- We can create them using a script-based approach with F# .fsx files - this is common practice in the F# world.
- Alternatively, we can create a class library which contains the function definitions. This is then compiled as normal and hosted within Azure Functions.
In this case we'll use the class library approach, since it provides us with a few key benefits, including reduced start-up time and a simplified debugging experience.
Tooling
Before we can develop functions locally, we need to install the Azure Functions tooling. If you don't already have it installed, you can do so from npm by running the following command:
npm install -g azure-functions-core-tools
Next, create a new F# class library - this is where we'll write the code which powers our Azure Functions. Some of the Visual Studio tooling surrounding pre-compiled functions is still in flux, and so requires some manual configuration to get up and running:
Open the fsproj file in a text editor and modify the PropertyGroup for your Debug and Release builds by changing the start action to resemble the following:
<StartAction>Program</StartAction>
<StartProgram>$(APPDATA)\npm\node_modules\azure-functions-core-tools\bin\func.exe</StartProgram>
<StartArguments>host start</StartArguments>
Now when we hit Start in Visual Studio, we'll be able to run the function code as part of the Functions runtime.
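We're not tied to Visual Studio here, either. Assuming the tooling above is installed, the same runtime can be launched from a terminal by running the following from the root of the function application (the directory containing host.json):

func host start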
Organising Code
We've found that when working with pre-compiled functions, the best way to keep the code maintainable is to follow a similar structure to the script-based approach: have one subdirectory for each function within your function application. Each of these subdirectories then contains the code for the function as well as the JSON file which describes the function, as in the example layout below.
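As a sketch, a function application containing the single AnalyseImage function we'll build shortly might be laid out as follows - the file names here are illustrative:

ImageAnalyser
├── host.json
└── AnalyseImage
    ├── function.json
    └── AnalyseImage.fs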
Writing Functions with the F# Azure Storage Type Provider
Now that we've got a function application created locally, we can start to develop the functions which make it up.
Creating our first Azure Function
As part of our function application we'll have a single function which receives a message from a queue and then inserts the contents of that message into an Azure Storage table, transformed into a new structure. Whilst this is a relatively simple problem, it forms the basis of many of the background processing tasks we're likely to be performing. For example, the message could point to an image in Azure Storage which we want to process, extracting information to insert into a low-latency data store consumed by an API or consumer application.

Our function will be called AnalyseImage (although we won't actually be doing any image analysis here), and it will be invoked with a message from an Azure Storage queue. The skeleton code for the function is shown below.
module AnalyseImage

open Microsoft.Azure.WebJobs
open Microsoft.Azure.WebJobs.Host

type ImageAnalysisRequest =
    { User : string
      ImagePath : string }

type ImageProcessingResult =
    { FaceId : int32 option
      Text : string option }

let Run(message:ImageAnalysisRequest, log:TraceWriter) =
    async {
        // The real processing work will be filled in later in the post.
        return ()
    } |> Async.StartAsTask
Creating Bindings
Alongside this function definition, we need a function.json file which is used to bind the function to the runtime, exposing information about how and when the runtime should invoke the function. In our case, we only have a single trigger in the form of a queue message, and so our function.json file is relatively simple, requiring only the following:
{
  "scriptFile": "ImageAnalyser.dll",
  "entryPoint": "AnalyseImage.Run",
  "bindings": [
    {
      "name": "message",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "images-analysis",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
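When running locally with the core tools, the AzureWebJobsStorage connection named above is resolved from the app settings, which live in a local.settings.json file at the root of the function application. A minimal sketch with placeholder values (we'll also make use of the AzureStorageConnectionString setting later on):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureStorageConnectionString": "UseDevelopmentStorage=true"
  }
}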
The Azure Storage Type Provider
Now we're in a position to modify the code to actually insert the data from the message into the Azure Storage table. Whilst we could create a native binding in the Azure Function, we have other tools available to us in F#. One such tool is the Azure Storage Type Provider. Type providers are typically used in F# when we want to explore an existing data store to better understand the structure of its data.
However, the Azure Storage Type Provider recently added support for schema files which allow you to specify the types which will be stored in the Azure Storage table in the form of a JSON file; the type provider then deals with all of the boilerplate required when accessing these types from storage.
Before we can start using the type provider, we need to install it from NuGet using your client of choice. Below is the command required to install it with my preferred NuGet client, Paket.
paket add FSharp.Azure.StorageTypeProvider
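Depending on how you run Paket, you may also need to reference the package from the class library itself by listing it in that project's paket.references file:

FSharp.Azure.StorageTypeProvider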
Supplying a schema
We'll now create a schema for the types stored within the Azure Storage table. Schema files are especially useful with the Storage Type Provider when you have a well-known schema that you want to develop against, but don't necessarily know the connection string at compile time. Creating a schema is easy: we write a JSON file which lists the columns and their types. You can see an example schema file below, in which we've created a number of columns to store a textual representation of the data within each image. We can use many of the types we're familiar with from F#, although the selection is limited by the types available in Azure Table Storage.
{
  "Images": {
    "Path": {
      "Type": "string"
    },
    "Format": {
      "Type": "string"
    },
    "FaceId": {
      "Type": "int32",
      "Optional": true
    },
    "Text": {
      "Type": "string",
      "Optional": true
    }
  }
}
Assuming this file is stored in the same directory as the project with the name TableSchema.json, we're able to use it with the type provider as follows (assume the processImage function has been implemented to return some data about the image).
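Since processImage is domain-specific, we won't dwell on it here; a purely hypothetical stub which satisfies the types might look like this:

// Hypothetical stub - a real implementation would call into an
// image analysis service and build the result from its response.
let processImage (user : string) (imagePath : string) : ImageProcessingResult =
    { FaceId = None
      Text = Some (sprintf "Placeholder analysis of %s for %s" imagePath user) }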
open System
open System.IO
open FSharp.Azure.StorageTypeProvider
open FSharp.Azure.StorageTypeProvider.Table

type TableSchema = AzureTypeProvider<tableSchema = "TableSchema.json">

let Run(message:ImageAnalysisRequest, log:TraceWriter) =
    async {
        let storageConnectionString =
            Environment.GetEnvironmentVariable("AzureStorageConnectionString")
        let processedImage : ImageProcessingResult =
            processImage message.User message.ImagePath
        let fileExtension = Path.GetExtension message.ImagePath
        let entity =
            TableSchema.Domain.ImagesEntity(Partition message.User,
                                            Row message.ImagePath,
                                            message.ImagePath,
                                            fileExtension,
                                            processedImage.FaceId,
                                            processedImage.Text)
        let! insertResult =
            TableSchema.Tables.Images.InsertAsync(entity,
                                                  connectionString = storageConnectionString)
        match insertResult with
        | SuccessfulResponse (entityId, statusCode) ->
            log.Info(sprintf "Successfully processed image %s" message.ImagePath)
        | BatchError (entity, statusCode, errorCode)
        | EntityError (entity, statusCode, errorCode) ->
            log.Error(sprintf "Unable to process image %s" message.ImagePath)
        | BatchOperationFailedError entity ->
            log.Error(sprintf "Unable to process image %s" message.ImagePath)
    } |> Async.StartAsTask
As we can see in the code above, we first create the type provider's types by passing in the table schema file. That's all that's needed to get the type provider up and running. Since all of this is done offline, we don't need to provide details of the Azure Storage account we'll run against; instead we simply declare the overall structure of the table. From here we just implement the body of the function we sketched out above.
F# and Azure Interop
We're able to use async within Azure Functions as long as we make sure to start the async workflow as a Task by calling Async.StartAsTask. Note that within the workflow, we retrieve a value stored in App Settings by accessing the environment variables - this is how we access the Azure Storage connection string.
Type Provider usage
We call a domain-specific function to perform some processing work, which returns some data about the image. Now we can start to use the type provider. We first create an instance of ImagesEntity - a type created by the type provider which represents a row within the given table. In addition to the columns we defined earlier in our JSON file, the entity takes another two values which represent the Partition and Row keys required by an Azure Storage table. The next step is simply to insert into the table using the generated insert function; it's only at this point that we need to supply the connection string to the type provider. Finally, we're able to match on the result of the insert operation, which has been mapped to a more idiomatic F# type, allowing us to see whether the operation was successful or not.
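The generated types work for reads as well as inserts. As a sketch (the exact overloads may vary between versions of the type provider), we could retrieve the row we've just written like this:

// Get returns an option, so we can see whether the row exists.
let savedImage =
    TableSchema.Tables.Images.Get(Row message.ImagePath,
                                  Partition message.User,
                                  connectionString = storageConnectionString)
match savedImage with
| Some image -> log.Info(sprintf "Stored image entity for %s" image.Path)
| None -> log.Error("Entity not found")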
Summary
With Azure Functions and F# we get an extremely simple approach to building event-driven services.

We don't need to worry about all of the supporting infrastructure which might complicate matters or cause issues. In the example we saw here, we have a background process which runs only on demand, whenever we receive a message on a queue. We don't need to worry about maintaining the virtual machines the code runs on, and since the code only runs when a message is received, we're only billed for the time spent doing actual work.

In addition, the use of F# allows us to easily reason about our code, as well as use type providers which give us strong typing at compile time that wouldn't be possible in other languages. Whilst this example illustrates a relatively simple use-case that could also be implemented using the standard Azure Functions Table bindings, more complex situations often require explicit use of the Azure SDK - which is where the Type Provider is especially useful.