
Setting up a CI build, test and package pipeline in 5 steps

For Farmer, we wanted to set up a centralised build, test and package pipeline. I want to illustrate how much simpler this has become in Azure DevOps, thanks in large part to the fact that the dotnet SDK now ships with very good command-line tooling out of the box.

1. Build and Test

Creating a new project with the dotnet SDK is as simple as running dotnet new classlib -lang F#; calling dotnet build will implicitly run a restore (either NuGet or Paket, depending on your tool of choice) and then kick off a build of the solution or project in the current directory.
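As a sketch of that local workflow (the solution and project names here are illustrative, and you'll need the .NET SDK installed):

```shell
# Create a solution and an F# class library, and add the library to the solution
dotnet new sln -n MySolution
dotnet new classlib -lang F# -o src/MyClassLib
dotnet sln add src/MyClassLib

# Restore and build everything in the current directory
dotnet build
```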

Creating test projects with the dotnet SDK is also relatively easy. Out of the box, you have support for MSTest, NUnit and xUnit test projects. In our case, we initially opted for xUnit, via dotnet new xunit -lang F#. Importantly, this template comes with all the required package dependencies to work with the dotnet test command, which will run all unit tests.

We actually decided to migrate to Expecto, which offers some advantages over other test runners - one being that tests are composed from simple functions rather than discovered implicitly through attributes, which can simplify many aspects of test orchestration. Expecto fully supports dotnet test as well. Here's a sample dotnet test run:

Test run for C:\Users\Isaac\Source\Repos\Farmer\src\Tests\bin\Debug\netcoreapp3.1\Tests.dll(.NETCoreApp,Version=v3.1)
Microsoft (R) Test Execution Command Line Tool Version 16.5.0
Copyright (c) Microsoft Corporation.  All rights reserved.

Starting test execution, please wait...

A total of 1 test files matched the specified pattern.

Test Run Successful.
Total tests: 5
     Passed: 5
 Total time: 6.1264 Seconds

The dotnet CLI also supports adding project references: calling dotnet add reference MyClassLib from within the test project's directory will add a reference to MyClassLib. Similarly, you can add a project to a solution with dotnet sln add MyTestProject.
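In practice both commands take a path to the project file; a hypothetical layout might look like this (paths are illustrative):

```shell
# From within the test project's directory, reference the library under test
dotnet add reference ../MyClassLib/MyClassLib.fsproj

# From the solution root, add the test project to the solution
dotnet sln add src/Tests/Tests.fsproj
```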

2. Packaging

Creating a NuGet package is done with the dotnet pack command. This scans through the current project structure and creates NuGet packages as required. A small amount of up-front configuration is required in your project files so that NuGet creates the packages with the correct properties, e.g. here's a subset of Farmer's project file:

<PropertyGroup>
  <!-- General -->
  <AssemblyName>Farmer</AssemblyName>
  <Version>0.12.3</Version>
  <Description>A DSL for rapidly generating non-complex ARM templates.</Description>
  <Copyright>Copyright 2019, 2020 Compositional IT Ltd.</Copyright>
  <Company>Compositional IT</Company>
  <Authors>Isaac Abraham and contributors</Authors>

  <GenerateDocumentationFile>true</GenerateDocumentationFile>

  <!-- NuGet Pack settings -->
  <PackageId>Farmer</PackageId>
  <PackageTags>azure;resource-manager;template;dsl;fsharp;infrastructure-as-code</PackageTags>
  <PackageReleaseNotes>https://raw.githubusercontent.com/CompositionalIT/farmer/master/RELEASE_NOTES.md</PackageReleaseNotes>
  <PackageProjectUrl>https://compositionalit.github.io/farmer</PackageProjectUrl>
  <PackageLicenseExpression>MIT</PackageLicenseExpression>
  <PackageRequireLicenseAcceptance>true</PackageRequireLicenseAcceptance>
  <RepositoryType>git</RepositoryType>
  <RepositoryUrl>https://github.com/CompositionalIT/farmer</RepositoryUrl>

  <!-- SourceLink settings -->
  <IncludeSymbols>true</IncludeSymbols>
  <SymbolPackageFormat>snupkg</SymbolPackageFormat>
</PropertyGroup>
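With those properties in place, a local pack is a one-liner; assuming a layout like Farmer's, something along these lines:

```shell
# Produces the .nupkg, plus a .snupkg symbols package
# (because IncludeSymbols is true and SymbolPackageFormat is snupkg)
dotnet pack src/Farmer --configuration Release
```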

3. Tying into Azure DevOps

Performing a build, test and package is nowadays a relatively simple task in Azure DevOps, although you will need to get your hands a little dirty with YAML files. A file in your repository called azure-pipelines.yml will automatically be picked up by Azdo - here's a slightly simplified version of the Farmer one.

First, we set the trigger for what branch to run on, and what OS image to use for the pipeline:

trigger:
- master
pool:
  vmImage: windows-latest

Next, we specify a task to run all tests; this will have the side-effect of restoring and building the solution as well.

steps:
- task: DotNetCoreCLI@2
  displayName: 'Restore, Build and Test'
  inputs:
    command: 'test'

The DotNetCoreCLI@2 task is a built-in Azdo task, but it's essentially a wrapper around calling dotnet with different commands - in this case, test. It also automatically scans for test run outputs and surfaces them in Azdo.
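The task also accepts an arguments input that is passed straight through to the underlying dotnet call; as a hypothetical example, forcing a Release-configuration test run would look something like this:

```yaml
steps:
- task: DotNetCoreCLI@2
  displayName: 'Restore, Build and Test'
  inputs:
    command: 'test'
    arguments: '--configuration Release'
```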

4. Storing artifacts

I wanted to simply expose the generated package so that anyone can test it out quickly and easily. The first step is, after the tests have run, to package up the Farmer project into a NuGet package:

- task: DotNetCoreCLI@2
  displayName: 'Package'
  inputs:
    command: 'pack'
    packagesToPack: 'src/Farmer'
    configuration: Release
    versioningScheme: 'off'
    verbosityPack: 'Normal'

This also uses the DotNetCoreCLI Azdo task, but this time we pass in the pack command - just like we did locally.

Azdo comes with a handy artifact staging capability which we can use to expose the NuGet package; the pack task above drops its output into $(Build.ArtifactStagingDirectory) by default, which is where the publish step picks it up from:

- task: PublishBuildArtifacts@1
  displayName: Store NuGet Package
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'FarmerPackage-$(imageName)'
    publishLocation: 'Container'

5. Multi-targeting different OSes

One thing I wanted to do was run the build and unit tests on both Linux and Windows. This was important, because one of the things Farmer does is provide a simple API that wraps around the Azure CLI; however, there are subtle differences between Linux and Windows in the way that shelling out works in .NET, such as path handling. Luckily, it only took a few minutes to find a sample for this.

strategy:
  matrix:
    Linux:
      imageName: 'ubuntu-latest'
    Windows:
      imageName: 'windows-latest'
pool:
  vmImage: $(imageName)
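Putting all the pieces together, the simplified azure-pipelines.yml looks roughly like this:

```yaml
trigger:
- master

strategy:
  matrix:
    Linux:
      imageName: 'ubuntu-latest'
    Windows:
      imageName: 'windows-latest'

pool:
  vmImage: $(imageName)

steps:
- task: DotNetCoreCLI@2
  displayName: 'Restore, Build and Test'
  inputs:
    command: 'test'

- task: DotNetCoreCLI@2
  displayName: 'Package'
  inputs:
    command: 'pack'
    packagesToPack: 'src/Farmer'
    configuration: Release
    versioningScheme: 'off'
    verbosityPack: 'Normal'

- task: PublishBuildArtifacts@1
  displayName: Store NuGet Package
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'FarmerPackage-$(imageName)'
    publishLocation: 'Container'
```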

This now causes two builds to kick off every time we commit to the master branch of Farmer.

Seeing the build in action

You can see the latest Farmer CI builds here. Azdo provides us with a simple list of builds for each commit, which can be drilled into for the outputs of each stage.

Azdo also provides some pleasant analytical capabilities on top of unit tests.


Some of this is pretty standard in terms of the tasks required for a common CI/CD pipeline. However, the amount of code required is relatively small and, importantly, the commands run in CI and locally are essentially the same simple dotnet commands; there's no need to use complex wizards in a GUI to set this stuff up any more.

Have (fun _ -> ())

Isaac