
Automated environments with Docker and the SAFE Stack

Ryan shows how you can save time and gain consistency with automated environment deployments by integrating Docker Compose into your FAKE build scripts.

I'm going to pick up from where Matt's previous blog left off. If you have followed along so far, you will

  • Be familiar with what Docker is and why you might want to use it
  • Have Docker Desktop installed
  • Have downloaded images of SQL Server and/or Azurite
  • Have created containers from the images and mapped the ports

This has provided you with a clean deployment of the applications which is easy to tear down or upgrade, leaving the host machine unaffected.

Whilst this is already an improvement over managing local installations, we can go further.

As it stands, we

  • Have to remember to start and stop the containers when we need to use them
  • Have only a single instance of each service, which has to be shared across applications

A popular next step is to automate the starting / stopping of containers and use a dedicated instance per application (or even per branch).
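
For context, the manual routine this automates looks something like the following each time you sit down to work, and again when you finish (the container names here are illustrative):

docker start sql-server azurite
docker stop sql-server azurite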

Docker provides a tool called Compose which allows you to define YAML files that describe a deployment, including

  • Services to be started
  • Variables they need
  • Ports which should be mapped

When developing SAFE Stack web apps, we use FAKE as our build tool. This allows us to define a set of build tasks (called 'Targets') and the dependencies between them, which can then be selectively executed.
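
To make that concrete, here is a minimal, illustrative sketch of two targets and a dependency between them (the target names and bodies are placeholders, not part of the template):

open Fake.Core
open Fake.Core.TargetOperators

Target.create "Clean" (fun _ -> printfn "Cleaning...")
Target.create "Run" (fun _ -> printfn "Running...")

// Declaring this dependency means "Clean" always executes before "Run"
"Clean" ==> "Run"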

By integrating Docker Compose into the FAKE script, we can get the behaviour we want.

For the following instructions I am going to assume you have made a new SAFE app using the template.

Compose file

Create a file at the root of your solution called compose.yml. Paste the following contents into it:

services:
  sql-server:
    image: mcr.microsoft.com/mssql/server
    container_name: sql-server-myAppName
    ports:
      - "1434:1433"
    environment:
      - ACCEPT_EULA=Y
      - SA_PASSWORD=yourStrong(!)Password
  azurite:
    image: mcr.microsoft.com/azure-storage/azurite
    container_name: azurite-myAppName
    ports:
      - "10000:10000"
      - "10001:10001"
      - "10002:10002"

Mapping SQL Server to the non-default host port 1434 isn't strictly necessary; however, it prevents clashes with LocalDb or any other local SQL Server instances you may have running.
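
With this mapping in place, your application's connection string should target port 1434 on the host. As a rough sketch (the database name is a placeholder, and newer SQL clients may also require TrustServerCertificate):

Server=localhost,1434;Database=myAppDb;User Id=sa;Password=yourStrong(!)Password;TrustServerCertificate=True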

Replace myAppName with the name of your application. This isn't strictly required, but container names must be unique, so if two apps used the same container names they would clash.

Alternatively, you may pass in the app name as an environment variable and append it dynamically, as sketched below. This can be powerful; however, if the variable isn't set for some reason, Compose will overwrite the existing container, so we are hard coding the name here for simplicity.
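
For illustration, variable interpolation in the compose file would look something like this, assuming an APP_NAME environment variable (the :-myAppName part supplies a fallback when it is unset, which guards against the overwriting problem just mentioned):

services:
  sql-server:
    container_name: sql-server-${APP_NAME:-myAppName}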

FAKE Target

Open Build.fs at the root of the solution and paste in the following target:

let docker = createProcess "docker"

Target.create "StartServices" (fun _ ->
    // Fire and forget: start the services in the background so the build can
    // continue on to the Run target while "docker compose up" does its work.
    // The "." argument is the working directory, i.e. where compose.yml lives.
    async { runParallel [ "Docker Services", docker "compose up" "." ] } |> Async.Start
)

The runParallel function runs processes with a name tag and colourised output in the console, so we make use of it here even though we only have a single process to execute.
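
Both createProcess and runParallel are helpers that ship with the SAFE template. For reference, createProcess is roughly the following, built on FAKE's CreateProcess API (simplified here):

let createProcess exe arg dir =
    CreateProcess.fromRawCommandLine exe arg   // e.g. "docker" with "compose up"
    |> CreateProcess.withWorkingDirectory dir  // "." means the solution root
    |> CreateProcess.ensureExitCode            // fail the build if the process fails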

Finally, add the following dependency at the bottom of the same file:

"StartServices"
    ==> "Run"
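
Depending on your template version, the dependencies may be gathered into a single list near the bottom of Build.fs, in which case the new entry slots in alongside the existing ones, roughly like so (the existing chain shown is illustrative):

let dependencies = [
    "Clean" ==> "InstallClient" ==> "Run"
    "StartServices" ==> "Run"
]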

That's it. If you now execute...

dotnet run

...you should see the containers created if they don't already exist, or simply started up again if they do.

If you kill the application in the console using Ctrl+C, you should see the containers shut down.
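
If you ever want to remove the containers entirely rather than just stopping them, you can run

docker compose down

from the root of your solution, which also removes the network Compose created for them.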

Conclusion

At first, Docker can seem like a solution looking for a problem: what you have works, so why change it? That's certainly how I felt to some degree.

Once I realised that it gave me quick installations that were easy to manage, I enjoyed using it in my local development workflow.

This naturally led to new questions, such as 'how do I integrate it into my build pipeline?' (Stay tuned for that one!)

Shortly afterwards, I worked on an application that integrated with a remote team's API, still heavily in development and using a different language.

Docker allowed them to publish images, and allowed us to easily pull and integrate new versions of the software and its environment as a complete standalone package, both locally and in production. It was a great experience.

I encourage you to start pulling at the thread as there is much to be explored!