Database deployment is a fundamental part of the continuous delivery cycle.
We need to be able to make changes to our schema and deploy it to multiple environments with as little friction as possible.
This allows us to react to change and make updates quickly and efficiently.
In this article we will show how you can manage your SQL database schema in Visual Studio and then automatically deploy it to the cloud.
We are assuming that you are working with an app created with the SAFE stack template, and therefore have a FAKE build script configured.
However, if you have another kind of web app, much of the advice is the same and so should still be useful.
Creating a Database project
The first thing we will need to do is install SQL Server Data Tools for Visual Studio (SSDT).
This will allow us to add a Database project to our solution.
Database projects allow us to create and manage our schema without a live connection to the database, using the table designer and T-SQL.
They provide IntelliSense and help us to focus on data relationships, automatically creating the scripts required to apply any changes.
If you prefer to work with SQL Server's visual designer, you can use schema compare to synchronise your changes.
You can also track all changes to the schema in source control.
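For illustration, each table in a database project is just a .sql file containing its CREATE statement. A hypothetical Customer table might look like this:

CREATE TABLE [dbo].[Customer]
(
    [Id] INT NOT NULL PRIMARY KEY IDENTITY,
    [Name] NVARCHAR(100) NOT NULL,
    [Email] NVARCHAR(256) NULL
)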
The easiest way to add SSDT to Visual Studio is by opening the VS Installer, choosing to modify your installation, and selecting the Data storage and processing workload.
Tools
We will need to install Fake.Sql from NuGet:
dotnet paket add Fake.Sql -g Build
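If the command succeeds, your paket.dependencies file should gain an entry along these lines in its Build group:

group Build
    nuget Fake.Sql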
Local deployment
Deployment to a local database is quite simple, especially if you are familiar with FAKE.
Build
To create a deployable .dacpac schema file, we just need to build our database project.
To do this we will add a new Target to our FAKE build script:
let dbProjectPath = @"src\Database"
let dbOutputPath = "output"
// Path to the built .dacpac. The file name follows the database project name,
// which we assume here is Database.sqlproj - adjust if yours differs.
let dbPath = sprintf @"%s\%s\Database.dacpac" dbProjectPath dbOutputPath

Target.create "BuildDb" (fun _ ->
    MSBuild.build
        (fun buildParams -> { buildParams with MSBuildParams.Properties = [ ("OutputPath", dbOutputPath) ] })
        dbProjectPath
)
Deploy
Once we have built our .dacpac file, we will use another Target, DeployLocalDb, to deploy our schema using SqlPackage:
open Fake.Sql

let dbName = "MyDatabase"

Target.create "DeployLocalDb" (fun _ ->
    // Deploy the freshly built .dacpac to the local SQL Server instance.
    let connectionString = sprintf @"Data Source=(localdb)\MSSQLLocalDB;Database=%s;Integrated Security=True" dbName
    SqlPackage.deployDb (fun args -> { args with Source = dbPath; Destination = connectionString; Timeout = Some 120 }) |> ignore
)
Dependencies
Finally, we need to update our Target dependencies at the bottom of the file.
We want a Deploy to require a Build, and a Build to require a Clean.
"Clean"
==> "BuildDb"
"BuildDb"
==> "DeployLocalDb"
We haven't combined these into one dependency chain as the build / clean part will be used again for cloud deployment below.
That's it. Now if you run dotnet fake build -t DeployLocalDb, your database should be created for you with all the required tables.
An easy way to confirm this is to open the SQL Server Object Explorer window, expand your localdb instance and look in its Databases folder.
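If you prefer to check from code, here is a minimal F# script sketch - assuming the Microsoft.Data.SqlClient package and the MyDatabase name used above - which lists the tables in the freshly deployed database:

#r "nuget: Microsoft.Data.SqlClient"
open Microsoft.Data.SqlClient

let checkConnectionString =
    @"Data Source=(localdb)\MSSQLLocalDB;Database=MyDatabase;Integrated Security=True"

let listTables () =
    use connection = new SqlConnection(checkConnectionString)
    connection.Open()
    use command = new SqlCommand("SELECT name FROM sys.tables", connection)
    use reader = command.ExecuteReader()
    // Print the name of every user table in the database.
    while reader.Read() do
        printfn "%s" (reader.GetString 0)

listTables ()

Save it as CheckDb.fsx and run it with dotnet fsi CheckDb.fsx.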
Cloud deployment with Azure and Farmer
Deploying your database to the Azure cloud using Farmer is a little more involved, but when you consider how much you are achieving with so few lines of code it is pretty impressive.
The process is in essence quite similar to the local deployment described above - in fact we will use the same BuildDb target to create our .dacpac file.
Azure
If we want to deploy our schema to the cloud, we will need to provision the resources to host it.
In the case of Azure SQL, this consists of two parts:
- A database server
- A database instance on that server
Farmer
Rather than create these resources by hand in the portal, you can automate the process by defining them in an Azure Resource Manager (ARM) template and deploying it using the Azure CLI.
This is, however, a rather laborious process as ARM templates are extremely verbose and require a fair degree of understanding of the Azure infrastructure to construct.
For this reason, we use Farmer to create and deploy our ARM templates. It provides a concise, easy-to-learn F# DSL with sensible defaults configured for you.
The SAFE template comes with a Farmer script which you can execute using the Azure FAKE target.
By default, this creates a simple web app and adds it to a new resource group.
We will edit this target to include the other resources we require.
SQL Server / SQL database
The first thing we will define is a sqlServer resource, which will also contain the database that we want to create.
The password we wish to use for the database is not written down here, as it would end up in version control.
Rather, it is loaded from an environment variable on the machine that is doing the deployment.
let dbServerName = "DbServer"
let dbUsername = "DbAdmin"
let dbPassword () = Environment.environVarOrFail "DbPassword"

Target.create "Azure" (fun _ ->
    let database = sqlServer {
        name dbServerName
        admin_username dbUsername
        enable_azure_firewall
        add_databases [
            sqlDb { name dbName }
        ]
    }
...
WebApp
Next, we will update the existing webApp resource to store the database connection string generated by the sqlServer in our app settings.
This will allow us to load it into our application at runtime and use it to access the database.
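For instance, App Service exposes app settings to the running application as environment variables, so a minimal sketch of reading the connection string at runtime - using the DbConnectionString key we configure below - could look like this:

open System

// App settings (and, later, resolved Key Vault references) are exposed
// to the running app as environment variables by App Service.
let dbConnectionString =
    Environment.GetEnvironmentVariable "DbConnectionString"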
By default, App settings are visible in plain text in the Azure portal. In a production app it is always recommended that you store your connection strings and other sensitive info in a KeyVault.
You can then directly load them into your application at runtime.
Creating a KeyVault can also be done with Farmer. The simplest way to store all your resource-derived settings in a KeyVault is to add use_keyvault to the webApp.
This will provision a KeyVault, store the connection string in it and load it at runtime automatically using Key Vault references.
See my previous blog about secrets and configuration for a deeper dive on the topic.
You will need to be on at least Farmer version 1.2.0 to use use_keyvault. You can update by running dotnet paket update farmer -g build.
Once the update is complete, build the application to make the new keyword show up.
...
let dbConnStringKey = "DbConnectionString"

let web = webApp {
    name deployName
    use_keyvault
    setting dbConnStringKey (database.ConnectionString dbName)
    zip_deploy artifactDir
    depends_on database
}
...
Resource Group
Underneath that, we will now update the arm resource to include our database.
...
let deployment = arm {
    location Location.WestEurope
    add_resource web
    add_resource database
}
...
Parameters
Finally, when deploying we will need to pass in the database password that we wish to use.
...
let dbPasswordKey = sprintf "password-for-%s" dbServerName
deployment
|> Deploy.execute deployName [dbPasswordKey, dbPassword ()]
|> ignore
)
Deploy
Now we need to create a new target, DeployDb, which will apply our schema to the database in Azure. This is very similar to the local version, but uses a different connection string built from the deployed database details.
Target.create "DeployDb" (fun _ ->
let connectionString = sprintf "Server=tcp:%s.database.windows.net,1433;Initial Catalog=%s;Persist Security Info=False;User ID=%s;Password=%s;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" dbServerName dbName dbUsername (dbPassword())
SqlPackage.deployDb (fun args -> { args with Source = dbPath; Destination = connectionString; Timeout = Some 120 }) |> ignore
)
Dependencies
As before, we need to update the Target dependency chain.
Just add the following:
"BuildDb"
==> "DeployDb"
and update the Azure chain like so:
"Clean"
==> "InstallClient"
==> "Bundle"
==> "Azure"
==> "DeployDb"
Now when you run dotnet fake build -t DeployDb, the database resource will get deployed, and then your schema will be built and applied to it.
Don't forget to set an environment variable with the key "DbPassword" before you run this target!
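In PowerShell, for example, that could look like this (with a password of your own choosing):

$env:DbPassword = "<your password>"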
Farmer will launch the Azure CLI tool in order to communicate with Azure. If this is the first time that you have run the tool, you will be required to log in.
Conclusion
Setting up automated infrastructure may not immediately seem as exciting as jamming on code, but if you spend some time at the start of a project setting it up properly, it will make the whole process so much more enjoyable.
In terms of business, it is an essential part of our job as modern software engineers. It empowers our stakeholders to tailor their products to a changing market and ensures our users get regular updates and improvements to their services.
As you can see, it needn't be especially elaborate or complex, especially when using great tools such as Farmer and FAKE.