Practical .NET

Dealing with Databases (and Data) in Docker

There are lots of ways to handle databases in a containerized environment like Docker. Here's how to create SQL Server in a container, how to load it with the data you want and how to integrate that container into whatever project needs it. Be warned: Some PowerShell is used.

In a previous article, I showed how to set up two projects in a solution so that, when you press F5 to start debugging, those projects would start in separate Docker containers. Since one of the projects was a Web Service called from the other project (an ASP.NET MVC Core application), I also showed how to call a Web Service living in one container from an application in another container.

However, what I ignored was how either of those projects would retrieve data from a database. There's more to this than you might hope because I have several criteria for working with SQL Server in a "containerized" environment. I want:

  • A container for the database and database engine so that I can deploy it, manage it and scale it separately from the other parts of my application
  • A database pre-loaded with my data
  • The ability to reload the data in the database when I want (either because I've updated my test data or I want to restore the test data to its original state)

The simplest solution that meets all these criteria, I've decided, is to create my own SQL Server Docker image that I can add to any project that needs it.

Really: This is the simplest solution.

To do that, I need to use a series of PowerShell commands. I could just enter those into Visual Studio's Package Manager Console window. However, I want to be able to recreate my database container with new data whenever I want. To support that, I use the PowerShell ISE editor to create a PowerShell script of my commands. When I need to refresh my data, I just rerun the script.

Creating the Initial Image
I don't have to start creating my image from scratch, though: Microsoft provides images for all the various versions of SQL Server. I chose to use the one at microsoft/mssql-server-linux with the tag :latest -- this gets me the latest version of SQL Server Developer Edition.

My first PowerShell command is to tell Docker to pull that image and keep it handy on my computer:

docker pull microsoft/mssql-server-linux:latest

Be prepared: The first time that you run this command, it will take a while to download the image. If you run this command again, however, the command will just make sure that your local version is up-to-date and won't download the image again.
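If you want to confirm that the image is now cached on your machine (not something the script needs, just a sanity check), you can list it:

docker images microsoft/mssql-server-linux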

The next step is to start up the image and have it run as a container. Because this container holds an instance of SQL Server, I have several configuration options to set. I must:

  • Accept SQL Server's EULA and set the system administrator's password
  • Set a connection port. SQL Server listens on port 1433 inside the container, so I must map that port to a port on my host (lacking originality, I just tie the container's internal port 1433 to port 1433 on the host)
  • Assign the container (not the image) the name custdb
  • Detach the container to run as a background process

Here's the single command that does all of that:

docker run -e "ACCEPT_EULA=Y" `
   -e "SA_PASSWORD=<strong password>" `
   -p 1433:1433 `
   --name custdb -d microsoft/mssql-server-linux:latest

You need to make sure that whatever password you use meets SQL Server's strong password requirements. If it doesn't, the docker run command itself will appear to succeed, but SQL Server will shut down (and the container will stop) almost as soon as it starts.
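If the container doesn't seem to be running (a weak password is the usual culprit), you can check with two standard Docker commands. The first lists stopped containers as well as running ones; the second shows SQL Server's startup output:

docker ps -a
docker logs custdb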

This is, by the way, not the ugliest command that will be required in this script.

Copying in Data
I also decided that the easiest way to load data into my "containerized" SQL Server instance was to create a backup from some existing database that held the data I wanted. I could then restore that data into my "containerized" SQL Server. This also positions me to load different datasets from different backups.

After using SQL Server Management Studio to create the backup, the next step in my script is to create a folder in my custdb container to hold the backup file. That's this command:

docker exec -it custdb mkdir /backups

Now I have to copy the backup file I created (CustomerData.bak) into that folder in my custdb container. I use PowerShell's cd command to switch to the folder holding the backup file (which keeps the path in the copy command short) and then copy the file:

cd c:\MyProject\BackupFiles
docker cp CustomerData.bak custdb:/backups
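If you want to confirm that the file arrived before moving on (strictly optional), you can list the folder inside the container:

docker exec custdb ls /backups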

The next command is, in fact, the ugliest command in my script. It runs the sqlcmd utility inside my custdb container and uses it to execute SQL Server's RESTORE command. The MOVE clauses must reference the logical file names of the database the backup was taken from (if you don't know them, RESTORE FILELISTONLY will list them). That RESTORE command does go on:

docker exec -it custdb /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "<strong password>" `
-Q `
@"
 RESTORE DATABASE CustomerData FROM DISK = '/backups/CustomerData.bak'
         WITH MOVE 'CustomerData' TO '/var/opt/mssql/data/CustomerData.mdf',
              MOVE 'CustomerData_log' TO '/var/opt/mssql/data/CustomerData_log.ldf'
"@

Storing the Image
I now have a container with the data that I want to use. However, the tool I use in Visual Studio to build my application is Docker Compose, which wants to pull an image from a registry. To support this, my next step is to run a local registry in a container. Fortunately, Docker provides an image of a registry-in-a-container that I can just start running.

This command starts the image (downloading it, if necessary), maps the registry's port 5000 to port 5000 on my host, tells Docker to restart the container if it shuts down and (finally) assigns it the name MyLocalRegistry:

docker run -d -p 5000:5000 `
          --restart=always `
          --name MyLocalRegistry registry:2

The last line in my PowerShell script creates an image from my custdb container and tags it for MyLocalRegistry under the name custdb-image:

docker commit custdb MyLocalRegistry:5000/custdb-image
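One caveat: docker commit only creates and tags the image on my machine. That's enough for Docker Compose to find it locally, but if you want the image actually stored in the registry container (so that another machine, or a cleaned-up machine, can pull it), you also have to push it. On a single machine the image is typically tagged and pushed as localhost:5000/custdb-image; the MyLocalRegistry name only works if it resolves to the machine running the registry:

docker push MyLocalRegistry:5000/custdb-image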

Now, when I want to load new data into my SQL Server container, I just create a new backup, drop it into the right folder and rerun my script. I should delete the existing containers and registry first, so my script begins with these commands:

docker stop custdb
docker rm custdb
docker stop MyLocalRegistry
docker rm MyLocalRegistry
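Those commands only remove the containers. If I also want to discard the image I committed on the previous run, I can add one more line (if Docker complains that the image is in use, remove whatever container Compose built from it first):

docker rmi MyLocalRegistry:5000/custdb-image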

Integrating with the Project
Now that I have this image, I can integrate it into any project that needs it. As I showed in my earlier article, Docker creates containers based on information in Dockerfiles.

My first step is, therefore, to add a Dockerfile to my Visual Studio solution. To do that, at the top of Solution Explorer, I right-click on the solution and select Add | New Item. From the Add New Item dialog, I select text file and make up a name for my Dockerfile. I used CustDBDocker.txt (while the convention is for Dockerfiles to have the name Dockerfile with no extension, you can use any name you want and give the file an extension). All I have to put in CustDBDocker.txt is a FROM command that refers to my image in my local repository:

FROM MyLocalRegistry:5000/custdb-image

To have Docker Compose create a container from my Dockerfile, I need to reference the Dockerfile from my solution's docker-compose.yml file (see that previous article for how to get Visual Studio to create the docker-compose.yml file for you). For my SQL Server container, I add this block, naming my new container custdb and referencing both the image and my Dockerfile:

custdb:
       image: MyLocalRegistry:5000/custdb-image
       build:
         context: .
         dockerfile: CustDBDocker.txt

One warning: A docker-compose.yml file is a YAML file and, as a result, indentation is critical. The image and build lines must line up (and be indented from custdb), for example, and so must the context and dockerfile lines (which must be indented from build). The custdb line must line up with the names of the other blocks in the file.
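If it helps to see the block in context, here's a sketch of how it might sit inside the generated file. The webapp service name is just a placeholder for whatever services the previous article's docker-compose.yml already defines; only the custdb block is new:

services:
  webapp:            # placeholder name; keep whatever Visual Studio generated here
    build: .
  custdb:
    image: MyLocalRegistry:5000/custdb-image
    build:
      context: .
      dockerfile: CustDBDocker.txt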

Now, when I press F5 (using the docker-compose project I created in my last column as my startup project), my database container will also start up with my application and Web Services containers.

I have only one change left to make: I need to rewrite the connection strings in the projects that use the database to aim them at my new database container. That requires two changes:

  • The server name in the connection string now becomes the container name
  • I use the sa userid with the password I set in my PowerShell script

The new connection strings look like this:

server=custdb;Initial Catalog=CustomerData;User ID=sa;Password=<strong password>;MultipleActiveResultSets=True
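Where that string goes depends on how each project reads its configuration. If your projects follow the typical ASP.NET Core pattern of keeping connection strings in appsettings.json, the change looks something like this (the CustDb key name is just a placeholder for whatever name your code already reads):

{
  "ConnectionStrings": {
    "CustDb": "server=custdb;Initial Catalog=CustomerData;User ID=sa;Password=<strong password>;MultipleActiveResultSets=True"
  }
}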

Between this column and the previous one, you now have all the tools you need to run something like a real-world, modern application in Docker. And you're also now positioned to release any (or all) of these containers to production in any combination that makes sense to you.

Welcome to Docker.

About the Author

Peter Vogel is a system architect and principal in PH&V Information Services. PH&V provides full-stack consulting from UX design through object modeling to database design. Peter tweets about his VSM columns with the hashtag #vogelarticles. His blog posts on user experience design can be found at http://blog.learningtree.com/tag/ui/.
