July 2017

Volume 32 Number 7

[Data Points]

On-the-Fly SQL Servers with Docker

By Julie Lerman

In last month’s column (msdn.com/magazine/mt809115), I used the Visual Studio Code mssql extension to interact with an Azure SQL Database—from my MacBook Pro! I love the cross-platform capability of Visual Studio Code, but I was still depending on Azure to provide me with some flavor of SQL Server. I also mentioned that it is indeed possible to run SQL Server without depending on the cloud or Windows, thanks to SQL Server for Linux and a Docker image that’s readily available to drop onto your machine. I’ll pause if the existence of SQL Server for Linux is new information for you and you just need a moment.

OK, I hope you’ve recovered. The Linux version of SQL Server is able to run in a Linux-based Docker container and you can use that container anywhere that Docker is supported.

Running SQL Server in a container is really useful for developers because it makes it quick and easy to spin up a SQL Server instance. You can run instances of different SQL Server versions side by side. You can run a container for a while and then, when you stop and delete it, the data all goes away and you can start another clean one. If you prefer, you can persist the database files, yet still start and stop the container only when you actually need SQL Server to be running.
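To make that concrete, here’s a rough sketch of two such containers running side by side, each mapped to a different host port. I explain the docker run command and its parameters shortly; the container names, host ports and password here are just placeholders, and the second container could point at a different image tag if you wanted a different version:

docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=Passw0rd' -p 1401:1433 -d --name sqlone microsoft/mssql-server-linux
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=Passw0rd' -p 1402:1433 -d --name sqltwo microsoft/mssql-server-linux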

There are two flavors of SQL Server container images—Linux-based images and Windows-based images. Microsoft provides a Linux-based image for SQL Server 2017 Developer Edition and three Windows-based images: SQL Server 2016 SP1 Express Edition, SQL Server 2016 SP1 Developer Edition and SQL Server 2017 Evaluation Edition. All of these images are available on Docker Hub for free (dockr.ly/2rmSiWs). You can pull and run the Linux-based SQL Server image to create Linux-based containers wherever there’s a Docker Engine running, which could be on Linux, Windows or macOS. But you can run Windows-based containers only on Windows 10 Anniversary Update or higher or on Windows Server 2016—they can’t run on Linux or macOS. Keep in mind that Docker also runs on Azure and AWS, so you can move from development to production in the cloud, as well.

Although I’ve written a blog post about my first experiments with running SQL Server for Linux in Docker on a Mac (bit.ly/2pZ7dDb), I want to approach this from a different angle—the scenario where you want to share a pre-configured image along with a database. This allows developers to get SQL Server and the needed databases onto their machines very quickly, and it can even serve as part of an automated testing environment. I knew it could be done, but I was curious as to how, so I’ve worked through the basic process and will share it with you here.

I’ll begin by explaining how to get the base image of SQL Server for Linux up and running on your computer under Docker. My example will use Docker for Mac, but you can do the same with the other versions of Docker, as well. Be sure you have the correct version of Docker already installed and running on your computer, and that you set it to use at least 4GB of RAM from the host system. You can find more detailed setup information on my blog post referenced earlier.

In the command or terminal window, you can get the official image by executing:

Mac: sudo docker pull microsoft/mssql-server-linux
Windows: docker pull microsoft/mssql-server-windows

Once it’s installed, you can run the docker images command to see that Docker is aware of this image:

REPOSITORY                     TAG       IMAGE ID      CREATED        SIZE
microsoft/mssql-server-linux   latest    7b1c26822d    13 days ago    1.35 GB

Note that if you already have the base image, you can visit dockr.ly/2qTavYr to see if a newer one has been released. If so, you should pull it again. Next, I’ll use the docker run command to spin up this image as a container in order to interact with it:

docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=Passw0rd' -p 1433:1433 -d --name juliesqllinux microsoft/mssql-server-linux

When starting up an mssql-server-linux container the first time, you’re required to accept its license agreement, which you do using an environment variable:

-e 'ACCEPT_EULA=Y'

You also must include an environment variable for a password to accompany the default sa user. The password you create must consist of “at least 8 characters including uppercase, lowercase letters, base-10 digits and/or non-alphanumeric symbols.” Optionally, you can name the container instance. I’ll call mine juliesqllinux. I’ll also specify the port mapping for the host port to container port with -p, as well as a parameter to run this in the background: -d (for detach). If you don’t use the -d parameter, the instance will run in the foreground and you won’t get a prompt back at your terminal in order to continue executing commands. As usual, I learned this the hard way.

In response to the run command, Docker returns a unique container ID and returns your terminal back to a prompt. It does this in about one second. Think about how long it takes to install SQL Server onto your computer and get it configured. Then let me repeat: Spinning up the container took about one second.
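If you want more proof than that returned ID, a couple of everyday Docker commands are handy here: docker ps lists the running containers along with their ports and names, and you can stop and restart this particular container by name whenever you need it:

docker ps
docker stop juliesqllinux
docker start juliesqllinux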

Interacting Directly with the SQL Server from the Command Line

While my last column focused on using the mssql extension for Visual Studio Code to interact with the database, here I’ll use a command-line tool to quickly create a simple database, a table and some data, and then run a query to prove it’s all working. Although the Microsoft sqlcmd command-line utility is included in the image, I find it easy enough to use the copy installed directly on my OS. Sqlcmd is available for Windows (bit.ly/2qKnrmh) and macOS (bit.ly/2qZBS6G). The macOS version is new. While I’ll use it in this article, I’m also a fan of the cross-platform sql-cli (bit.ly/2pYOzey).
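If you’d rather try sql-cli, it installs through npm; as best I recall, installing and connecting look roughly like the following (double-check the flags against the package’s readme):

npm install -g sql-cli
mssql -s localhost -u sa -p Passw0rd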

When starting up sqlcmd, you need to specify the server name, user name and password as part of the command. Additionally, you can specify a particular database, although the utility will use master by default. See the sqlcmd for Windows link mentioned earlier for details on all of the command-line options.
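If you just want to fire off a single query without the interactive prompt, sqlcmd can also take the query on the command line with -Q and exit when it’s done; for example:

sqlcmd -S localhost -U sa -P Passw0rd -d master -Q "select name from sys.databases"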

From the command prompt, I’ll start up a sqlcmd command with the required parameters. This returns a numbered prompt so I can use it interactively:

→  ~ sqlcmd -S localhost -U sa  -P Passw0rd
1>

At the prompt I can start entering lines of TSQL, then a final line, Go, to execute. For example, I’ll retrieve a list of databases that already exist on the server:

1> select name from sys.databases
2> go
name
------
master
tempdb
model
msdb
(4 rows affected)
1>

Now, I’ll create a new database and execute that. Then I’ll use that new database, create a new table and add some data using the commands in Figure 1.

Figure 1 Adding Data to a New Table

1> create database juliedb
2> go
1> create table dbo.people (PersonId int Primary Key, Name nvarchar(50))
2> insert into people values (1,'julie')
3> insert into people values (2,'giantpuppy')
4> select * from people
5> go
(1 rows affected)
(1 rows affected)
PersonId    Name
----------- --------------------------------------------------
          1 julie
          2 giantpuppy
(2 rows affected)
1>

I find it a little clunky to work interactively with sqlcmd. Definitely have a look at the new cross-platform mssql-scripter tool (bit.ly/2pSNhoF) that was recently released as a preview.

The final command, select * from people, is there to perform a simple validation that the table and the data do indeed exist after running the other commands. Note that when you remove a Docker container, it’s gone completely and this database and data will disappear, as well. However, it’s possible to create separate data volumes that can persist the data files even if you destroy the container that’s running your SQL Server instance. I walked through the first baby steps of creating persistent data containers in the blog post I mentioned earlier, so I won’t repeat that here.
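If you do want the data to outlive the container but don’t need the full data-container approach, a simpler sketch is to map a named volume over /var/opt/mssql, which is where SQL Server for Linux keeps its database files (the volume name sqldata below is just a placeholder):

docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=Passw0rd' -p 1433:1433 -d -v sqldata:/var/opt/mssql --name juliesqllinux microsoft/mssql-server-linux

Remove that container and run another one with the same -v mapping, and the databases will still be there.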

Create a Custom Image That Creates Its Own Database

What I’m more interested in is the idea of creating an image that includes not only a running instance of SQL Server, but also a pre-created database for development and testing, something developers can grab and use quickly. Of particular interest to testers is the ability to have this image accessible for automated testing, where the server and database can be instantly available for a test run, then destroyed and recreated on the fly as needed.

Consider that when I first created the container from the Docker image, the master database was created for me. If you don’t include the -d (detach) parameter in the docker run command, you can see the many steps that were performed on the container as it starts up. Creating the master database is only one of those steps. So what you can do is create your own image based on the base mssql-server-linux image (or any base image), then, in the Dockerfile that goes along with your image, provide additional commands to perform any steps you need.

I’ll demonstrate by creating an image that will duplicate the steps I just ran in the terminal. These were the commands to create the new database and a new table and to insert a few rows of data.

I’ll need a folder to house the files I’ll be creating for my image. There will be a total of four because I’m going to separate specific tasks into different files to keep things organized:

  1. SqlCmdScript.Sql: This file will hold the TSQL script with the commands for creating the new database, table and data.
  2. SqlCmdStartup.sh: This is a bash file (like a batch file for Linux). It starts up the sqlcmd command-line tool and, as part of the command, runs the SqlCmdScript.Sql file. Remember that sqlcmd is also part of the base image.
  3. Entrypoint.sh: This is another bash file. It lists the non-Docker tasks that need to run, and its first task is to execute the SqlCmdStartup.sh file. Then it will start the SQL Server process.
  4. Dockerfile: This file (there’s no extension) is the definition of how to build the image and how to run containers from the image.

Here’s the file listing for SqlCmdScript.sql, the same commands I used earlier when I was working directly from the command line:

create database juliedb;
GO
use juliedb;
create table people (PersonId int Primary Key, Name nvarchar(50));
insert into people values (1,'julie');
insert into people values (2,'giantpuppy');
select * from people

Next is the SqlCmdStartup.sh. Again, this is where I start up the sqlcmd utility and tell it to run the script I just listed:

#wait for the SQL Server to come up
sleep 20s
#run the setup script to create the DB and the schema in the DB
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P Passw0rd -d master -i SqlCmdScript.sql
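That fixed 20-second sleep is the simplest thing that works, but it’s also the fragile part of the script. If it proves too short on a slower machine, one alternative, sketched here only as an idea rather than what I actually used, is to poll with sqlcmd until the server answers before running the script:

#wait until SQL Server answers, checking every 2 seconds for up to a minute
for i in {1..30}; do
  /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P Passw0rd -Q "select 1" && break
  sleep 2
done
#run the setup script to create the DB and the schema in the DB
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P Passw0rd -d master -i SqlCmdScript.sql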

The next file is entrypoint.sh, where I tell Docker what external processes to run. This is where I trigger the SqlCmdStartup bash script that runs sqlcmd, as well as the bash script inside the base image that starts up the SQL Server database. I combine the two commands using the & character:

#start the script to create the DB and data then start the sqlserver
./SqlCmdStartup.sh & /opt/mssql/bin/sqlservr

Notice that I’m running sqlcmd first. This is important and I struggled with it for a long time because of a misunderstanding. When the command Docker is running completes, Docker stops the container. Originally, I ran the server startup first and the sqlcmd second. But when Docker finished the sqlcmd, it decided its job was done and shut the container down. In contrast, when I run the server startup second, the server is a long-running process, so Docker keeps the container running until something else tells it to stop.

If, like me, you’re curious what’s in the sqlservr.sh script, take a look at bit.ly/2qJ9mGe, where I blogged about the file listing.

Finally, here’s the Dockerfile, which does a number of things:

FROM microsoft/mssql-server-linux
ENV SA_PASSWORD=Passw0rd
ENV ACCEPT_EULA=Y
COPY entrypoint.sh entrypoint.sh
COPY SqlCmdStartup.sh SqlCmdStartup.sh
COPY SqlCmdScript.sql SqlCmdScript.sql
RUN chmod +x ./SqlCmdStartup.sh
CMD /bin/bash ./entrypoint.sh

It first identifies the base image (mssql-server-linux), which, if not found on your machine, will be automatically pulled from Docker Hub. Then it sets the environment variables so I don’t have to do that in the docker run command. The Dockerfile then copies my two bash files and the SQL script file into the image. Next, it runs the chmod command, which permits my bash file to run inside the container. Finally, the CMD instruction tells Docker what to do when a new container is created from my image: call the entrypoint bash script, which will in turn run sqlcmd with my TSQL and then start up SQL Server.

Build the New Image

With all of this in place, it’s time to build my new image with the docker build command:

docker build -t julielinuximage .

I’m using just a simple set of parameters, although there’s a lot you can control when building images. I’m using only the -t parameter here, followed by the name to give the image; because I haven’t specified an explicit tag, Docker applies the default latest tag, though you can also add tags for versioning and other purposes. Finally, with the period at the end, I let Docker know that the Dockerfile I want to build from is in the current folder.
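If you do want an explicit version tag, it’s just part of the -t value, and docker tag can add more names to the same image afterward (the 1.0 below is only an example):

docker build -t julielinuximage:1.0 .
docker tag julielinuximage:1.0 julielinuximage:latest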

At this point, I have the image that I can publish somewhere to share with my team. Doing so requires a registry, but you don’t have to store your images on the Docker Hub. You can create a registry on your own network, use a public or private registry on Azure, or choose one of the other myriad options for hosting Docker images. However, for this article, I’ll continue to just work locally.
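For the record, sharing the image amounts to tagging it with your registry’s address and pushing it; this sketch uses a made-up registry host and repository name:

docker tag julielinuximage myregistry.example.com/myteam/julielinuximage
docker push myregistry.example.com/myteam/julielinuximage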

My next step is to run a container from my new image. Remember that because I put my environment variables in the Dockerfile, I don’t need to include them in the docker run command. I’ll again use the -d parameter so the container will run in the background and I can continue to use the same terminal window to work with the container:

docker run -d -p 1433:1433 --name juliesqllinux julielinuximage

This command specifies the port to run this on, the name to apply to the running container (--name juliesqllinux) and the name of the image to use (julielinuximage).

The docker ps command lets me confirm that it’s now running. I’m also running the Kitematic UI (included with the Docker client application installer), which shows the state of Docker containers and also displays logs. This means I can see not only all of the tasks accomplished by the sqlservr.sh script, but evidence of my juliedb database, tables and data being inserted. I can even see the results of the query at the end of my sql script to display the data from the people table.
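If you’re not running Kitematic, the same log output is available at the command line:

docker logs juliesqllinux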

Figure 2 shows the end of the logs after running this container.

Figure 2 The Final Lines of the Log Captured in Kitematic After Instantiating the Container with the Docker Run Command

Now, just like before, I can interact with the container using the command-line tools installed directly on my computer.

I have to start sqlcmd again, then I call a few commands to check that I have access to the juliedb database my image created. Figure 3 shows my entire interaction, with my commands in bold and the response in standard font. The commands shown are starting up the sqlcmd utility and connecting to the database server, listing the databases, using juliedb, listing its tables, querying the people table and then quitting out of the sqlcmd utility.

Figure 3 Interacting with Data in the Container That Pre-Created a Database

→ sqlcmd -S localhost -U sa  -P Passw0rd
1> select name from sys.databases
2> go
name
--------------------
master
tempdb
model
msdb
juliedb
(5 rows affected)
1> use juliedb
2> go
Changed database context to 'juliedb'.
1> select name from sys.tables
2> go
name
-----------------
people
(1 rows affected)
1> select * from people
2> go
PersonId    Name
----------- ----------------
          1 julie
          2 giantpuppy
(2 rows affected)
1> quit
→

Skip the Script, Just Provide the Database File

If you have a large database (or multiple databases) to share, you might prefer not to include all of the scripts for the schema and data. Another approach is to pre-create the database files, include those .mdf files with the image, and in the Dockerfile be sure to copy the files into the container and then run TSQL to attach the files to the server. SQL Server DBA Andrew Pruski wrote a great blog post about this approach, which you can find at bit.ly/2pUxQdP. I’ve also done this myself in the Pluralsight course I’m currently building about the mssql extension.
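The attach step itself is plain TSQL. As a minimal sketch, assuming the Dockerfile has already copied the files into the default data directory and using placeholder file names, it would look something like this:

create database juliedb
  on (filename = '/var/opt/mssql/data/juliedb.mdf'),
     (filename = '/var/opt/mssql/data/juliedb_log.ldf')
  for attach;
GO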

For Devs or DevOps

With these two approaches, you now have a simple means of sharing a pre-configured database along with a SQL Server across your team, allowing everyone to have a locally running server and a local copy of the database. But thanks to the container, nobody needs to install SQL Server on their computer or execute any scripts to get the necessary database set up for use. It’s frighteningly easy and fast.

I also mentioned using this for DevOps scenarios, perhaps with automated tests that require a database. I hope it’s not a big leap from what you’ve seen here to imagine having your build or test process spin up a container that has everything needed for your operation to quickly and easily interact with a SQL Server database. One example of this is in the MSSQL-Node-Docker demo app (bit.ly/2qT6RgK) created by Travis Wright, a program manager on the SQL Server engineering team. The demo was “created to show how SQL Server can operate in a DevOps scenario where an application developer can check in code to GitHub and then trigger a build in Red Hat OpenShift to deploy the changes automatically as pods (containers).”
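In shell terms, the core of such a step is little more than the commands from this article strung together; the container name and the fixed wait here are placeholders for whatever your build system actually does:

#spin up a disposable SQL Server that already contains the test database
docker run -d -p 1433:1433 --name citestsql julielinuximage
#give the server and the setup script time to finish (a polling loop would be more robust)
sleep 30
#...run the tests that need the database here, then tear everything down...
docker rm -f citestsql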

There’s certainly a lot more to learn about, and benefit from, the availability of SQL Server in Linux and Windows containers. It’s amazing to me that you can now “install” SQL Server almost anywhere and you don’t have to be a Microsoft platform developer to benefit from what is one of the most powerful relational databases on the planet. As somewhat of a newbie to Docker, I was curious about the ability to create an image that included my own database. And I’m always happy to share the results of my explorations. I learned a lot and hope that your curiosity isn’t just sated, but piqued enough to explore the possibilities even further.


Julie Lerman is a Microsoft Regional Director, Microsoft MVP, software team mentor and consultant who lives in the hills of Vermont. You can find her presenting on data access and other topics at user groups and conferences around the world. She blogs at thedatafarm.com/blog and is the author of “Programming Entity Framework,” as well as a Code First and a DbContext edition, all from O’Reilly Media. Follow her on Twitter: @julielerman and see her Pluralsight courses at juliel.me/PS-Videos.

Thanks to the following Microsoft technical experts for reviewing this article: Sanjay Nagamangalam and Travis Wright
Sanjay (@sanagama2) is a Principal Program Manager at Microsoft, where he helps create market-leading developer and management tools for the SQL Data Platform. Among other tools, his team is responsible for SQL PowerShell, SQL Server Management Studio (SSMS), SQL Server Data Tools (SSDT) and the new SQL developer experience in Visual Studio Code.

Travis Wright is a Principal Program Manager in the SQL Server engineering team currently focused on bringing SQL Server to Linux and Docker.  Previously, he was a Program Manager in the System Center and Windows Server teams bringing multiple new products to life.  Taking a break from Microsoft in between for two years, Travis ran a successful startup product development team.  He occasionally finds the opportunity to actually write code instead of PowerPoint slides and Word docs.