April 2019

Volume 34 Number 4

[Data Points]

EF Core in a Docker Containerized App

By Julie Lerman

I’ve spent a lot of time with Entity Framework and EF Core, and a lot of time working with Docker. You’ve seen evidence of this in many past columns. But, until now, I haven’t put them together. Because the Docker Tools for Visual Studio 2017 have been around for a while, I imagined it would be an easy leap. But it wasn’t. Perhaps it’s because I prefer to know what’s going on under the covers, and I also like to know and comprehend my options. In any case, I ended up sifting through information from blog posts, articles, GitHub issues and Microsoft documentation before I was able to achieve my original goal. I hope to make it easier for readers of this column to find the path (and avoid some of the hiccups I encountered) by consolidating it all in one place.

I’m going to focus on Visual Studio 2017, and that means Windows, so you’ll need to ensure you have Docker Desktop for Windows installed (dockr.ly/2tEQgR4) and that you’ve set it up to use Linux containers (the default). This also requires that Hyper-V be enabled on your machine, but the installer will alert you if necessary. If you’re working in Visual Studio Code (regardless of OS), there are quite a few extensions for working with Docker directly from the IDE.
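A quick sanity check I find helpful (not a required step): run docker version from a command prompt and confirm that the Server section reports an OS/Arch of linux/amd64, which tells you the engine is in Linux container mode:

docker version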

Creating the Project

I began my journey with a simple ASP.NET Core API. The steps to set up a new project to match mine are: New Project | .NET Core | ASP.NET Core Web Application. On the page where you choose the application type, choose API. Be sure that Enable Docker Support is checked (Figure 1). Leave the OS setting at Linux. Windows containers are larger and more complicated, and hosting options for them are still quite limited. I learned this the hard way.

Figure 1 Configuring the New ASP.NET Core API Project

Because you enabled Docker support, you’ll see a Dockerfile in the new project. Dockerfile provides instructions to the Docker engine for creating images and for running a container based on the final image. Running a container is akin to instantiating an object from a class. Figure 2 shows the Dockerfile created by the template. (Note that I’m using Visual Studio 2017 version 15.9.7 with .NET Core 2.2 installed on my computer. As the Docker Tools evolve, so may the Dockerfile.)

Figure 2 The Default Dockerfile Created by the Project Template

FROM microsoft/dotnet:2.2-aspnetcore-runtime AS base
WORKDIR /app
EXPOSE 80

FROM microsoft/dotnet:2.2-sdk AS build
WORKDIR /src
COPY ["DataApi/DataApi.csproj", "DataApi/"]
RUN dotnet restore "DataApi/DataApi.csproj"
COPY . .
WORKDIR "/src/DataApi"
RUN dotnet build "DataApi.csproj" -c Release -o /app

FROM build AS publish
RUN dotnet publish "DataApi.csproj" -c Release -o /app

FROM base AS final
WORKDIR /app
COPY --from=publish /app .
ENTRYPOINT ["dotnet", "DataApi.dll"]

The first instruction identifies the base image that will be used for creating the subsequent images and, ultimately, the container for your app:

microsoft/dotnet:2.2-aspnetcore-runtime

Then a build image is created, based on Microsoft’s SDK image (microsoft/dotnet:2.2-sdk). The build image is solely for building the application, so it needs the SDK, as well. A number of commands are executed on the build image to get your project code into the image and to restore needed packages before building the project.

The next image created will be used for publishing; it’s based on the build image. For this image, Docker will run dotnet publish to create a folder with the minimal assets needed to run the application.

The final image has no need for the SDK and is created from the base image. All of the publish assets will get copied into this image and an Entrypoint is identified—that is, what should happen when this image is run.

By default, the Visual Studio tools only perform the first stage for the Debug configuration, skipping the publish and final images, but for a Release configuration the entire Dockerfile is used.
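The Visual Studio tools drive this with Docker’s own multi-stage build support. As a rough sketch (these are illustrative commands, not necessarily the exact ones the tools issue), a Debug build can stop at the first stage with the --target flag, while a Release build runs the whole file:

docker build --target base -t dataapi:dev .
docker build -t dataapi:latest .

The --target flag tells Docker to stop once the named stage is built, which is why the intermediate publish and final images never appear for a Debug run.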

The various sets of builds are referred to as multi-stage builds, with each step focused on a different task. Interestingly, each command performed on an image, such as the six commands on the build image, causes Docker to create a new layer of the image. The Microsoft documentation on architecture for containerized .NET applications at bit.ly/2TeCbIu does a wonderful job of explaining the Dockerfile line by line and describing how it’s been made more efficient through the multi-stage builds.

For now, I’ll leave the Dockerfile at its default.

Debugging the Default Controller in Docker

Before debugging in Docker, I’ll first verify the app by debugging it in the ASP.NET Core self-hosted profile (which uses Kestrel, a cross-platform Web server for ASP.NET Core). Be sure that the Start Debugging button (the green arrow on the toolbar) is set to run using the profile matching the name of your project—in my case, that’s DataAPI.

Then run the app. The browser should open pointing to the URL https://localhost:5000/api/values and displaying the default controller method results (“value1,” “value2”). So now you know the app works and it’s time to try it in Docker. Stop the app and change the Debug profile to Docker.

If Docker Desktop for Windows is running (with the appropriate settings noted at the start of this article), Docker will run the Dockerfile. If you’ve never pulled the referenced images before, the Docker engine will start by pulling them down from Docker Hub, which may take a few minutes. You can watch the progress in the Build Output window. Then Docker will build any images following the steps in the Dockerfile, though it won’t rebuild any that haven’t changed since the last run. The final step is performed by the Visual Studio Tools for Docker, which call docker build and then docker run to start the container. As a result, a new browser window (or tab) will open with the same output as before, but the URL is different because it’s coming from inside the Docker container that’s exposing that port. In my case, it’s https://172.26.137.194/api/values. Alternate setups will cause the browser to launch using https://localhost:hostPort.
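If you ever want to look up that container IP address yourself rather than relying on the launched browser, docker inspect can report it. Here’s a minimal sketch, assuming the container ID or name that docker ps displays:

docker inspect --format "{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}" <container id or name>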

If Visual Studio Can’t Run Docker

I encountered two issues that initially prevented Docker from building the images. The very first time I tried to debug from Visual Studio targeting Docker, I got a message that, “An error occurred while attempting to run Docker container.” The error referenced container.targets line 256. This was neither informative nor helpful until I later realized that I could have seen the details of the error in the Build Output. After trying a variety of things (including a lot of reading on the Internet, yet still not checking the Build Output window), I ended up trying to pull an image from the Docker CLI, which prompted me to log in to Docker, even though I was already logged in to the Docker Desktop app. Once I did this, I was able to debug from Visual Studio 2017. Subsequently, logging out via the Docker CLI didn’t affect this and I could still debug. I’m not sure of the relationship between the two actions. However, when I completely uninstalled and reinstalled Docker Desktop for Windows, I was again forced to log in via the Docker CLI before I could run my app. According to the issue on GitHub at bit.ly/2Vxhsx4, it seems this is because I logged in to Docker Desktop for Windows using my e-mail address, not my login name.

I also got this same error once when I had disabled Hyper-V. Re-enabling Hyper-V and restarting the machine solved the problem. (For the curious, I needed to run a VirtualBox virtual machine for a completely unrelated task and VirtualBox requires Hyper-V to be disabled.)

What Is the Docker Engine Managing?

As a result of running this app for the first time, the Docker engine pulled down the two noted images from Docker Hub (hub.docker.com) and is keeping track of them. But building the Dockerfile also created other images, which Docker caches. Running docker images at the command line revealed the docker4w image used by Docker Desktop for Windows itself, the aspnetcore-runtime image pulled from Docker Hub, and the dataapi:dev image that was created by building the Dockerfile—that is, the image from which your application is running. If you run docker images -a to show hidden images, you’ll see two more images (those with no tags), which are the build and publish intermediate images created by the Dockerfile, as shown in Figure 3. You won’t see anything about the SDK image; according to Microsoft’s Glenn Condron, that’s “due to a quirk in the way Docker multi-stage build works.”

Figure 3 Exposed and Hidden Docker Images After Running the API

You can look at even more details about an image using the command:

docker image inspect [imageid]

What about the containers? The command docker ps reveals the container that the Docker Tools for Visual Studio 2017 created by calling docker run (with parameters) on the dev image. I’ve stacked the result in Figure 4 so you can see all the columns. There are no hidden containers.

Figure 4 The Docker Container Created by Running the App from Visual Studio 2017 Debug

Setting Up the Data API

Now let’s turn this into a data API using EF Core as the data persistence mechanism. The model is simplistic in order to focus on the containerization and its impact on your EF Core data source.

Begin by adding a class called Magazine.cs:

public class Magazine
{
  public int MagazineId { get; set; }
  public string Name { get; set; }
}

Next, you need to install three different NuGet packages. Because I’ll be showing you the difference between using a self-contained SQLite database and a SQL Server database, add both the Microsoft.EntityFrameworkCore.Sqlite and Microsoft.EntityFrameworkCore.SqlServer packages to the project. You’ll also be running EF Core migrations, so the third package to install is Microsoft.EntityFrameworkCore.Design.
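If you prefer the Package Manager Console to the NuGet UI, these commands will install the same three packages:

Install-Package Microsoft.EntityFrameworkCore.Sqlite
Install-Package Microsoft.EntityFrameworkCore.SqlServer
Install-Package Microsoft.EntityFrameworkCore.Design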

Now I’ll let the tooling create a controller and DbContext for the API. In case this is new for you, here are the steps:

  • Right-click on the Controllers folder in Solution Explorer.
  • Choose Add | Controller | API Controller with actions, using Entity Framework.
  • Select the Magazine class as the Model class.
  • Click the plus sign next to Data context class and change the highlighted portion of the name to Mag, so it becomes [YourApp].Models.MagContext, and then click Add.
  • Leave the default controller name as MagazinesController.
  • Click Add.

When you’re done, you’ll have a new Data folder with the MagContext class and the Controllers folder will have a new MagazinesController.cs file.
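The generated context is minimal. It should look something like the following sketch (your namespace will differ, and the tooling may use a singular property name such as Magazine):

public class MagContext : DbContext
{
  public MagContext(DbContextOptions<MagContext> options)
    : base(options)
  {
  }

  public DbSet<Magazine> Magazines { get; set; }
}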

Now I’ll have EF Core seed the database with three magazines using the DbContext-based seeding I wrote about in my August 2018 column (msdn.com/magazine/mt829703). Add this method to MagContext.cs:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
  modelBuilder.Entity<Magazine>().HasData(
    new Magazine { MagazineId = 1, Name = "MSDN Magazine" },
    new Magazine { MagazineId = 2, Name = "Docker Magazine" },
    new Magazine { MagazineId = 3, Name = "EFCore Magazine" }
  );
}

Setting Up the Database

To create the database, I need to specify the provider and connection string, and create and then run a migration. I want to start by going down a familiar path, so I’ll begin by targeting SQL Server LocalDB and specifying the connection string in the appsettings.json file.

When you open appsettings.json, you’ll find it already contains a connection string, which was created by the controller tooling when I let it define the MagContext file. Even though both the SQL Server and SQLite providers were installed, it seems to have defaulted to the SQL Server provider. This proved to be true in subsequent tests. I prefer my own connection string name and my own database name, so I replaced the MagContext connection string with MagsConnectionMssql and added my preferred database name, DP0419Mags:

"ConnectionStrings": {
    "MagsConnectionMssql":
      "Server=(localdb)\\mssqllocaldb;Database=DP0419Mags;Trusted_Connection=True;"
  }

In the app’s startup.cs file, which includes a ConfigureServices method, the tooling also inserted code to configure the DbContext. Change its connection string name from MagContext to match the new name:

services.AddDbContext<MagContext>(options =>
  options.UseSqlServer(Configuration.GetConnectionString("MagsConnectionMssql")));

Now I can use EF Core migrations to create the first migration and, as I’m in Visual Studio, I can do that using PowerShell commands in the Package Manager Console:

add-migration initMssql

Migrating the Database

That command created a migration file, but I’m not going to create my database using the migration commands—when I deploy my app, I don’t want to have to execute migration commands to create or update the database. Instead, I’ll use the EF Core Database.Migrate method. Where this logic goes in your app is an important decision. You need it to run when the application starts up. A lot of people interpret this to mean when the startup.cs file is run, but the ASP.NET Core team recommends placing application startup code in the program.cs file, which is the true starting point of an ASP.NET Core app. As with any decision, there may well be factors that affect this guidance.

The program’s default Main method calls the ASP.NET Core method, CreateWebHostBuilder, which performs a lot of tasks on your behalf, then calls two more methods—Build and Run:

public static void Main(string[] args)
{
  CreateWebHostBuilder(args).Build().Run();
}

I need to migrate the database after Build but before Run. To do this, I’ve created an extension method that reads the service provider information defined in startup.cs, which will discover the DbContext configuration. Then the method calls Database.Migrate on the context. I adapted code (and guidance from EF Core team member Brice Lambson) from the GitHub issue at bit.ly/2T19cbY to create the extension method for IWebHost shown in Figure 5. The method is designed to take a generic DbContext type.

Figure 5 Extension Method for IWebHost

public static IWebHost MigrateDatabase<T>(this IWebHost webHost) where T : DbContext
{
  using (var scope = webHost.Services.CreateScope())
  {
    var services = scope.ServiceProvider;
    try
    {
      var db = services.GetRequiredService<T>();
      db.Database.Migrate();
    }
    catch (Exception ex)
    {
      var logger = services.GetRequiredService<ILogger<Program>>();
      logger.LogError(ex, "An error occurred while migrating the database.");
    }
  }
  return webHost;
}

Then I modified the Main method to call MigrateDatabase for MagContext between Build and Run:

CreateWebHostBuilder(args).Build().MigrateDatabase<MagContext>().Run();

As you’re adding all of this new code, Visual Studio should prompt you to add using statements for Microsoft.EntityFrameworkCore, Microsoft.Extensions.DependencyInjection and the namespace for your MagContext class.

Now the database will get migrated (or even created) as needed at runtime.

One last step before debugging is to tell ASP.NET Core to point to the Magazines controller when starting, not the values controller. You can do that in the launchSettings.json file, changing the instances of launchUrl from api/values to api/Magazines.
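As a trimmed-down sketch, the profiles should end up looking something like this (your file will contain more settings in each profile, such as applicationUrl and environment variables):

{
  "profiles": {
    "DataAPI": {
      "commandName": "Project",
      "launchUrl": "api/Magazines"
    },
    "Docker": {
      "commandName": "Docker",
      "launchUrl": "api/Magazines"
    }
  }
}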

Running the Data API in Kestrel, Then in Docker

As I did for the values controller, I’m going to start by testing this out in the self-hosted server via the project profile (for example, DataAPI), not the Docker profile. Because the database doesn’t yet exist, Migrate will create it, which means a short delay, as SQL Server, even LocalDB, has a lot of work to do. But the database does get created and seeded, and then the default controller method reads and displays the three magazines in the browser at localhost:5000/api/Magazines.

Now let’s try it out with Docker. Change the Debug profile to Docker and run it again. Oh no! When the browser opens, it displays a SQLException, with details explaining that TryGetConnection failed.

What’s going on here? The app is looking for the SQL Server instance (defined as “(localdb)\\mssqllocaldb” in the connection string) inside the running Docker container. But LocalDB is installed on my computer, not inside the container. Even though LocalDB is a common choice for developing against SQL Server, it doesn’t work so easily when you’re targeting Docker containers.

This means I have more work to do—and you possibly have more questions. I certainly did.

A Detour to Easy Street

There are some great options, such as using SQL Server for Linux in another Docker container or targeting a SQL Azure database. I’ll be digging into those solutions in the next couple of articles, but first I want you to see a quick solution where the database server will indeed exist inside of the container and your API will run successfully. You can achieve this easily with SQLite, which is a self-contained database.

You should already have the Microsoft.EntityFrameworkCore.Sqlite package installed. This NuGet package’s dependencies will force the SQLite runtime components to install in the image where the app is built.

Add a new connection string called MagsConnectionSqlite to the appsettings.json file. I’ve specified the file name as DP0419Mags.db:

"ConnectionStrings": {
    "MagsConnectionMssql":
      "Server=(localdb)\\mssqllocaldb;Database=DP0419Mags;Trusted_Connection=True;",
    "MagsConnectionSqlite": "Filename=DP0419Mags.db;"
  }

In Startup, change the DbContext provider to SQLite with the new connection string name:

services.AddDbContext<MagContext>(options =>
  options.UseSqlite(Configuration.GetConnectionString("MagsConnectionSqlite")));

The migration file you created is specific to the SQL Server provider, so you’ll need to replace that. Delete the entire Migrations folder, then run Add-Migration initSqlite in the Package Manager Console to recreate the folder along with the migration and snapshot files.

You can run this against the built-in server if you want to see the file that gets created, or you can just start debugging this in Docker. The new SQLite database gets created very quickly when the Migrate command is called and the browser then displays the three magazines again. Note that the IP address of the URL will match the one you saw earlier when running the values controller in Docker. In my case, that’s https://172.26.137.194/api/Magazines. So now the API and SQLite are both running inside the Docker container.
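If you’re curious whether the SQLite file really does live inside the container, you can list the app folder in the running container. This is just a quick check, assuming the container name or ID that docker ps reports, and assuming the relative filename resolved to the working directory (/app) specified in the Dockerfile:

docker exec <container id or name> ls /app

You should see DP0419Mags.db listed alongside the application’s assemblies.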

A More Production-Friendly Solution, Coming Soon

While using the SQLite database certainly simplifies the task of letting EF Core create a database inside the same container that’s running the app, this is most likely not how you’d want to deploy your API into production. One of the beauties of containers is that you can express separation of concerns by employing and coordinating multiple containers.

In the case of this tiny solution, perhaps SQLite would do the job. But for your real-world solutions, you should leverage other options. Focusing on SQL Server, one of those options would be to target an Azure SQL Database. With this option, regardless of where you’re running the app (on your development machine, in IIS, in a Docker container, or in a Docker container in the cloud), you can be sure you’re always pointing to a consistent database server or database, depending on your requirements. Another path is to leverage containerized database servers, such as SQL Server for Linux, as I’ve written about in an earlier column (msdn.com/magazine/mt784660). Microservices introduce another layer of possible solutions, given that the guidance is to have one database per microservice. You could easily manage those in containers, as well. There’s a great (and free) book from Microsoft on architecting .NET apps for containerized microservices at bit.ly/2NsfYBt.
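As a tiny preview of that containerized path, spinning up SQL Server for Linux is a single docker run command. This is a sketch only; the password is a placeholder you’d replace, and the image tag may differ by the time you read this:

docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=P@ssW0rd" -p 1433:1433 -d mcr.microsoft.com/mssql/server:2017-latest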

In the next few columns, I’ll explore some of these solutions as I show you how to target SQL Azure or a containerized SQL Server; manage connection strings and protect credentials using Docker environment variables; and enable EF Core to discover connection strings at design time (using migrations commands) and at run time (from within the Docker container). Even with my pre-existing experience with Docker and EF Core, I went through a lot of learning curves working out the details of these solutions, and I’m looking forward to sharing them all with you.


Julie Lerman is a Microsoft Regional Director, Microsoft MVP, software team coach and consultant who lives in the hills of Vermont. You can find her presenting on data access and other topics at user groups and conferences around the world. She blogs at thedatafarm.com/blog and is the author of “Programming Entity Framework,” as well as a Code First and a DbContext edition, all from O’Reilly Media. Follow her on Twitter: @julielerman and see her Pluralsight courses at bit.ly/PS-Julie.

Thanks to the following Microsoft technical experts for reviewing this article: Glenn Condron, Steven Green, Mike Morton

