ASP.NET Core with Docker Container – Part 2 Dockerfile

In ASP.NET Core with Docker Container – Part 1 Mounted Volume, I covered my experiment with a mounted volume. In this post, I will explore the possibilities of the Dockerfile. From reading on the internet and watching courses on Pluralsight, I "know" what a Dockerfile is. "Know" means, well, I have just heard about it and seen people playing with it … in videos. Just like in the first part, I want to explore it and actually do something with it.

I want to ship my application in a container. I want to be able to take that container and run it on another machine.

Dockerfile

I found a good article about it by DigitalOcean. However, I feel it is too much to get started with, and for a beginner like me, it is hard to understand.

A Dockerfile is a set of instructions to build a Docker image.

So the next question is: what are the fundamental instructions to build an image? I always like to start with the fundamentals.

Fundamental Instructions

FROM

Specifies the base image we will build upon. Docker gives you the ability to build your own image based on an existing one. Think about it for a moment: we can take an existing aspnetcore-build image (made and maintained by Microsoft) and build our own image with some custom features on top.

FROM microsoft/aspnetcore-build

That’s all it takes to build your own custom image with the full power of aspnetcore-build.

MAINTAINER

It is nice to tell Docker who maintains the image: you, the owner, the maintainer. (Newer Docker releases deprecate MAINTAINER in favor of a LABEL maintainer instruction, but it still works.)

MAINTAINER thaianhduc

 

COPY

Start making your own custom image. The "COPY" instruction tells Docker to copy files from the build context into a specific folder inside the image. It is a similar concept to the mounted volume, except that instead of creating a mapping, it copies the data.

COPY . /app

The above instruction tells Docker to copy the working directory into the "app" folder inside the container. The mounted-volume equivalent is "-v ${pwd}:/app".

WORKDIR

The directory in which the container will start, much like the working directory when we open PowerShell or CMD in Windows. For our web application, it is the /app directory we populated with the COPY instruction.

WORKDIR /app

 

RUN

Given that working directory, Docker executes the commands listed in RUN while building the image. For our previous web application, we want to run the "dotnet restore" and "dotnet run" commands. They might look like this:

RUN dotnet restore

RUN dotnet run

Note: RUN executes at image build time, not when the container starts, so a long-running command like "dotnet run" does not really belong there. Besides, I might not want to build the application inside the container at all. Most of the time, I want to ship my compiled application, not the source code.
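
For completeness, if I did want to build inside the container, a minimal sketch might look like the following. This is only an illustration under a few assumptions: the project file sits in the copied folder and the published assembly is named Coconut.Web.dll, as elsewhere in this post.

FROM microsoft/aspnetcore-build
WORKDIR /app
COPY . /app
# restore and publish run at image build time
RUN dotnet restore
RUN dotnet publish -c Release -o out
# the published assembly runs when the container starts
ENTRYPOINT ["dotnet", "out/Coconut.Web.dll"]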

EXPOSE

Declares the port that the container listens on and exposes to the outside.

EXPOSE 8080

This means other systems can communicate with it via port 8080. Note that EXPOSE by itself does not publish the port to the host; you still map it with -p (or -P) when you run the container.
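
A minimal sketch of that pairing (my-image is a hypothetical image name, not one from this series):

docker run -p 8080:8080 my-image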

ENTRYPOINT

All the above instructions tell Docker how to build an image. The ENTRYPOINT instruction tells the container what command to run once it starts. Think about a web application scenario: once a container starts, we want to start our web server inside it.

ENTRYPOINT ["dotnet", "Coconut.Web.dll"]

In Action

With what I have discovered about the Dockerfile, I can reframe my steps more precisely. I want to:

  1. Build an ASPNET Core Web Application on my host machine with VS2017 Community.
  2. Publish it to a local folder.
  3. Build a Docker image with the published application in.
  4. Start a container from my custom image.
  5. Access my web application from the host browser.

Before starting, I want to make sure that I have a working published website. This is easily done by asking VS2017 to publish my project to a local folder. By default, it publishes to the bin\Release\PublishOutput folder.
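
For those who prefer the CLI, the rough equivalent of that publish step (a sketch; the output folder simply mirrors the default mentioned above) is:

dotnet publish -c Release -o bin\Release\PublishOutput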

Website up and running well

So, from the published folder, when I run "dotnet Coconut.Web.dll", my website is up and running. So far so good.

How about a recap of the Mounted Volume approach? That means I want to run this command:

Recap: Run app with mounted volume

In the last line, I want to start an aspnetcore container to host my published application. Will it work? Let’s find out.
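
The screenshot is not reproduced here, but judging from the fix below, the command I tried was roughly this reconstruction (note the container port, which turns out to be wrong):

docker run -t -p 5000:8080 -v ${pwd}:/app -w /app microsoft/aspnetcore bash -c "dotnet Coconut.Web.dll"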

It looks OK, except that I cannot access the website from the host browser, which means something is wrong; not a surprise at all. It turns out I made the wrong port mapping. It should be port 80 inside the container instead of 8080.

docker run -t -p 5000:80 -v ${pwd}:/app -w /app microsoft/aspnetcore bash -c "dotnet Coconut.Web.dll"

Create Dockerfile

A Dockerfile is a simple text file. You can create it with your favorite editor.

# Custom image based on aspnetcore to host a web application
FROM         microsoft/aspnetcore
MAINTAINER   Thai Anh Duc
COPY         . /app
WORKDIR      /app
ENTRYPOINT   ["dotnet", "Coconut.Web.dll"]

Following the naming convention, I name it Dockerfile. I use VS2017 to create it.

Build a Custom Image

So far I have created a published application and a Dockerfile. Things seem ready. Next, we need to tell Docker to build an image from the Dockerfile.

The current folder structure looks like this

\bin
    \Release
            \PublishOutput
Dockerfile

Let’s try to run my very first build command. However, it failed right away, because that folder structure caused some security issues with the build context. To make my life easier, I moved the Dockerfile into the PublishOutput folder.

docker build -t thaianhduc/aspnetcore:0.1 .

Hey Docker, take the default Dockerfile at the current location (where the command is run), build an image, give it the name "thaianhduc/aspnetcore", and tag it with version "0.1".

Do not forget the "." at the end. It represents the current location (the build context).

// Run the command to list all the images. The new image should be there
docker images

Run It

Time to enjoy the fruit

docker run --name my-coconut -p 5000:80 thaianhduc/aspnetcore:0.1

Open the browser on my host machine and type in http://localhost:5000/hello_beautiful. Oh yeah! It works.

A Few Words

I have failed many times while experimenting and writing this post. Everything seems easy while you read it on the internet. However, when it comes down to actually doing it, evil hits you in the face at the first try. But that is actually a good thing. I have learned so much while trying to solve those problems. Some of them I understand, some I do not.

So far I have been playing around with the fundamentals of Docker. For every single command, for every single action, I try to explain it to myself in my own language. The language that I can understand and explain to others.

Having small goals at the beginning keeps me focused. It allows me to temporarily ignore some problems. Otherwise, you will wander in the sea of information out there. Reading something is easy. Making something work for you is hard.

How do I take my custom image and use it on other machines? Easy. I can push my image to Docker Hub, then pull and run it just like any other image. Things make sense now. I can reason about it without effort.
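
A minimal sketch of that push/pull flow, assuming thaianhduc/aspnetcore matches a repository under my Docker Hub account:

docker login
docker push thaianhduc/aspnetcore:0.1
# on the other machine
docker pull thaianhduc/aspnetcore:0.1
docker run --name my-coconut -p 5000:80 thaianhduc/aspnetcore:0.1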

Deployment Options Recap

I started by learning ASP.NET Core and other cool new stuff. I want to learn about the infrastructure I can take advantage of. Assuming I have a web application, how many options do I have to deploy it?

  1. Localhost (IIS): obviously, I can simply press F5.
  2. Deploy to Azure for free.
  3. Run it inside a container using Mounted Volume approach.
  4. Or deploy it as a Docker Image.

What’s Next?

I want to step further into ASP.NET Core from a fundamental point of view. Maybe later in the series I will explore the possibilities of Docker in terms of microservices.

That would be fun!

 

ASP.NET Core with Docker Container – Part 1 Mounted Volume

One step further into the Docker world, the biggest question I have to ask myself is "where do I want to go?" What do I know about Docker? How could I explain it in a simple sentence?

As far as I know so far, Docker gives me a box (a container) in which I can run something in an isolated environment. When I do not need it, I can throw it away. And when I need it again, I can rebuild it quickly.

I have a box. What should I do with it?

Given that I have a box, and I can open it, what will I put in? What should I do with the box?

As a developer, the first thing I want to try is to put some code inside the box and be able to run it. Yes! That’s right! I will put ASP.NET Core web application code into the box and run it.

In this post, I will go through all the things I have learned and collected so far and put them into practice.

Hey, Boss! How Do I Get The Source, Docker Asked?

Calm down, calm down, Docker. Let me show you where to get the stuff you need.

External – Mounted Volume

Docker can refer to source from an external system (relative to itself). In a development environment, that is our host system: the folder where we put our code.

Connecting to an external source: a folder on the host machine

I will use an image that specializes in .NET Core. It should be easy: just head over to Docker Hub and find Microsoft. Take the microsoft/aspnetcore image. Time to type some fun commands with Docker in PowerShell (I have started to love the CLI). Because I am going to use it for a while, I will pull the aspnetcore image from the registry.

Type docker pull microsoft/aspnetcore and hit Enter. Boom! You have the image ready to serve you.

The Docker ASP.NET Core image

As a good habit, run docker images and docker inspect to look into the image, just to make sure everything is in place.

To connect the two, Docker introduces the Data Volume concept. You can read the full documentation on the Docker website. However, I would like to go with the basic approach.

Think of a Data Volume as a means to define a mapping between a location inside the container and a location on the host machine.

We have

  1. A Docker container that can host and run a website.
  2. A folder where we develop our source code.

A Data Volume allows a running container to access that folder. When the container finishes, the folder is still intact; all the changes the container made persist there. Let’s try it out.

I want to run a container that will start my website. Breaking it down into small steps, when running the container, it must:

  1. Start the container instance in background mode
  2. Connect to the folder where my code is hosted (local drive)
  3. Start a Kestrel server to run my application

While experimenting, it seems I need to use the aspnetcore-build image instead, because I need the .NET Core SDK to restore and build my application. Jumping over to PowerShell, I run my very first command with a smile. Boom! It does not work.

Oops! Not working, boy

Let’s make it work first, and then I will explain every single part of the command.

The best thing to do when investigating an issue is to look into it; the docker inspect command is your friend.

Docker inspects the problem

Something is wrong with the volume mapping.

After some investigation, this is the correct syntax I should use:

docker run --name coconut -i -p 5000:80 -v ${pwd}:/app -w /app microsoft/aspnetcore-build bash -c "dotnet restore && dotnet run"

In plain language, the above command says:

I want to run a container from the microsoft/aspnetcore-build image and name it coconut. Once started, create a volume mapping between the current host folder (via ${pwd}) and the /app folder in the container, and create a port mapping between port 5000 (host machine) and port 80 (container). Finally, run the command "dotnet restore && dotnet run" with /app as the working folder.

Just like before, I hit another error, saying that I should share the drive with Docker. The solution is written here.

I must admit things are not as easy as they seem. After struggling for a while, I finally managed to get it working.

Pay attention to

bash -c "dotnet restore && dotnet run"

The command instructs the container to restore NuGet packages and then run the application.

And

-p 5000:80

Once the application starts in the container, the website is exposed via port 80 inside the container. Note: if you start the application directly on your host machine, it is exposed via port 5000. That is also why I picked port 5000 on the host machine.

The final result, a website running in a container

Initially, I wanted to cover both the mounted data volume and the Dockerfile in one post. However, the more I get my hands dirty, the more I learn. It is better to recap what I have reaped.

Recap

To run a container that starts an ASP.NET Core application, use this command:

docker run --name coconut -i -p 5000:80 -v ${pwd}:/app -w /app microsoft/aspnetcore-build bash -c "dotnet restore && dotnet run"

The port mapping (-p 5000:80) is very important. It allows you to access your website from the host browser. To know the port that dotnet exposes after running the website, you should look at the output in the container console.

To map your current working directory to a folder in the container, use -v ${pwd}:/app. In PowerShell, ${pwd} resolves to the current directory.
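
If you use the classic command prompt (cmd.exe) instead of PowerShell, ${pwd} will not expand; a sketch of the same command with the cmd.exe equivalent, %cd%:

docker run --name coconut -i -p 5000:80 -v %cd%:/app -w /app microsoft/aspnetcore-build bash -c "dotnet restore && dotnet run"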

Because I made many mistakes, I learned to use the docker stop/start/rm commands. Now they are mine 🙂

Evil is everywhere. You should use docker inspect to look inside the container.

One of the biggest achievements is the feeling of getting started, of getting into it. I now have many commands at my disposal.

What’s Next?

Finish what I have started: use a Dockerfile to build and ship my applications.

Updated: ASP.NET Core with Docker Container – Part 2 Dockerfile.

Docker, What Does It Mean to Me?

Now that I have finished the flow to push code from my laptop up to the cloud, I have also started to play around with Docker, and a bit of ASP.NET Core. I have many dots. The next challenge, as always, is to connect them and make sense of all the pieces. Let’s focus a bit more on Docker and create a connection between Docker and ASP.NET Core.

There are many materials about Docker. It is easy to start playing around with Docker, as I wrote here. Pretty basic stuff. It is easy to find out how to run a command. The Docker website has extensive documentation, which tells you everything you need to know about Docker.

But, to me, it is useless if I cannot make sense of "the why". After playing around with commands and watching some courses (mostly on Pluralsight), I have to find the answers to these questions:

  1. Why do I need it?
  2. What problems does it solve?

I try to create an image of it in my mind. It is such an important step that if I fail, I cannot continue. After all, how can you run if you do not know why and where to run?

So let’s give it a try before moving deeper into the implementation details.

Why Docker? Container Approach

I would like to call it the "Container Approach". Below are some benefits, in my opinion, that give me good reasons to invest in it.

Improve Cooperation

If we develop small systems which are easy to set up and deploy, the need for containers might not be obvious: as developers, we can check out the code, hit F5, and we are good to go. If we want to deploy those systems to a QA machine or some sort of central server, it is still a trivial task.

However, when systems are big, with many services and complex infrastructure requirements, things get complicated quickly and cost a lot of time. Take a web application as an example. Assume the system consists of:

  1. Frontend application: built on top of AngularJS. This application connects to a backend service via Web API endpoints.
  2. Backend service: built with ASP.NET Core. It supplies the endpoints that the frontend application consumes.
  3. Database: a SQL-family server. It can be SQL Server, SQL Express, SQLite, or MySQL. EF serves as the Data Access Layer (ORM).

The team consists of a frontend developer, a backend developer, and a tester. Each might have to install the same infrastructure to run the application, given that we do not have a central deployment place yet. It means that:

  1. The frontend developer has to install all the software and infrastructure that only the backend developer requires.
  2. The backend developer has to install the things that only the frontend developer needs.
  3. The tester has to install a development environment even though they do not code. They might have to set up local IIS.

When a new member joins the team, there is a lot of repetitive work.

What if

  1. The frontend developer could take the API component and load it up on his machine without installing anything. He could focus on building the cool frontend and connect it to that API.
  2. A tester could take the application as a whole and run it with a single click, without installing anything.

With a proper setup, Docker can help.
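
As a sketch of that idea (the mycompany/backend-api image name here is purely hypothetical, not something from this series):

# the frontend developer pulls and runs the backend as a black box
docker pull mycompany/backend-api
docker run -d --name api -p 8080:80 mycompany/backend-api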

Develop Faster, Test Faster

Docker gives us the minimum infrastructure we need to build and test an application. For example, say I want to develop a system running with MongoDB. Instead of downloading and installing MongoDB locally, I can get a prebuilt Docker image with MongoDB installed.
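
A minimal sketch, assuming the official mongo image on Docker Hub and its default port 27017:

docker run -d --name my-mongo -p 27017:27017 mongo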

The ability to switch infrastructure gives developers a lot of flexibility.

Develop Componentized Thinking

This might not be true for others, but it works for me. It shapes my thinking process: a system is composed of many small components, and each component might run in a container. This, in turn, forces me to think about what I should put in a single container.

What’s Next?

I have tried to make sense of Docker and to explain it to myself. I do not make any suggestion or judgment regarding its benefits. But with all of that in mind, I am ready to invest more in the journey.

Docker helps

  1. Improve cooperation
  2. Develop faster, test faster
  3. Develop componentized thinking

The next step is, as usual, to try things out and connect the dots between Docker and ASP.NET Core. This post was intended to be about that; however, I changed my intention as I wrote.

Docker, Every Basic Thing I Need To Know

Having started my learning journey with Docker, I took some courses on Pluralsight, including Nigel Poulton’s Docker Deep Dive course. It is a wonderful course. It helped me understand the overall design and principles of Docker, and it empowered me with many hands-on commands that I can use immediately. And I did: I practiced them while watching the course.

Suddenly, I asked myself: hey, wait a minute! You will soon forget them all, because you will not use them in the near future, or at least you do not know when you will. I convinced myself that there is documentation on the official Docker website. We all, as human beings, have many good reasons to convince ourselves not to do something. I realized that I was setting a trap for myself.

In the information age, the problem is not a lack of information. Rather, it is a problem of how to get started. Take Docker as an example: head over to the documentation site, then what will you do first? Many people will know where to start. But at the same time, many will not.

Welcome to the fundamentals of Docker!

By writing them out here, I can later know what to look for in detail. Instead of reading through all the documentation, I simply look up the details of specific commands, which is a much smaller amount.

Docker Commands

A list of basic commands that I need to know to work with Docker:

docker version: See client and server (daemon) version information.

docker info: Display system-wide information about the Docker installation.

docker run: Run a container from an image.

docker pull: Pull an image from Docker Hub to the local environment.

docker push: Push a local image to Docker Hub.

docker build: Build an image from a Dockerfile.

docker images: Display all local images.

docker history {image name/id}: See the history of an image.

docker ps: List running containers (add -a to include stopped ones).

docker inspect: Inspect a container or image. Very useful to dig deeper into a container.

docker port {container name}: Display the port mapping between container and host.

docker rmi {image id}: Remove an image.

docker start {container id}: Start a container.

docker stop {container id}: Stop a container.

docker rm {container id}: Remove a container.

docker attach {container id}: Attach to (interact with) a running container.

docker logs -f {container id}: Follow the log output of a running container.
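
A quick sketch tying a few of these commands together, using the official hello-world image:

docker pull hello-world
docker run --name hello hello-world
docker ps -a
docker logs hello
docker rm hello
docker rmi hello-world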

Docker Networking

[A list of things I learned through the course.]

The docker0 bridge (which acts like an Ethernet switch).

On the host machine (Linux), install the bridge utilities: apt-get install bridge-utils.

Then inspect the bridge with the command brctl show docker0.

icc (Inter-Container Communication) and iptables: both are true by default.
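
To see the same bridge from the Docker side, the standard network commands are handy (a small sketch):

docker network ls
docker network inspect bridge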

Next?

Learn what Docker offers and how to take advantage of it in modern software development.

What you do with what you know is more important than what you know

Hi, Docker and Ubuntu

I was watching #MSBuild2017, looking over the balcony, up to the cloud, when a voice whispered in my head: oh man, you are far behind. What a moment! With so many new things, I decided to pick Docker as a starting point. Honestly, I had no clue where to start. However, I did know that I had to start somewhere. A wise man said to start small. Docker has just that: a hello-world application.

Whenever I want to learn something, I head over to Pluralsight. Here we go: give me Docker.

Pluralsight: Give Me Docker

I decided to take Getting Started with Docker, and later Docker Deep Dive (I am still enjoying this course). They are awesome. The author, Nigel Poulton, has a good sense of humor.

Docker on Windows

I have been using Windows since I first met a computer. It is fairly simple to run and try out Docker on Windows. Head over to Docker on Windows, download it, and follow the instructions.

After the installation completes, fire up PowerShell; your very first command is docker version.

Your very first docker command

Not having used a command line interface for years, that output gave me a strong feeling. I uttered, "wow, it is cool".

You cannot imagine my feeling when I ran this command

Hello World from Docker
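
For reference, the command behind that screenshot is the standard first run of the official hello-world image:

docker run hello-world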

From that moment, I knew that I could do many things with the abilities Docker gives me.

Docker on Linux

Here is another fact: I do not know Linux. I have never used any Linux system. Years ago, it was hard to set up a Linux lab. What about now?

Hyper-V: the power of virtualization

With the power of Hyper-V, I can have as many virtual machines as I wish. I decided to download and try Ubuntu Server (the course suggested it as well).

Installing Ubuntu on Hyper-V is simple. Download Ubuntu Server here. Hello Ubuntu.

Hello Ubuntu

Let’s try Docker

Hello World on Ubuntu
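
The commands behind that screenshot are roughly these (a sketch, assuming Docker was installed through Ubuntu’s docker.io package, as listed in the commands below):

sudo apt-get update
sudo apt-get install docker.io
sudo docker run hello-world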

Such an amazing moment 🙂 The output is exactly the same on both platforms.

Values?

It sounds like a trivial task. However, there are many things that I gained over the weekend.

  1. It triggered my learning process. I feel energized again after years of writing code.
  2. It opens opportunities. The more I know, the better I am. The better I am, the more opportunities I see.

What I showed here is the end result. Just like anything else in the world, the devil is in the details. When you actually get your hands dirty, you will meet roadblocks. Solving them gave me some pieces of Linux knowledge.

Some Linux (Ubuntu Server) Commands I Learned

sudo su: Run as administrator (comparable to "Run as administrator" in Windows).

ifconfig: Same as ipconfig in Windows.

ip addr show: Display the IP addresses along with other information.

route -n: Display the kernel routing table.

apt-get install [name]: Install a package. For example, apt-get install docker.io will install Docker on your Ubuntu machine.

apt-get update: Refresh the package lists (apt-get upgrade then installs the available updates).

service [name] status: View the status of the service [name].

Not a bad result for my weekend. Not to mention that I now love the command-line approach.

 

So far so good. It is enough for me to move on. My next challenge is how to take advantage of all of this. That will be fun.