How to Build a Node.js Application with Docker


Introduction

Interested in Node.js but not sure where to start with Docker? This tutorial will walk you through creating an application image for a static website that uses the Express framework and Bootstrap. You will then build a container using that image, push it to Docker Hub, and use it to build another container, demonstrating how you can recreate and scale your application.

Prerequisites

To follow this tutorial, you will need:

  • A sudo non-root user on your server or in your local environment.
  • Docker installed on your server or local machine.
  • Node.js and npm installed.
  • A Docker Hub account.

Step 1 — Installing Your Application Dependencies

First, create a directory for your project in your non-root user’s home directory:

mkdir node_project

Navigate to this directory:

cd node_project

This will be the root directory of the project.

Next, create a package.json with your project’s dependencies:

nano package.json

Add the following information about the project to the file; be sure to replace the author information with your own name and contact details:

{
  "name": "nodejs-image-demo",
  "version": "1.0.0",
  "description": "nodejs image demo",
  "author": "Sammy the Shark <[email protected]>",
  "license": "MIT",
  "main": "app.js",
  "scripts": {
    "start": "node app.js",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [
    "nodejs",
    "bootstrap",
    "express"
  ],
  "dependencies": {
    "express": "^4.16.4"
  }
}

Install your project’s dependencies:

npm install

Step 2 — Creating the Application Files

We will create a website that offers users information about sharks.

Open app.js in the main project directory to define the project’s routes:

nano app.js

Add the following content to the file to create the Express application and Router objects, define the base directory, port, and host as variables, set the routes, and mount the router middleware along with the application’s static assets:

var express = require("express");
var app = express();
var router = express.Router();

var path = __dirname + '/views/';

// Constants
const PORT = 8080;
const HOST = '0.0.0.0';

router.use(function (req,res,next) {
  console.log("/" + req.method);
  next();
});

router.get("/",function(req,res){
  res.sendFile(path + "index.html");
});

router.get("/sharks",function(req,res){
  res.sendFile(path + "sharks.html");
});

app.use(express.static(path));
app.use("/", router);

app.listen(PORT, HOST, function () {
  console.log('Example app listening on port ' + PORT + '!')
})

Next, let’s add some static content to the application. Create the views directory:

mkdir views

Open index.html:

nano views/index.html

Add the following code to the file, which will import Bootstrap and create a jumbotron component with a link to the more detailed sharks.html info page:

<!DOCTYPE html>
<html lang="en">
   <head>
      <title>About Sharks</title>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
      <link href="css/styles.css" rel="stylesheet">
      <link href='https://fonts.googleapis.com/css?family=Merriweather:400,700' rel='stylesheet' type='text/css'>
      <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
      <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
   </head>
   <body>
      <nav class="navbar navbar-inverse navbar-static-top">
         <div class="container">
            <div class="navbar-header">
               <button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#bs-example-navbar-collapse-1" aria-expanded="false">
               <span class="sr-only">Toggle navigation</span>
               <span class="icon-bar"></span>
               <span class="icon-bar"></span>
               <span class="icon-bar"></span>
               </button>
               <a class="navbar-brand" href="#">Everything Sharks</a>
            </div>
            <div class="collapse navbar-collapse" id="bs-example-navbar-collapse-1">
               <ul class="nav navbar-nav mr-auto">
                  <li class="active"><a href="/">Home</a></li>
                  <li><a href="/sharks">Sharks</a></li>
               </ul>
            </div>
         </div>
      </nav>
      <div class="jumbotron">
         <div class="container">
            <h1>Want to Learn About Sharks?</h1>
            <p>Are you ready to learn about sharks?</p>
            <br>
            <p><a class="btn btn-primary btn-lg" href="/sharks" role="button">Get Shark Info</a></p>
         </div>
      </div>
      <div class="container">
         <div class="row">
            <div class="col-md-6">
               <h3>Not all sharks are alike</h3>
               <p>Though some are dangerous, sharks generally do not attack humans. Out of the 500 species known to researchers, only 30 have been known to attack humans.</p>
            </div>
            <div class="col-md-6">
               <h3>Sharks are ancient</h3>
               <p>There is evidence to suggest that sharks lived up to 400 million years ago.</p>
            </div>
         </div>
      </div>
   </body>
</html>

Next, open a file called sharks.html:

nano views/sharks.html

Add the following code, which imports Bootstrap and the custom style sheet and offers users detailed information about certain sharks:

<!DOCTYPE html>
<html lang="en">
   <head>
      <title>About Sharks</title>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
      <link href="css/styles.css" rel="stylesheet">
      <link href='https://fonts.googleapis.com/css?family=Merriweather:400,700' rel='stylesheet' type='text/css'>
      <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
      <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
   </head>
   <body>
   <nav class="navbar navbar-inverse navbar-static-top">
      <div class="container">
         <div class="navbar-header">
            <button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#bs-example-navbar-collapse-1" aria-expanded="false">
            <span class="sr-only">Toggle navigation</span>
            <span class="icon-bar"></span>
            <span class="icon-bar"></span>
            <span class="icon-bar"></span>
            </button>
            <a class="navbar-brand" href="#">Everything Sharks</a>
         </div>
         <div class="collapse navbar-collapse" id="bs-example-navbar-collapse-1">
            <ul class="nav navbar-nav mr-auto">
               <li><a href="/">Home</a></li>
               <li class="active"><a href="/sharks">Sharks</a></li>
            </ul>
         </div>
      </div>
   </nav>
   <div class="jumbotron text-center">
      <h1>Shark Info</h1>
   </div>
   <div class="container">
      <div class="row">
         <div class="col-md-6">
            <p>
            <div class="caption">Some sharks are known to be dangerous to humans, though many more are not. The sawshark, for example, is not considered a threat to humans.</div>
            <img src="https://assets.digitalocean.com/articles/docker_node_image/sawshark.jpg" alt="Sawshark">
            </p>
         </div>
         <div class="col-md-6">
            <p>
            <div class="caption">Other sharks are known to be friendly and welcoming!</div>
            <img src="https://assets.digitalocean.com/articles/docker_node_image/sammy.png" alt="Sammy the Shark">
            </p>
         </div>
      </div>
    </div>
   </body>
</html>

Finally, create the custom CSS style sheet that you’ve linked to in index.html and sharks.html by first creating a css folder in the views directory:

mkdir views/css

Open the style sheet:

nano views/css/styles.css

Add the following code, which will set the desired color and font for our pages:

.navbar {
        margin-bottom: 0;
}

body {
        background: #020A1B;
        color: #ffffff;
        font-family: 'Merriweather', sans-serif;
}
h1,
h2 {
        font-weight: bold;
}
p {
        font-size: 16px;
        color: #ffffff;
}

.jumbotron {
        background: #0048CD;
        color: white;
        text-align: center;
}
.jumbotron p {
        color: white;
        font-size: 26px;
}

.btn-primary {
        color: #fff;
        border-color: white;
        margin-bottom: 5px;
}

img, video, audio {
        margin-top: 20px;
        max-width: 80%;
}

div.caption {
        float: left;
        clear: both;
}

Start the application:

npm start

Navigate your browser to http://your_server_ip:8080, or http://localhost:8080 if you are working locally. You will see the application's landing page.

Click on the Get Shark Info button. This will take you to the sharks.html information page you created.
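
If you are working on a remote server without a graphical browser, you can also confirm that both routes respond from the command line. The following is just a quick check, assuming the application is still running on port 8080:

curl http://localhost:8080

curl http://localhost:8080/sharks

Each command should print the HTML of the corresponding page.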

You now have an application up and running. When you are ready, quit the server by typing CTRL+C.

Step 3 — Writing the Dockerfile

In your project’s root directory, create the Dockerfile:

nano Dockerfile

Add the following code to the file:

~/node_project/Dockerfile

FROM node:10

RUN mkdir -p /home/node/app/node_modules && chown -R node:node /home/node/app

WORKDIR /home/node/app

COPY package*.json ./

RUN npm install

COPY --chown=node:node . .

USER node

EXPOSE 8080

CMD [ "npm", "start" ]

This Dockerfile uses the node:10 base image and ensures that application files are owned by the non-root node user that is provided by default by the Docker Node image.

Next, create a .dockerignore file:

nano .dockerignore

Add your local node modules, npm logs, Dockerfile, and .dockerignore file to it:

node_modules
npm-debug.log
Dockerfile
.dockerignore

Build the application image using the docker build command:

docker build -t your_dockerhub_username/nodejs-image-demo .

The . specifies that the build context is the current directory.
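
If you would like to version your images explicitly, you can also append a tag after the repository name. The v1.0.0 tag below is only an example; any tag name will work:

docker build -t your_dockerhub_username/nodejs-image-demo:v1.0.0 .

If no tag is given, Docker applies the latest tag by default, which is what the rest of this tutorial assumes.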

Check your images:

docker images

You will see the following output:

Output
REPOSITORY                                         TAG                 IMAGE ID            CREATED             SIZE
your_dockerhub_username/nodejs-image-demo          latest              1c723fb2ef12        8 seconds ago       895MB
node                                               10                  f09e7c96b6de        17 hours ago        893MB

Run the following command to build a container using this image:

docker run --name nodejs-image-demo -p 80:8080 -d your_dockerhub_username/nodejs-image-demo 

Inspect the list of your running containers with docker ps:

docker ps

You will see the following output:

Output
CONTAINER ID        IMAGE                                                   COMMAND             CREATED             STATUS              PORTS                  NAMES
e50ad27074a7        your_dockerhub_username/nodejs-image-demo               "npm start"         8 seconds ago       Up 7 seconds        0.0.0.0:80->8080/tcp   nodejs-image-demo

With your container running, you can now visit your application by navigating your browser to http://your_server_ip or http://localhost. You will see your application landing page once again.
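
If the page does not load, you can check the container's output with the docker logs command, using the container name you passed to docker run:

docker logs nodejs-image-demo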

Now that you have created an image for your application, you can push it to Docker Hub for future use.

Step 4 — Using a Repository to Work with Images

The first step to pushing the image is to log in to your Docker Hub account:

docker login -u your_dockerhub_username -p your_dockerhub_password

Logging in this way will create a ~/.docker/config.json file in your user’s home directory with your Docker Hub credentials.
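
Note that passing your password with -p leaves it in your shell history. A slightly safer sketch, assuming you have stored a Docker Hub password or access token in an environment variable named DOCKERHUB_TOKEN (a name chosen here only for illustration), is to pipe it to the --password-stdin flag:

echo "$DOCKERHUB_TOKEN" | docker login -u your_dockerhub_username --password-stdin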

Push your image up using your own username in place of your_dockerhub_username:

docker push your_dockerhub_username/nodejs-image-demo

If you would like, you can test the utility of the image registry by destroying your current application container and image and rebuilding them.

First, list your running containers:

docker ps

You will see the following output:

Output
CONTAINER ID        IMAGE                                       COMMAND             CREATED             STATUS              PORTS                  NAMES
e50ad27074a7        your_dockerhub_username/nodejs-image-demo   "npm start"         3 minutes ago       Up 3 minutes        0.0.0.0:80->8080/tcp   nodejs-image-demo

Using the CONTAINER ID listed in your output, stop the running application container. Be sure to replace the highlighted ID below with your own CONTAINER ID:

docker stop e50ad27074a7

List all of your images with the -a flag:

docker images -a

You will see the following output with the name of your image, your_dockerhub_username/nodejs-image-demo, along with the node image and the other images from your build:

Output
REPOSITORY                                           TAG                 IMAGE ID            CREATED             SIZE
your_dockerhub_username/nodejs-image-demo            latest              1c723fb2ef12        7 minutes ago       895MB
<none>                                               <none>              e039d1b9a6a0        7 minutes ago       895MB
<none>                                               <none>              dfa98908c5d1        7 minutes ago       895MB
<none>                                               <none>              b9a714435a86        7 minutes ago       895MB
<none>                                               <none>              51de3ed7e944        7 minutes ago       895MB
<none>                                               <none>              5228d6c3b480        7 minutes ago       895MB
<none>                                               <none>              833b622e5492        8 minutes ago       893MB
<none>                                               <none>              5c47cc4725f1        8 minutes ago       893MB
<none>                                               <none>              5386324d89fb        8 minutes ago       893MB
<none>                                               <none>              631661025e2d        8 minutes ago       893MB
node                                                 10                  f09e7c96b6de        17 hours ago        893MB

Remove the stopped container and all of the images, including unused or dangling images, with the following command:

docker system prune -a
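
If you would rather not prune everything on your system, a more targeted sketch is to remove only this tutorial's container and image, substituting your own container ID and image name from the output above:

docker rm e50ad27074a7

docker rmi your_dockerhub_username/nodejs-image-demo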

With all of your images and containers deleted, you can now pull the application image from Docker Hub:

docker pull your_dockerhub_username/nodejs-image-demo

List your images once again:

docker images

You will see your application image:

Output
REPOSITORY                                     TAG                 IMAGE ID            CREATED             SIZE
your_dockerhub_username/nodejs-image-demo      latest              1c723fb2ef12        11 minutes ago      895MB

You can now rebuild your container using the command from Step 3:

docker run --name nodejs-image-demo -p 80:8080 -d your_dockerhub_username/nodejs-image-demo

List your running containers:

docker ps

Output
CONTAINER ID        IMAGE                                                   COMMAND             CREATED             STATUS              PORTS                  NAMES
f6bc2f50dff6        your_dockerhub_username/nodejs-image-demo               "npm start"         4 seconds ago       Up 3 seconds        0.0.0.0:80->8080/tcp   nodejs-image-demo

Visit http://your_server_ip or http://localhost once again to view your running application.

Node.js for Beginners - Learn Node.js from Scratch (Step by Step)


Welcome to my course "Node.js for Beginners - Learn Node.js from Scratch". This course will guide you step by step so that you learn the basics and theory of every part. It contains hands-on examples so that you can better understand coding in Node.js. If you have no previous knowledge or experience with Node.js, you will like that the course begins with Node.js basics; if you already have some experience programming in Node.js, the course can still teach you something new. It covers hands-on practical examples without neglecting theory and fundamentals. Learn to use Node.js like a professional. This comprehensive course will prepare you to work in the real world as an expert!
What you’ll learn:

  • Basic Of Node
  • Modules
  • NPM In Node
  • Event
  • Email
  • Uploading File
  • Advance Of Node

Docker Best Practices for Node Developers

Welcome to the "Docker Best Practices for Node Developers"! With your basic knowledge of Docker and Node.js in hand, Docker Mastery for Node.js is a course for anyone on the Node.js path. This course will help you master them together.

Welcome to the best course on the planet for using Docker with Node.js! With your basic knowledge of Docker and Node.js in hand, Docker Mastery for Node.js is a course for anyone on the Node.js path. This course will help you master them together.

My talk on all the best of Docker for Node.js developers and DevOps dealing with Node apps. From DockerCon 2019. Get the full 9-hour training course with my coupon at http://bit.ly/365ogba

Get the source code for this talk at https://github.com/BretFisher/dockercon19

Some of the many cool things you'll do in this course:
  • Build Node.js Images that auto-scan for security vulnerabilities
  • Use Docker's cutting-edge BuildKit with SSH Agents and NPM Caches for better image building
  • Use docker-compose with Visual Studio Code for full Node.js debug support
  • Use BuildKit and Multi-stage Builds to create minimal and flexible Dockerfiles
  • Build custom Node.js images using distros like CentOS and Alpine
  • Test Docker init, tini, and Node.js as a PID 1 process in containers
  • Create Node.js apps that properly startup and respond to healthchecks
  • Develop ARM based Node.js apps with Docker Desktop, and deploy to AWS A1 Servers
  • Build graceful shutdown code into your apps for zero-downtime deploys
  • Dig into HTTP connections with orchestration, and how Proxies can help
  • Study examples of Docker Swarm and Kubernetes deployments for Node.js
  • Spend time Migrating traditional (legacy) Node.js apps into containers
  • Simplify your microservice solutions with advanced Docker Compose features
What you will learn in this course

You'll start with a quick review about getting set up with Docker, as well as Docker Compose basics. That way we're on the same page for the basics.

Then you'll jump into Node.js Dockerfile basics, that way you'll have a good Dockerfile foundation for new features we'll add throughout the course.

You'll be building on all the different things you learn from each Lecture in the course. Once you have the basics down of Compose, Dockerfile, and Docker Image, then you'll focus on nuances like how Docker and Linux control the Node process and how Docker changes that to make sure you know what options there are for starting up and shutting down Node.js and the right way to do it in different scenarios.

We'll cover advanced, newer features around making the Dockerfile the most efficient and flexible as possible using things like BuildKit and Multi-stage.

Then we'll talk about distributed computing and cloud design to ensure your Node.js apps have 12-factor design in your containers, as well as learning how to migrate old apps into this new way of doing things.

Next we cover Compose and its awesome features to get really efficient local development and test set-up using the Docker Compose command line and Docker Compose YAML file.

With all this knowledge, you'll progress to production concerns and making images production-ready.

Then we'll jump into deploying those containers and running them in production. Whether you use Docker Engine or orchestration with Kubernetes or Swarm, I've got you covered. In addition, we'll cover HTTP connections and reverse proxies for connection handling and routing with multi-container systems.

Lastly, you'll get a final, big assignment where you'll be building and deploying a large, complex solution, including multiple Node.js containers that are doing different things. You'll build Docker images, Dockerfiles, and compose files, and deploy them to a server to test. You'll need to check whether connections failover properly. You'll basically take everything you've learned and apply it in one big project!

Crafting multi-stage builds with Docker in Node.js

Learn how you can use a multi-stage Docker build for your Node.js application. Docker multi-stage builds enable us to create more complex build pipelines without having to resort to magic tricks.

Everyone knows about Docker. It’s the ubiquitous tool for packaging and distribution of applications that seemed to come from nowhere and take over our industry! If you are reading this, it means you already understand the basics of Docker and are now looking to create a more complex build pipeline.

In the past, optimizing our Docker images has been a challenging experience. All sorts of magic tricks were employed to reduce the size of our applications before they went to production. Things are different now because support for multi-stage builds has been added to Docker.

In this post, we explore how you can use a multi-stage build for your Node.js application. As an example, we'll use a TypeScript build process, but the same kind of thing will work for any build pipeline. So even if you'd prefer to use Babel, or maybe you need to build a React client, a Docker multi-stage build can work for you as well.

A basic, single-stage Dockerfile for Node.js

Let’s start by looking at a basic Dockerfile for Node.js. We can visualize the normal Docker build process as shown in Figure 1 below.

Figure 1: Normal Docker build process.

We use the docker build command to turn our Dockerfile into a Docker image. We then use the docker run command to instantiate our image as a Docker container.
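
As a concrete sketch of those two commands for the Dockerfile in Listing 1 below, with my-node-app standing in as a placeholder image and container name, and assuming the application listens on port 3000 as the EXPOSE instruction suggests:

docker build -t my-node-app .

docker run -d --name my-node-app -p 3000:3000 my-node-app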

The Dockerfile in Listing 1 below is just a standard, run-of-the-mill Dockerfile for Node.js. You have probably seen this kind of thing before. All we are doing here is copying the package.json, installing production dependencies, copying the source code, and finally starting the application.

This Dockerfile is for regular JavaScript applications, so we don’t need a build process yet. I’m only showing you this simple Dockerfile so you can compare it to the multi-stage Dockerfile I’ll be showing you soon.

Listing 1: A run-of-the-mill Dockerfile for Node.js

FROM node:10.15.2

WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY ./src ./src
EXPOSE 3000
CMD npm start

Listing 1 is a quite ordinary-looking Dockerfile. In fact, all Dockerfiles looked pretty much like this before multi-stage builds were introduced. Now that Docker supports multi-stage builds, we can visualize our simple Dockerfile as the single-stage build process illustrated in Figure 2.


Figure 2: A single-stage build pipeline.

The need for multiple stages

We can already run whatever commands we want in the Dockerfile when building our image, so why do we even need a multi-stage build?

To find out why, let's upgrade our simple Dockerfile to include a TypeScript build process. Listing 2 shows the upgraded Dockerfile. The updated lines are the ones that copy tsconfig.json, install all dependencies rather than only production ones, and run the TypeScript build.

Listing 2: We have upgraded our simple Dockerfile to include a TypeScript build process

FROM node:10.15.2

WORKDIR /usr/src/app
COPY package*.json ./
COPY tsconfig.json ./
RUN npm install
COPY ./src ./src
RUN npm run build
EXPOSE 80
CMD npm start

We can easily and directly see the problem this causes. To see it for yourself, you should instantiate a container from this image and then shell into it and inspect its file system.

I did this and used the Linux tree command to list all the directories and files in the container. You can see the result in Figure 3.
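
If you want to reproduce this check yourself, a rough sketch looks like the following, with ts-app standing in for whatever you named your image (the node base image does not ship with tree, so find is used here instead):

docker build -t ts-app .

docker run -d --name ts-app ts-app

docker exec ts-app sh -c "find /usr/src/app -maxdepth 1"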

Notice that we have unwittingly included in our production image all the debris of development and the build process. This includes our original TypeScript source code (which we don’t use in production), the TypeScript compiler itself (which, again, we don’t use in production), plus any other dev dependencies we might have installed into our Node.js project.


Figure 3: The debris from development and the build process is bloating our production Docker image.

Bear in mind this is only a trivial project, so we aren't actually seeing too much cruft left in our production image. But you can imagine how bad this would be for a real application with many source files, many dev dependencies, and a more complex build process that generates temporary files!

We don’t want this extra bloat in production. The extra size makes our containers bigger. When our containers are bigger than needed, it means we aren’t making efficient use of our resources. The increased surface area of the container can also be a problem for security, where we generally prefer to minimize the attackable surface area of our application.

Wouldn’t it be nice if we could throw away the files we don’t want and just keep the ones we do want? This is exactly what a Docker multi-stage build can do for us.

Crafting a Dockerfile with a multi-stage build

We are going to split our Dockerfile into two stages. Figure 4 shows what our build pipeline looks like after the split.


Figure 4: A multi-stage Docker build pipeline to build TypeScript.

Our new multi-stage build pipeline has two stages: Build stage 1 is what builds our TypeScript code; Build stage 2 is what creates our production Docker image. The final Docker image produced at the end of this pipeline contains only what it needs and omits the cruft we don’t want.

To create our two-stage build pipeline, we are basically just going to create two Dockerfiles in one. Listing 3 shows our Dockerfile with multiple stages added. The first FROM command initiates the first stage, and the second FROM command initiates the second stage.

Compare this to a regular single-stage Dockerfile, and you can see that it actually looks like two Dockerfiles squished together in one.

Listing 3: A multi-stage Dockerfile for building TypeScript code

# 
# Build stage 1.
# This stage builds our TypeScript and produces an intermediate Docker image containing the compiled JavaScript code.
#
FROM node:10.15.2

WORKDIR /usr/src/app
COPY package*.json ./
COPY tsconfig.json ./
RUN npm install
COPY ./src ./src
RUN npm run build

#
# Build stage 2.
# This stage pulls the compiled JavaScript code from the stage 1 intermediate image.
# This stage builds the final Docker image that we'll use in production.
#
FROM node:10.15.2

WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY --from=0 /usr/src/app/build ./build
EXPOSE 80
CMD npm start

To create this multi-stage Dockerfile, I simply took Listing 2 and divided it up into separate Dockerfiles. The first stage contains only what is needed to build the TypeScript code. The second stage contains only what is needed to produce the final production Docker image. I then merged the two Dockerfiles into a single file.

The most important thing to note is the use of --from in the second stage (the COPY --from=0 line in Listing 3). This is the syntax we use to pull the built files from our first stage, which we refer to here as stage 0. We are pulling the compiled JavaScript files from the first stage into the second stage.
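
If you prefer, Docker also lets you name build stages instead of referring to them by index. A minimal sketch of the same idea with a named stage (the name builder is arbitrary):

# name the first stage
FROM node:10.15.2 AS builder
# ... rest of stage 1 as in Listing 3 ...

# then, in stage 2, copy the built files by name instead of by index
COPY --from=builder /usr/src/app/build ./build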

We can easily check to make sure we got the desired result. After creating the new image and instantiating a container, I shelled in to check the contents of the file system. You can see in Figure 5 that we have successfully removed the debris from our production image.


Figure 5: We have removed the debris of development from our Docker image.

We now have fewer files in our image, it’s smaller, and it has less surface area. Yay! Mission accomplished.

But what, specifically, does this mean?

The effect of the multi-stage build

What exactly is the effect of the new build pipeline on our production image?

I measured the results before and after. Our single-stage image produced by Listing 2 weighs in at 955MB. After converting to the multi-stage build in Listing 3, the image now comes to 902MB. That’s a reasonable reduction — we removed 53MB from our image!

While 53MB seems like a lot, we have actually only shaved off just more than 5 percent of the size. I know what you’re going to say now: But Ash, our image is still monstrously huge! There’s still way too much bloat in that image.

Well, to make our image even smaller, we now need to use the alpine, or slimmed-down, Node.js base image. We can do this by changing our second build stage from node:10.15.2 to node:10.15.2-alpine.
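
Concretely, only the FROM line of the second stage changes; stage 1 from Listing 3 stays exactly as it was. A sketch of the updated second stage:

#
# Build stage 2, now based on the slimmer Alpine variant of the Node.js image.
#
FROM node:10.15.2-alpine

WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY --from=0 /usr/src/app/build ./build
EXPOSE 80
CMD npm start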

This reduces our production image down to 73MB — that’s a huge win! Now the savings we get from discarding our debris is more like a whopping 60 percent. Alright, we are really getting somewhere now!

This highlights another benefit of multi-stage builds: we can use separate Docker base images for each of our build stages. This means you can customize each build stage by using a different base image.

Say you have one stage that relies on some tools that are in a different image, or you have created a special Docker image that is custom for your build process. This gives us a lot of flexibility when constructing our build pipelines.

How does it work?

You probably already guessed this: each stage or build process produces its own separate Docker image. You can see how this works in Figure 6.

The Docker image produced by a stage can be used by the following stages. Once the final image is produced, all the intermediate images are discarded; we take what we want for the final image, and the rest gets thrown away.


Figure 6: Each stage of a multi-stage Docker build produces an image.

Adding more stages

There’s no need to stop at two stages, although that’s often all that’s needed; we can add as many stages as we need. A specific example is illustrated in Figure 7.

Here we are building TypeScript code in stage 1 and our React client in stage 2. In addition, there’s a third stage that produces the final image from the results of the first two stages.


Figure 7: Using a Docker multi-stage build, we can create more complex build pipelines.

Pro tips

Now time to leave you with a few advanced tips to explore on your own:

  1. You can name your build stages! You don’t have to leave them as the default 0, 1, etc. Naming your build stages will make your Dockerfile more readable.
  2. Understand the options you have for base images. Using the right base image can relieve a lot of confusion when constructing your build pipeline.
  3. Build a custom base image if the complexity of your build process is getting out of hand.
  4. You can pull from external images! Just like you pull files from earlier stages, you can also pull files from images that are published to a Docker repository. This gives you an option to pre-bake an early build stage if it's expensive and doesn't change very often (see the sketch below).
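
To illustrate the last tip, here is a minimal sketch of copying a file straight out of a published image rather than from an earlier stage; the nginx image and file path are only examples:

COPY --from=nginx:1.17 /etc/nginx/nginx.conf /tmp/nginx.conf
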
Conclusion and resources

Docker multi-stage builds enable us to create more complex build pipelines without having to resort to magic tricks. They help us slim down our production Docker images and remove the bloat. They also allow us to structure and modularize our build process, which makes it easier to test parts of our build process in isolation.

So please have some fun with Docker multi-stage builds, and don’t forget to have a look at the example code on GitHub.

Here’s the Docker documentation on multi-stage builds, too.