Exploring Docker: A Hands-On Guide for Absolute Beginners

In this guide we will start to explore Docker by talking about the benefits of Docker, looking at commands for working with containers and images, the Dockerfile, and more. We will take a simple Node/Express/Mongo app and dockerize it by adding a Dockerfile and using docker-compose to bundle services together. We will also run our containers on a Digital Ocean droplet.

Exploring Docker [1] - Getting Started

In this video we will start to explore Docker by talking about the benefits of Docker, looking at commands for working with containers and images, the Dockerfile, and more.

Exploring Docker [2] - Docker Compose With Node & MongoDB

In this video we will take a simple Node/Express/Mongo app and dockerize it by adding a Dockerfile and using docker-compose to bundle services together. We will also run our containers on a Digital Ocean droplet.

Docker Best Practices for Node Developers

Welcome to the best course on the planet for using Docker with Node.js! With your basic knowledge of Docker and Node.js in hand, Docker Mastery for Node.js is a course for anyone on the Node.js path. This course will help you master them together.

This is my talk on all the best of Docker for Node.js developers and DevOps teams dealing with Node apps, from DockerCon 2019. Get the full 9-hour training course with my coupon at http://bit.ly/365ogba

Get the source code for this talk at https://github.com/BretFisher/dockercon19

Some of the many cool things you'll do in this course
  • Build Node.js Images that auto-scan for security vulnerabilities
  • Use Docker's cutting-edge BuildKit with SSH Agents and NPM Caches for better image building
  • Use docker-compose with Visual Studio Code for full Node.js debug support
  • Use BuildKit and Multi-stage Builds to create minimal and flexible Dockerfiles
  • Build custom Node.js images using distros like CentOS and Alpine
  • Test Docker init, tini, and Node.js as a PID 1 process in containers
  • Create Node.js apps that properly start up and respond to healthchecks (see the sketch after this list)
  • Develop ARM-based Node.js apps with Docker Desktop, and deploy to AWS A1 servers
  • Build graceful shutdown code into your apps for zero-downtime deploys
  • Dig into HTTP connections with orchestration, and how Proxies can help
  • Study examples of Docker Swarm and Kubernetes deployments for Node.js
  • Spend time Migrating traditional (legacy) Node.js apps into containers
  • Simplify your microservice solutions with advanced Docker Compose features
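
The healthcheck item above, for example, boils down to something like this hedged sketch: a HEALTHCHECK instruction in the Dockerfile that asks Node itself to probe the app. The port (8080) and path (/) are my assumptions, not the course's exact code.

# ask Node to hit the app and report healthy only on a 200 response
HEALTHCHECK --interval=30s --timeout=3s \
  CMD node -e "require('http').get('http://localhost:8080/', res => process.exit(res.statusCode === 200 ? 0 : 1)).on('error', () => process.exit(1))"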
What you will learn in this course

You'll start with a quick review of getting set up with Docker, as well as Docker Compose basics, so that we're all on the same page.

Then you'll jump into Node.js Dockerfile basics, so you have a solid Dockerfile foundation for the new features we'll add throughout the course.

You'll build on everything you learn in each lecture of the course. Once you have the basics of Compose, the Dockerfile, and Docker images down, you'll focus on nuances like how Docker and Linux control the Node process and how Docker changes that, so you know what options there are for starting up and shutting down Node.js and the right way to do it in different scenarios.
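
To make the shutdown side of that concrete, here is a minimal sketch of my own (not the course's code) of an Express server that handles SIGTERM, the signal Docker sends on docker stop:

// graceful shutdown sketch, not the course's code
const express = require("express");

const app = express();
const PORT = 8080;

app.get("/", (req, res) => res.send("Hello\n"));

const server = app.listen(PORT, () => console.log(`Listening on ${PORT}`));

// On "docker stop", stop accepting new connections, let in-flight requests
// finish, then exit so the orchestrator can replace the container.
process.on("SIGTERM", () => {
  console.log("SIGTERM received, shutting down gracefully");
  server.close(() => process.exit(0));
});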

We'll cover advanced, newer features for making the Dockerfile as efficient and flexible as possible using things like BuildKit and multi-stage builds.
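
As a taste of what that looks like, here is a hedged multi-stage sketch, not the course's exact Dockerfile, assuming an app laid out like the one dockerized later in this article (a server.js entry point and a src folder):

# build stage: all dependencies, useful for running tests or build steps
FROM node:8 AS build
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .

# final stage: a slimmer image with only production dependencies
FROM node:8-slim
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --production
COPY --from=build /usr/src/app/server.js ./
COPY --from=build /usr/src/app/src ./src
EXPOSE 8080
CMD [ "npm", "start" ]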

Then we'll talk about distributed computing and cloud design to ensure your Node.js apps follow 12-factor design in your containers, and you'll learn how to migrate old apps into this new way of doing things.

Next we cover Compose and its awesome features to get really efficient local development and test set-up using the Docker Compose command line and Docker Compose YAML file.
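
For a flavour of that, a development-oriented compose file might look roughly like the sketch below; the service name, paths, and dev script are assumptions that match the app built later in this article.

version: "2"
services:
  web:
    build: .
    command: npm run dev            # nodemon restarts on code changes
    volumes:
      - .:/usr/src/app              # bind-mount the source for live editing
      - /usr/src/app/node_modules   # keep the container's own node_modules
    ports:
      - "8080:8080"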

With all this knowledge, you'll progress to production concerns and making images production-ready.

Then we'll jump into deploying those containers and running them in production. Whether you use Docker Engine or orchestration with Kubernetes or Swarm, I've got you covered. In addition, we'll cover HTTP connections and reverse proxies for connection handling and routing with multi-container systems.

Lastly, you'll get a final, big assignment where you'll be building and deploying a large, complex solution, including multiple Node.js containers that are doing different things. You'll build Docker images, Dockerfiles, and compose files, and deploy them to a server to test. You'll need to check whether connections failover properly. You'll basically take everything you've learned and apply it in one big project!

Dockerize a Nodejs app connected to MongoDb

I will run through some basic Docker terminology and concepts, then use a Node.js and MongoDB app that I previously built and demonstrate how to run it in a Docker container.

Please find the first part of the series here:

Dockerize a Node.js app with VS Code

Problem:

You already know how to use Docker together with Node from the previous article in this series. I know that we all love the MERN/MEAN stacks. Our next step is to understand how Node and Mongo connect to each other while running inside containers. Let's go!

1. Install MongoDb locally

Time to get into some document db stuff. First of all, download the MongoDB server from here.

If you haven’t change anything during install, it should also install a thing called MongoDb Compass Community.

This is a great tool for inspecting, changing, adding, or removing data in MongoDB collections. You can connect to the local instance using the default address and port, or connect to any other server.

To connect locally, just press Connect. Inside you can see some default collections and play around. We will need MongoDB Compass a bit later.

2. Connect to MongoDb through Express app

In this tutorial I will be using my favorite editor, Visual Studio Code. You will also need Node.js and Docker installed. In my case I'm using Windows, so I got Docker for Windows from here.

Now run the following command:

mkdir test-mongo-app && cd test-mongo-app && npm init -y && code .

Time to install dependencies. We will need the express and mongoose packages.

npm i express mongoose

Create a file called server.js inside the root folder.

Also don’t forget to change your package.jsonto run server.js file at start.

{
  "name": "test-mongo-app",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "express": "4.17.1",
    "mongoose": "5.6.1"
  }
}

Good. Let’s create a basic express app with two routes. One for reading Usersfrom the database and second is for adding dummy user data to it.

First of all check if everything works with just express server.

// server.js
const express = require("express");
const app = express();
const PORT = 8080;

app.get("/", (req, res) => {
  res.send("Hello from Node.js app \n");
});

app.listen(PORT, function() {
  console.log(`Listening on ${PORT}`);
});

You can run npm start to test it. If you see the message "Listening on 8080", everything is OK. Also open http://localhost:8080 and check that you can see the hello message.

There is a nice tool called nodemon. It automatically restarts our project whenever anything changes in the source code. Let's use it! 😀

npm install --save-dev nodemon

Add a new command in package.json so we can use it for development.

 "scripts": {
 "start": "node server.js",
 "dev": "nodemon server.js"
 },

Now use npm run dev during development instead of npm start.

npm run dev

You will notice a difference in the console, because nodemon is now watching for any changes in your project and restarts it when needed. Change something in server.js and you will see 😉

Now create a folder called src in the root of the project. Here we will add all the remaining files.

Let’s create a User model for mongoose. Create file names User.model.js

// User.model.js
const mongoose = require("mongoose");

const userSchema = new mongoose.Schema({
  username: {
    type: String
  }
});

const User = mongoose.model("User", userSchema);

module.exports = User;

Good! Here we defined a model for our document database. The User model has only one field, username, which is a string. Enough for now :)

Let’s add a file called connection.js for connection to the database.


// connection.js
const mongoose = require("mongoose");
const User = require("./User.model");

const connection = "mongodb://localhost:27017/mongo-test";

const connectDb = () => {
  return mongoose.connect(connection);
};

module.exports = connectDb;

Please notice that mongo-test will be the name of our database (cluster)!

Now modify server.js a bit and start the app. You should see a message in the console that MongoDB is connected.


// server.js
const express = require("express");
const app = express();
const connectDb = require("./src/connection");
const PORT = 8080;

app.get("/users", (req, res) => {
  res.send("Get users \n");
});

app.get("/user-create", (req, res) => {
  res.send("User created \n");
});

app.listen(PORT, function() {
  console.log(`Listening on ${PORT}`);
  connectDb().then(() => {
    console.log("MongoDb connected");
  });
});

Yeah! 🎉 We connected Express app with local MongoDb instance!

3. Implement read and write to MongoDb

We should implement two routes for reading and adding new users.

Open the server.js file and, first of all, import our model at the top:


// server.js
const User = require("./src/User.model");
// …

Then implement both routes below like this:

// server.js
app.get("/users", async (req, res) => {
  const users = await User.find();
  res.json(users);
});

app.get("/user-create", async (req, res) => {
  const user = new User({ username: "userTest" });
  await user.save();
  console.log("User created");
  res.send("User created \n");
});
// …

Be attentive: here we are using the async/await pattern.

Basically, we implemented two routes: /users and /user-create. ✋ Yeah, yeah, I know that create should be done through the POST HTTP verb, but this makes testing easier and lets us skip configuring a seed method for the db.
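
If you later want the proper POST version, a hedged sketch could look like this; it assumes the same server.js and User model as above (express.json() is built into Express 4.16+):

// server.js — optional POST variant, not used in the rest of this tutorial
app.use(express.json());

app.post("/users", async (req, res) => {
  // read the username from the JSON request body instead of hard-coding it
  const user = new User({ username: req.body.username });
  await user.save();
  res.status(201).json(user);
});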

Now it's time to test! 🔍 Open http://localhost:8080/user-create in the browser to create a dummy user record in the db. Then open http://localhost:8080/users to get all users as JSON.
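
If you prefer the terminal, the same checks can be done with curl:

curl http://localhost:8080/user-create
curl http://localhost:8080/users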

After doing that, you can go back to MongoDB Compass and check the users collection. You should see the newly created user there.

4. Dockerize Node and MongoDb

Add a Dockerfile to the root folder.

touch Dockerfile

Paste the following inside it:

FROM node:8

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
COPY package*.json ./
RUN npm install

# Copy app source code
COPY . .

# Expose port and start application
EXPOSE 8080
CMD [ "npm", "start" ]

We can simply build our Express app image with this command:

docker build -t mongo-app .
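
If you want to try the image on its own, a command like this should start it; keep in mind that routes touching the database won't respond properly until MongoDB is reachable:

docker run -p 8080:8080 mongo-app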

But this only gives us our Express app, not MongoDB alongside it. That's why we need a docker-compose file. 🐳

Now create another file called docker-compose.yml and paste this:


version: "2"
services:
web:
build: .
ports:
— "8080:8080"
depends_on:
— mongo
mongo:
image: mongo
ports:
— "27017:27017"

We defined two services in this file: one is our Node app running on port 8080, and the other is a MongoDB instance.

⚠️ Before you run the next command, please make sure that you change the connection string in the connection.js file:

const connection = "mongodb://mongo:27017/mongo-test";

We replaced localhost with mongo, which is very important: we are telling the app to reach MongoDB through Docker's internal virtual network, using the service name from docker-compose.yml, instead of through the local one.
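
A slightly more flexible option, as a sketch of my own rather than part of the original tutorial, is to read the connection string from an environment variable (MONGO_URL is an assumed name) so the same code works locally and in Docker; you would then set MONGO_URL under the web service's environment section in docker-compose.yml.

// connection.js — sketch using an environment variable; MONGO_URL is an
// assumed name, not something the original tutorial defines
const mongoose = require("mongoose");

const connection =
  process.env.MONGO_URL || "mongodb://localhost:27017/mongo-test";

const connectDb = () => {
  return mongoose.connect(connection);
};

module.exports = connectDb;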

Now run the magic command 🔮

docker-compose up

Open a browser on http://localhost:8080/users and http://localhost:8080/user-create to see our app running in Docker.

(In case anything doesn't work, try stopping and removing the containers and image, rebuilding by running docker-compose up again, and if the mongo image is not pulled from the hub, try logging in to Docker Hub again or restarting Docker for Windows.)
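
In practice, these standard docker-compose commands cover most of that cleanup:

docker-compose down          # stop and remove the containers
docker-compose up --build    # rebuild the images and start again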

Thanks for reading!

🚀 If you found something interesting in this article, please like and follow me for more posts. Thank you, dear coders, and share it with all of your programming buddies!

Further reading

☞ Build a Basic App with Spring Boot and JPA using PostgreSQL

☞ Build a Simple CRUD App with Spring Boot and Vue.js

☞ Introducing TensorFlow.js: Machine Learning in Javascript

☞ An illustrated guide to Kubernetes Networking

☞ Google Kubernetes Engine By Example

☞ AWS DevOps: Introduction to DevOps on AWS

☞ Docker Tutorial for Beginners

☞ Kotlin Microservices With Micronaut, Spring Cloud, and JPA



Originally published on https://itnext.io

Scaling Node.js Applications with Kubernetes and Docker

We will explore the benefits of a DevOps process using Kubernetes, Docker, and Node.js. Learn the basics of Kubernetes and tips for scaling Node.js applications, along with the common problems we face when we decide to move from monoliths to microservices using Docker and JavaScript.

We will explore the benefits of a DevOps process using Kubernetes, Docker, and Node.js, showing how Docker and Node.js can work together and using the power of Kubernetes to release and automatically scale stateless services. In this talk we will explore the key concepts and components needed to start working with Kubernetes, real scenarios, and the differences between the traditional approach and container-based applications. Attendees will learn the basics of Kubernetes and tips for scaling Node.js applications; furthermore, they will learn the common problems we face when we decide to move from monoliths to microservices using Docker and JavaScript.
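
As a rough illustration of the stateless-services-that-scale idea, here is a minimal sketch, not the talk's actual manifests (the image name is an assumption): a Kubernetes Deployment for a Node.js app could look like this.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: node-app
spec:
  replicas: 3                      # three identical, stateless pods
  selector:
    matchLabels:
      app: node-app
  template:
    metadata:
      labels:
        app: node-app
    spec:
      containers:
        - name: node-app
          image: my-node-app:1.0   # assumed image, already pushed to a registry
          ports:
            - containerPort: 8080

Scaling is then a one-liner such as kubectl scale deployment node-app --replicas=5.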

What are the key takeaways from this talk?

  • Service communication
  • Kubernetes and Docker
  • High availability & release process