A Complete Guide on Deploying a Node app to AWS with Docker

Once you've got a web application running locally on your machine, you need to deploy it to make it accessible on the internet. And instead of deploying it manually onto a virtual machine in the cloud, let's dockerize the app and then deploy it to the cloud.

Table of Contents

1. Introduction

2. Prerequisites

3. A quick primer on Docker and AWS

4. What we’ll be deploying

5. Creating a Dockerfile

6. Building a docker image

7. Running a docker container

8. Creating the Registry (ECR) and uploading the app image to it

9. Creating a new task definition

10. Creating a cluster

11. Creating a service to run it

12. Conclusion

1. Introduction

Writing code that does stuff is something most developers are familiar with. Sometimes, we need to take on the responsibility of a SysAdmin or DevOps engineer and deploy our codebase to production, where it will help a business solve problems for its customers.

In this tutorial, I’ll show you how to dockerize a Node.js application and deploy it to Amazon Web Services (AWS) using Amazon ECR (Elastic Container Registry) and ECS (Elastic Container Service).

2. Prerequisites

To follow along with this tutorial, you’ll need the following:

  1. Node and npm: Follow this link to install the latest versions.
  2. Basic knowledge of Node.js.
  3. Docker: The installation provides Docker Engine, the Docker CLI client, and other cool stuff. Follow the instructions for your operating system. To check that the installation worked, fire this on the terminal:

docker --version

The command above should display the version number. If it doesn’t, the installation didn’t complete properly.

4. AWS account: Sign up for the free tier. There is a waiting period while AWS verifies your phone number and bank card. After this, you will have access to the console.

5. AWS CLI: Follow the instructions for your OS. You need Python installed.

3. A quick primer on Docker and AWS

Docker is open-source software that allows you to pack an application together with its required dependencies and environment in a ‘container’ that you can ship and run anywhere. It is independent of platform or hardware, so the containerized application can run in any environment in an isolated fashion.

Docker containers solve many issues, such as when an app works on a co-worker’s computer but doesn’t run on yours, or it works in the local development environment but doesn’t work when you deploy it to a server.

Amazon Web Services (AWS) offers a reliable, scalable, and inexpensive cloud computing service for businesses. As I mentioned before, this tutorial will focus on using the ECR and ECS services.

4. What we’ll be deploying

Let’s quickly build a sample app that we’ll use for the purposes of this tutorial. It’s going to be a very simple Node.js app.

Enter the following in your terminal:

// create a new directory
mkdir sample-nodejs-app
// change to new directory
cd sample-nodejs-app
// Initialize npm
npm init -y
// install express
npm install express
// create a server.js file
touch server.js

Open server.js and paste the code below into it:

// server.js
const express = require('express')
const app = express()

app.get('/', (req, res) => {
  res.send('Hello world from a Node.js app!')
})

app.listen(3000, () => {
  console.log('Server is up on 3000')
})

Start the app with:

node server.js

Access it on http://localhost:3000. You should get Hello world from a Node.js app! displayed in your browser. The complete code is available on GitHub.

Now let’s take our very important app to production 😄.

5. Creating a Dockerfile

We are going to start dockerizing the app by creating a single file called a Dockerfile at the root of our project directory.

The Dockerfile is the blueprint from which our images are built. And then images turn into containers, in which we run our apps.

Every Dockerfile starts with a base image as its foundation. There are two ways to approach creating your Dockerfile:

  1. Use a plain OS base image (For example, Ubuntu OS, Debian, CentOS etc.) and install an application environment in it such as Node.js OR
  2. Use an environment-ready base image to get an OS image with an application environment already installed.

We will proceed with the second approach. We can use the official Node.js image hosted on Docker Hub, which is based on Alpine Linux.

Write this in the Dockerfile:

FROM node:8-alpine
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY . .
RUN npm install
EXPOSE 3000
CMD [ "node", "server.js" ]

Let’s walk through this line by line to see what is happening here, and why.

FROM node:8-alpine

Here, we are building our Docker image using the official Node.js image from Docker Hub (a repository for base images).

  • Start our Dockerfile with a [**FROM**](https://docs.docker.com/reference/builder/#from) statement. This is where you specify your base image.
  • The [**RUN**](https://docs.docker.com/reference/builder/#run) statement lets us execute any command we need. Here we created a subdirectory /usr/src/app that will hold our application code within the Docker image.
  • [**WORKDIR**](https://docs.docker.com/engine/reference/builder/#workdir) instruction establishes the subdirectory we created as the working directory for any RUN, CMD, ENTRYPOINT, COPY and ADD instructions that follow it in the Dockerfile. /usr/src/app is our working directory.
  • [**COPY**](https://docs.docker.com/engine/reference/builder/#copy) lets us copy files from a source to a destination. We copied the contents of our node application code ( server.js and package.json) from our current directory to the working directory in our docker image.
  • The [**EXPOSE**](https://docs.docker.com/engine/reference/builder/#expose) instruction informs Docker that the container listens on the specified network ports at runtime. We specified port 3000.
  • Last but not least, the [**CMD**](https://docs.docker.com/reference/builder/#cmd) statement specifies the command to start our application. This tells Docker how to run your application. Here we use node server.js, which is typically how files are run in Node.js.
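One optional refinement worth knowing about (not part of this tutorial's steps, so treat it as a sketch): copying package.json and running npm install before copying the rest of the source lets Docker cache the dependency layer, so rebuilds after code-only changes skip the install:

FROM node:8-alpine
WORKDIR /usr/src/app
# copy the manifest first so the npm install layer is cached
COPY package*.json ./
RUN npm install
# now copy the application code; code changes won't invalidate the install layer
COPY . .
EXPOSE 3000
CMD [ "node", "server.js" ]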

With this completed file, we are now ready to build a new Docker image.

6. Building a docker image

Make sure you have Docker up and running. Now that we have defined our Dockerfile, let’s build an image from it, naming it with -t:

docker build -t sample-nodejs-app .

This will output a series of hashes and alphanumeric strings that identify intermediate containers and images, ending with “Successfully built” on the last line:

Sending build context to Docker daemon  1.966MB
Step 1/7 : FROM node:8-alpine
 ---> 998971a692ca
Step 2/7 : RUN mkdir -p /usr/src/app
 ---> Using cache
 ---> f1aa1c112188
Step 3/7 : WORKDIR /usr/src/app
 ---> Using cache
 ---> b4421b83357b
Step 4/7 : COPY . .
 ---> 836112e1d526
Step 5/7 : RUN npm install
 ---> Running in 1c6b36b5381c
npm WARN sample-nodejs-app@1.0.0 No description
npm WARN sample-nodejs-app@1.0.0 No repository field.
Removing intermediate container 1c6b36b5381c
 ---> 93999e6c807f
Step 6/7 : EXPOSE 3000
 ---> Running in 7419020927f1
Removing intermediate container 7419020927f1
 ---> ed4ac8a31f83
Step 7/7 : CMD [ "node", "server.js" ]
 ---> Running in c77d34f4c873
Removing intermediate container c77d34f4c873
 ---> eaf97859f909
Successfully built eaf97859f909

// don't expect the same values from your terminal.

7. Running a Docker Container

We’ve built the Docker image. To see previously created images, run:

docker images

You should see the image we just created listed as the most recent.

Copy the image ID. To run the container, enter this on the terminal:

docker run -p 80:3000 {image-id}

// fill with your image-id


By default, Docker containers can make connections to the outside world, but the outside world cannot connect to containers. -p publishes a container port to the host interface. Here we map host port 80 to container port 3000. Because we are running Docker locally, go to http://localhost to view the app.
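If you'd rather get your terminal back, a common variant is to run the container detached and tail its logs (the container name here is just an assumption for illustration):

docker run -d -p 80:3000 --name sample-nodejs-app-container {image-id}
docker logs -f sample-nodejs-app-container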

At any moment, you can check running Docker containers by typing:

<pre class="ql-syntax" spellcheck="false">docker container ls </pre>

Finally, you can stop the container using its container ID from the docker container ls output:

docker stop {container-id}

Leave the Docker daemon running.

8. Create Registry (ECR) and upload the app image to it

Amazon Elastic Container Registry (ECR) is a fully-managed Docker container registry that makes it easy for developers to store, manage, and deploy Docker container images. Amazon ECR is integrated with Amazon Elastic Container Service (ECS), simplifying your development to production workflow.

The keyword “Elastic” means you can scale the capacity or reduce it as desired.

Steps:

  1. Go to the AWS console and sign in.
  2. Select the EC2 container service and Get started

3. When the first-run page shows, scroll down, click Cancel, and enter the ECS dashboard.

4. To ensure your CLI can connect with your AWS account, run on the terminal:

aws configure

If your AWS CLI was properly installed, aws configure will ask for the following:

$ aws configure
AWS Access Key ID [None]: accesskey
AWS Secret Access Key [None]: secretkey
Default region name [None]: us-west-2
Default output format [None]:

Get the security credentials from your AWS account under your username > Access keys. Run aws configure again and fill in the values correctly.

5. Create a new repository and enter a name (preferably with the same container name as in your local dev environment for consistency).

For example, use sample-nodejs-app.
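If you prefer the CLI to the console, the equivalent (assuming the us-east-2 region used later in this tutorial) is:

aws ecr create-repository --repository-name sample-nodejs-app --region us-east-2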

Follow the 5 instructions from the AWS console for building, tagging, and pushing Docker images:

Note: The arguments of the following are mine and will differ from yours, so just follow the steps outlined on your console.

  1. Retrieve the Docker login command that you can use to authenticate your Docker client to your registry. (Note: If you receive an “Unknown options: --no-include-email” error, install the latest version of the AWS CLI. Learn more here.)

aws ecr get-login --no-include-email --region us-east-2

2. Run the docker login command that was returned in the previous step (just copy and paste). Note: If you are using Windows PowerShell, run the following command instead:

Invoke-Expression -Command (aws ecr get-login --no-include-email --region us-east-2)

It should output: Login Succeeded.

3. Build your Docker image using the following command. For information on building a Docker file from scratch, see the instructions here. You can skip this step since our image is already built:

docker build -t sample-nodejs-app .

4. With a completed build, tag your image (for example, with latest) so you can push the image to this repository:

docker tag sample-nodejs-app:latest 559908478199.dkr.ecr.us-east-2.amazonaws.com/sample-nodejs-app:latest

5. Run the following command to push this image to your newly created AWS repository:

docker push 559908478199.dkr.ecr.us-east-2.amazonaws.com/sample-nodejs-app:latest
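To confirm the push worked, you can list the images now in the repository (again assuming the us-east-2 region):

aws ecr list-images --repository-name sample-nodejs-app --region us-east-2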

9. Create a new task definition

Tasks function like the docker run command of the Docker CLI but for multiple containers. They define:

  • Container images (to use)
  • Volumes (if any)
  • Networks
  • Environment variables
  • Port mappings

From Task Definitions in the ECS dashboard, press the Create new Task Definition (ECS) button.

Set a task name and use the following steps:

  • Add Container: sample-nodejs-app (the one we pushed).
  • Image: the URL to your container. Mine is 559908478199.dkr.ecr.us-east-2.amazonaws.com/sample-nodejs-app
  • Soft limit: 512
  • Map 80 (host) to 3000 (container) for sample-nodejs-app
  • Env Variables:

NODE_ENV: production
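For reference, the same settings expressed as task definition JSON (a hand-written sketch of the relevant fields, not an export from the console) would look roughly like:

{
  "family": "demo-nodejs-app",
  "containerDefinitions": [
    {
      "name": "sample-nodejs-app",
      "image": "559908478199.dkr.ecr.us-east-2.amazonaws.com/sample-nodejs-app:latest",
      "memoryReservation": 512,
      "portMappings": [
        { "hostPort": 80, "containerPort": 3000, "protocol": "tcp" }
      ],
      "environment": [
        { "name": "NODE_ENV", "value": "production" }
      ]
    }
  ]
}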

10. Create a Cluster

A cluster is the place where AWS containers run. Clusters use configurations similar to EC2 instances. Define the following:

  • Cluster name: demo-nodejs-app-cluster
  • EC2 instance type: t2.micro

(Note: you select instances based on the size of your application. Here we’ve selected the smallest. Your selection affects how much you are billed at the end of the month. Visit here for more information.) Thanks to Nicholas Kolatsis for pointing out that the previous choice of m4.large was expensive for this tutorial.

  • Number of instances: 1
  • EBS storage: 22
  • Key pair: None
  • VPC: New

When the process is complete, you may choose to click on “View cluster.”
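For the CLI-inclined: the bare cluster resource can also be created with the command below, though note that the console wizard above additionally provisions the EC2 instance, storage, and VPC, which this single command does not:

aws ecs create-cluster --cluster-name demo-nodejs-app-cluster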

11. Create a service to run it

Go to Task Definition > click demo-nodejs-app > click on the latest revision.

Inside the task definition, click on the Actions dropdown and select Create service.

Use the following:

  • Launch type: EC2
  • Service name: demo-nodejs-app-service
  • Number of tasks: 1

Skip through the remaining options and click Create service, then View service.
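The CLI equivalent (assuming your task definition family is demo-nodejs-app at revision 1) would be:

aws ecs create-service --cluster demo-nodejs-app-cluster --service-name demo-nodejs-app-service --task-definition demo-nodejs-app:1 --desired-count 1 --launch-type EC2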

You’ll see its status as PENDING. Give it a little time and it will indicate RUNNING.

Go to Cluster (through a link from the service we just created) > EC2 instances > Click on the container instance to reveal the public DNS.

Visit the public DNS to view our app! Mine is http://ec2-18-219-113-111.us-east-2.compute.amazonaws.com/

12. Conclusion

Congrats on finishing this post! Grab the code for the Docker part from GitHub.

Thanks for reading

If you liked this post, share it with all of your programming buddies!

Follow us on Facebook | Twitter

Further reading

The Complete Node.js Developer Course (3rd Edition)

Angular & NodeJS - The MEAN Stack Guide

NodeJS - The Complete Guide (incl. MVC, REST APIs, GraphQL)

MongoDB - The Complete Developer’s Guide

The Complete Developers Guide to MongoDB

Creating RESTful APIs with NodeJS and MongoDB Tutorial

MEAN Stack Tutorial MongoDB, ExpressJS, AngularJS and NodeJS

How To Build a Node.js Application with Docker

Authenticate a Node ES6 API with JSON Web Tokens

Creating a RESTful Web API with Node.js and Express.js from scratch

How to set up a Node API with TypeScript

Note: You should have Node.js installed on your machine.

The first thing is to create our project folder and initialize it with npm to generate the package.json file.
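For example (the folder name node-api-ts is just a placeholder; use your own):

mkdir node-api-ts
cd node-api-ts
npm init -y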

Install dependencies

npm i express --save
npm i @types/node @types/express ts-node typescript nodemon --save-dev

Create a tsconfig.json file in the root of your application, or run npx tsc --init in your terminal, and add the configuration below.

{
  "compilerOptions": {
    "target": "es6",
    "module": "commonjs",
    "allowJs": true,
    "outDir": "./build",
    "rootDir": "./src",
    "esModuleInterop": true
  }
}

Note: More options can be added to the tsconfig.json file.
Find out more here.

Add these scripts to the package.json file.

"scripts": 
  {
    "dev": "nodemon src/app.ts",
    	"start": "tsc && node build/app"
    }

Create a src directory where our application will be built. Inside the src directory, create an app.ts file.

Inside the app.ts file, add the code below.

import express, { Application, Request, Response, NextFunction } from "express";

const app: Application = express();

app.use(express.json());

app.get("/", (req: Request, res: Response): object => {
    return res.json({ status: "success", message: "Welcome to API Service" });
  }
);

app.use((req: Request, res: Response, next: NextFunction) => {
  const error = new Error("Route Not found");
  next(error);
});

app.use((error: { message: string; status: number }, req: Request, res: Response, next: NextFunction) => {
  res.status(error.status || 500);
  res.json({
    status: "error",
    message: error.message
  });
  next();
});

const PORT: any = process.env.PORT || 3000;

app.listen(PORT, () => console.log(`app listening on port ${PORT}`));

At this point, your project structure should look like this:

├── node_modules
├── src
│   └── app.ts
├── package.json
└── tsconfig.json

Development

To run the application in the development environment, run the command below:

npm run dev

Note: The above command compiles the files found in the src directory in memory.
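Once the dev server is up, you can check the root route; the response below comes straight from the handler in app.ts:

curl http://localhost:3000/
# {"status":"success","message":"Welcome to API Service"}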

Production

To run the application in the production environment, run the command below:

npm start

Note: The above command compiles the files found in the src directory to a build directory and runs the app.js file in the build directory, as specified above in the start script in our package.json file.

The project used in this article can be found here.

Thanks for reading.

Angular 7 CRUD with Nodejs and MySQL Example

Below are the requirements for creating the CRUD app on the MEAN stack:

  • Node.js
  • Angular CLI
  • Angular 7
  • MySQL
  • IDE or Text Editor

We assume that you already have the above tools/frameworks available and that you are familiar with what each of them does.

So now we will proceed step by step to achieve the task.

1. Update Angular CLI and Create Angular 7 Application

First, we have to update the Angular CLI to the latest version. Open the terminal, go to the project folder, and type the below command to update the Angular CLI:

sudo npm install -g @angular/cli

Once the above task finishes, the next task is to create a new Angular application. So go to your project folder and type the below command:

ng new angular7-crud

Then go to the newly created folder of the Angular application with cd angular7-crud and type ng serve. Now, open the browser and go to http://localhost:4200; you should see the default start page.

2. Create a server with node.js express and Mysql for REST APIs

Create a separate folder named server for the server-side stuff, then move inside the folder and create server.js by typing touch server.js.

Let’s have a look at the server.js file:

let app = require('express')(),
    server = require('http').Server(app),
    bodyParser = require('body-parser'),
    express = require('express'),
    cors = require('cors'),
    http = require('http'),
    path = require('path');

let articleRoute = require('./Routes/article'),
    util = require('./Utilities/util');

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));

app.use(cors());

app.use(function (err, req, res, next) {
    return res.send({
        "statusCode": util.statusCode.ONE,
        "statusMessage": util.statusMessage.SOMETHING_WENT_WRONG
    });
});

app.use('/article', articleRoute);

// catch 404 and forward to error handler
app.use(function (req, res, next) {
    next();
});

/* first API to check if server is running */
app.get('*', (req, res) => {
    res.sendFile(path.join(__dirname, '../server/client/dist/index.html'));
});

server.listen(3000, function () {
    console.log('app listening on port: 3000');
});

In the above file we can see that, at the top, the required packages for the app are loaded. Below that, body parsing, middleware, and routing are set up.

The next task is to create the routes in a file called article.js. So create a folder named Routes and add article.js within it.

Add the below code for routing in article.js inside the Routes folder:

let express = require('express'),
    router = express.Router(),
    util = require('../Utilities/util'),
    articleService = require('../Services/article');

/** API to create article */
router.post('/create-article', (req, res) => {
    articleService.createArticle(req.body, (data) => {
        res.send(data);
    });
});

/** API to update article */
router.put('/update-article', (req, res) => {
    articleService.updateArticle(req.body, (data) => {
        res.send(data);
    });
});

/** API to delete the article */
router.delete('/delete-article', (req, res) => {
    articleService.deleteArticle(req.query, (data) => {
        res.send(data);
    });
});

/** API to get the list of articles */
router.get('/get-article', (req, res) => {
    articleService.getArticle(req.query, (data) => {
        res.send(data);
    });
});

/** API to get the article by id */
router.get('/get-article-by-id', (req, res) => {
    articleService.getArticleById(req.query, (data) => {
        res.send(data);
    });
});

module.exports = router;


Now create a folder named Utilities for all the config, common methods, and the MySQL connection config.

Now I am adding the config values in a file named config.js:

let environment = "dev";

let serverURLs = {
    "dev": {
        "NODE_SERVER": "http://localhost",
        "NODE_SERVER_PORT": "3000",
        "MYSQL_HOST": 'localhost',
        "MYSQL_USER": 'root',
        "MYSQL_PASSWORD": 'password',
        'MYSQL_DATABASE': 'demo_angular7_crud',
    }
}

let config = {
    "DB_URL_MYSQL": {
        "host": `${serverURLs[environment].MYSQL_HOST}`,
        "user": `${serverURLs[environment].MYSQL_USER}`,
        "password": `${serverURLs[environment].MYSQL_PASSWORD}`,
        "database": `${serverURLs[environment].MYSQL_DATABASE}`
    },
    "NODE_SERVER_PORT": {
        "port": `${serverURLs[environment].NODE_SERVER_PORT}`
    },
    "NODE_SERVER_URL": {
        "url": `${serverURLs[environment].NODE_SERVER}`
    }
};

module.exports = {
    config: config
};

Now configure the MySQL connection. I am writing the database connection in a separate file, so create a file named mysqlConfig.js under the Utilities folder and add the below lines of code:

 

var config = require("../Utilities/config").config;
var mysql = require('mysql');

var connection = mysql.createConnection({
    host: config.DB_URL_MYSQL.host,
    user: config.DB_URL_MYSQL.user,
    password: config.DB_URL_MYSQL.password,
    database: config.DB_URL_MYSQL.database,
});

connection.connect(() => {
    require('../Models/Article').initialize();
});

let getDB = () => {
    return connection;
}

module.exports = {
    getDB: getDB
}

Next, I am creating a separate file named util.js to hold common methods and common status codes/messages:

// Define Error Codes
let statusCode = {
    OK: 200,
    FOUR_ZERO_FOUR: 404,
    FOUR_ZERO_THREE: 403,
    FOUR_ZERO_ONE: 401,
    FIVE_ZERO_ZERO: 500,
    // referenced by server.js and the service layer
    ONE: 1,
    INTERNAL_SERVER_ERROR: 500
};

// Define Error Messages
let statusMessage = {
    SERVER_BUSY: 'Our Servers are busy. Please try again later.',
    DATA_UPDATED: 'Data updated successfully.',
    DELETE_DATA: 'Delete data successfully',
    // referenced by server.js and the service layer
    SOMETHING_WENT_WRONG: 'Something went wrong. Please try again.',
    PARAMS_MISSING: 'Required parameters are missing.'
};

module.exports = {
    statusCode: statusCode,
    statusMessage: statusMessage
}

The next part is the model. So create a folder named Models, create a file Article.js, and add the below code in it:

let mysqlConfig = require("../Utilities/mysqlConfig");

let initialize = () => {
    mysqlConfig.getDB().query("create table IF NOT EXISTS article (id INT auto_increment primary key, category VARCHAR(30), title VARCHAR(24))");
}

module.exports = {
    initialize: initialize
}

Now create a DAO folder and add a file articleDAO.js for the common MySQL query functions:

let dbConfig = require("../Utilities/mysqlConfig");

let getArticle = (criteria, callback) => {
    dbConfig.getDB().query(`select * from article where 1`, criteria, callback);
}

let getArticleDetail = (criteria, callback) => {
    let conditions = "";
    criteria.id ? conditions += ` and id = '${criteria.id}'` : true;
    dbConfig.getDB().query(`select * from article where 1 ${conditions}`, callback);
}

let createArticle = (dataToSet, callback) => {
    dbConfig.getDB().query("insert into article set ? ", dataToSet, callback);
}

let deleteArticle = (criteria, callback) => {
    let conditions = "";
    criteria.id ? conditions += ` and id = '${criteria.id}'` : true;
    dbConfig.getDB().query(`delete from article where 1 ${conditions}`, callback);
}

let updateArticle = (criteria, dataToSet, callback) => {
    let conditions = "";
    let setData = "";
    criteria.id ? conditions += ` and id = '${criteria.id}'` : true;
    dataToSet.category ? setData += `category = '${dataToSet.category}'` : true;
    dataToSet.title ? setData += `, title = '${dataToSet.title}'` : true;
    dbConfig.getDB().query(`UPDATE article SET ${setData} where 1 ${conditions}`, callback);
}

module.exports = {
    getArticle: getArticle,
    createArticle: createArticle,
    deleteArticle: deleteArticle,
    updateArticle: updateArticle,
    getArticleDetail: getArticleDetail
}

Now create a Services folder and add a file article.js for all the API logic:

let async = require('async'),
    parseString = require('xml2js').parseString;

let util = require('../Utilities/util'),
    articleDAO = require('../DAO/articleDAO');

/** API to create the article */
let createArticle = (data, callback) => {
    async.auto({
        article: (cb) => {
            var dataToSet = {
                "category": data.category ? data.category : '',
                "title": data.title,
            }
            articleDAO.createArticle(dataToSet, (err, dbData) => {
                if (err) {
                    cb(null, { "statusCode": util.statusCode.FOUR_ZERO_ONE, "statusMessage": util.statusMessage.SERVER_BUSY });
                    return;
                }
                cb(null, { "statusCode": util.statusCode.OK, "statusMessage": util.statusMessage.DATA_UPDATED, "result": dataToSet });
            });
        }
    }, (err, response) => {
        callback(response.article);
    });
}

/** API to update the article */
let updateArticle = (data, callback) => {
    async.auto({
        articleUpdate: (cb) => {
            if (!data.id) {
                cb(null, { "statusCode": util.statusCode.FOUR_ZERO_ONE, "statusMessage": util.statusMessage.PARAMS_MISSING })
                return;
            }
            var criteria = {
                id: data.id,
            }
            var dataToSet = {
                "category": data.category,
                "title": data.title,
            }
            articleDAO.updateArticle(criteria, dataToSet, (err, dbData) => {
                if (err) {
                    cb(null, { "statusCode": util.statusCode.FOUR_ZERO_ONE, "statusMessage": util.statusMessage.SERVER_BUSY });
                    return;
                } else {
                    cb(null, { "statusCode": util.statusCode.OK, "statusMessage": util.statusMessage.DATA_UPDATED, "result": dataToSet });
                }
            });
        }
    }, (err, response) => {
        callback(response.articleUpdate);
    });
}

/** API to delete the article */
let deleteArticle = (data, callback) => {
    async.auto({
        removeArticle: (cb) => {
            if (!data.id) {
                cb(null, { "statusCode": util.statusCode.FOUR_ZERO_ONE, "statusMessage": util.statusMessage.PARAMS_MISSING })
                return;
            }
            var criteria = {
                id: data.id,
            }
            articleDAO.deleteArticle(criteria, (err, dbData) => {
                if (err) {
                    cb(null, { "statusCode": util.statusCode.FOUR_ZERO_ONE, "statusMessage": util.statusMessage.SERVER_BUSY });
                    return;
                }
                cb(null, { "statusCode": util.statusCode.OK, "statusMessage": util.statusMessage.DELETE_DATA });
            });
        }
    }, (err, response) => {
        callback(response.removeArticle);
    });
}

/** API to get the article list */
let getArticle = (data, callback) => {
    async.auto({
        article: (cb) => {
            articleDAO.getArticle({}, (err, data) => {
                if (err) {
                    cb(null, { "errorCode": util.statusCode.INTERNAL_SERVER_ERROR, "statusMessage": util.statusMessage.SERVER_BUSY });
                    return;
                }
                cb(null, data);
                return;
            });
        }
    }, (err, response) => {
        callback(response.article);
    })
}

/** API to get the article detail by id */
let getArticleById = (data, callback) => {
    async.auto({
        article: (cb) => {
            let criteria = {
                "id": data.id
            }
            articleDAO.getArticleDetail(criteria, (err, data) => {
                if (err) {
                    cb(null, { "errorCode": util.statusCode.INTERNAL_SERVER_ERROR, "statusMessage": util.statusMessage.SERVER_BUSY });
                    return;
                }
                cb(null, data[0]);
                return;
            });
        }
    }, (err, response) => {
        callback(response.article);
    })
}

module.exports = {
    createArticle: createArticle,
    updateArticle: updateArticle,
    deleteArticle: deleteArticle,
    getArticle: getArticle,
    getArticleById: getArticleById
};

3. Create angular component for performing CRUD task of article

ng g component article

The above command will generate all the required files for the article component and also automatically add this component to app.module.ts:

create src/app/article/article.component.css (0 bytes)
create src/app/article/article.component.html (23 bytes)
create src/app/article/article.component.spec.ts (614 bytes)
create src/app/article/article.component.ts (321 bytes)
update src/app/app.module.ts (390 bytes)

Now we need to add HttpModule to app.module.ts. Open and edit src/app/app.module.ts, add this import, and then add it to the @NgModule imports after BrowserModule. Now our app.module.ts will have the following code:

import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { ReactiveFormsModule } from '@angular/forms';
import { HttpModule } from '@angular/http';

import { AppComponent } from './app.component';
import { ArticleComponent } from './article/article.component';
import { ArticleService } from './article/article.service';

@NgModule({
    imports: [
        BrowserModule,
        HttpModule,
        ReactiveFormsModule
    ],
    declarations: [
        AppComponent,
        ArticleComponent
    ],
    providers: [
        ArticleService
    ],
    bootstrap: [
        AppComponent
    ]
})
export class AppModule { }

Now create a service file where we will make all the requests to the server for the CRUD operations. The command for creating a service is ng g service article; for now, I have just created a file named article.service.ts. Let's have a look at the code inside this file:

import { Injectable } from '@angular/core';
import { Http, Response, Headers, URLSearchParams, RequestOptions } from '@angular/http';
import { Observable } from 'rxjs';
import 'rxjs/add/operator/map';
import 'rxjs/add/operator/catch';

import { Article } from './article';

@Injectable()
export class ArticleService {
    // URL for CRUD operations
    articleUrl = "http://localhost:3000/article";

    // Create constructor to get Http instance
    constructor(private http: Http) {
    }

    // Fetch all articles
    getAllArticles(): Observable<Article[]> {
        return this.http.get(this.articleUrl + "/get-article")
            .map(this.extractData)
            .catch(this.handleError);
    }

    // Create article
    createArticle(article: Article): Observable<number> {
        let cpHeaders = new Headers({ 'Content-Type': 'application/json' });
        let options = new RequestOptions({ headers: cpHeaders });
        return this.http.post(this.articleUrl + "/create-article", article, options)
            .map(success => success.status)
            .catch(this.handleError);
    }

    // Fetch article by id
    getArticleById(articleId: string): Observable<Article> {
        let cpHeaders = new Headers({ 'Content-Type': 'application/json' });
        let options = new RequestOptions({ headers: cpHeaders });
        return this.http.get(this.articleUrl + "/get-article-by-id?id=" + articleId)
            .map(this.extractData)
            .catch(this.handleError);
    }

    // Update article
    updateArticle(article: Article): Observable<number> {
        let cpHeaders = new Headers({ 'Content-Type': 'application/json' });
        let options = new RequestOptions({ headers: cpHeaders });
        return this.http.put(this.articleUrl + "/update-article", article, options)
            .map(success => success.status)
            .catch(this.handleError);
    }

    // Delete article
    deleteArticleById(articleId: string): Observable<number> {
        let cpHeaders = new Headers({ 'Content-Type': 'application/json' });
        let options = new RequestOptions({ headers: cpHeaders });
        return this.http.delete(this.articleUrl + "/delete-article?id=" + articleId)
            .map(success => success.status)
            .catch(this.handleError);
    }

    private extractData(res: Response) {
        let body = res.json();
        return body;
    }

    private handleError(error: Response | any) {
        console.error(error.message || error);
        return Observable.throw(error.status);
    }
}

In the above file we have made all the HTTP requests for the CRUD operations. Observables from the rxjs library have been used to handle the data fetching.

 

Now let's move to the next file, article.component.ts. Here we have all the logic of the app. Let's have a look at the code inside this file:

import { Component, OnInit } from '@angular/core';
import { FormControl, FormGroup, Validators } from '@angular/forms';

import { ArticleService } from './article.service';
import { Article } from './article';

@Component({
    selector: 'app-article',
    templateUrl: './article.component.html',
    styleUrls: ['./article.component.css']
})
export class ArticleComponent implements OnInit {
    // Component properties
    allArticles: Article[];
    statusCode: number;
    requestProcessing = false;
    articleIdToUpdate = null;
    processValidation = false;

    // Create form
    articleForm = new FormGroup({
        title: new FormControl('', Validators.required),
        category: new FormControl('', Validators.required)
    });

    // Create constructor to get service instance
    constructor(private articleService: ArticleService) {
    }

    // Create ngOnInit() and load articles
    ngOnInit(): void {
        this.getAllArticles();
    }

    // Fetch all articles
    getAllArticles() {
        this.articleService.getAllArticles()
            .subscribe(
                data => this.allArticles = data,
                errorCode => this.statusCode = errorCode);
    }

    // Handle create and update article
    onArticleFormSubmit() {
        this.processValidation = true;
        if (this.articleForm.invalid) {
            return; // Validation failed, exit from method.
        }
        // Form is valid, now perform create or update
        this.preProcessConfigurations();
        let article = this.articleForm.value;
        if (this.articleIdToUpdate === null) {
            // Generate article id then create article
            this.articleService.getAllArticles()
                .subscribe(articles => {
                    // Generate article id
                    let maxIndex = articles.length - 1;
                    let articleWithMaxIndex = articles[maxIndex];
                    let articleId = articleWithMaxIndex.id + 1;
                    article.id = articleId;
                    // Create article
                    this.articleService.createArticle(article)
                        .subscribe(successCode => {
                            this.statusCode = successCode;
                            this.getAllArticles();
                            this.backToCreateArticle();
                        },
                        errorCode => this.statusCode = errorCode);
                });
        } else {
            // Handle update article
            article.id = this.articleIdToUpdate;
            this.articleService.updateArticle(article)
                .subscribe(successCode => {
                    this.statusCode = successCode;
                    this.getAllArticles();
                    this.backToCreateArticle();
                },
                errorCode => this.statusCode = errorCode);
        }
    }

    // Load article by id to edit
    loadArticleToEdit(articleId: string) {
        this.preProcessConfigurations();
        this.articleService.getArticleById(articleId)
            .subscribe(article => {
                this.articleIdToUpdate = article.id;
                this.articleForm.setValue({ title: article.title, category: article.category });
                this.processValidation = true;
                this.requestProcessing = false;
            },
            errorCode => this.statusCode = errorCode);
    }

    // Delete article
    deleteArticle(articleId: string) {
        this.preProcessConfigurations();
        this.articleService.deleteArticleById(articleId)
            .subscribe(successCode => {
                // Expecting success code 204 from server
                this.statusCode = 204;
                this.getAllArticles();
                this.backToCreateArticle();
            },
            errorCode => this.statusCode = errorCode);
    }

    // Perform preliminary processing configurations
    preProcessConfigurations() {
        this.statusCode = null;
        this.requestProcessing = true;
    }

    // Go back from update to create
    backToCreateArticle() {
        this.articleIdToUpdate = null;
        this.articleForm.reset();
        this.processValidation = false;
    }
}

Now we have to show the result in the browser, so let's have a look inside the article.component.html file:

<h1 class="text-center">Angular 7 CRUD Demo App</h1>
<h3 class="text-center" *ngIf="articleIdToUpdate; else create">
Update Article for Id: {{articleIdToUpdate}}
</h3>
<ng-template #create>
<h3 class="text-center"> Create New Article </h3>
</ng-template>
<div>
<form [formGroup]="articleForm" (ngSubmit)="onArticleFormSubmit()">
<table class="table-striped" style="margin:0 auto;">
<tr><td>Enter Title</td><td><input formControlName="title">
   <label *ngIf="articleForm.get('title').invalid && processValidation" [ngClass] = "'error'"> Title is required. </label>
 </td></tr>
<tr><td>Enter Category</td><td><input formControlName="category">
   <label *ngIf="articleForm.get('category').invalid && processValidation" [ngClass] = "'error'">Category is required. </label>
  </td></tr>  
<tr><td colspan="2">
   <button class="btn btn-default" *ngIf="!articleIdToUpdate">CREATE</button>
    <button class="btn btn-default" *ngIf="articleIdToUpdate">UPDATE</button>
   <button (click)="backToCreateArticle()" *ngIf="articleIdToUpdate">Go Back</button>
  </td></tr>
</table>
</form>
<br/>
<div class="text-center" *ngIf="statusCode; else processing">
<div *ngIf="statusCode === 201" [ngClass] = "'success'">
   Article added successfully.
</div>
<div *ngIf="statusCode === 409" [ngClass] = "'success'">
Article already exists.
</div>   
<div *ngIf="statusCode === 200" [ngClass] = "'success'">
Article updated successfully.
</div>   
<div *ngIf="statusCode === 204" [ngClass] = "'success'">
Article deleted successfully.
</div>   
<div *ngIf="statusCode === 500" [ngClass] = "'error'">
Internal Server Error.
</div> 
</div>
<ng-template #processing>
  <img *ngIf="requestProcessing" src="assets/images/loading.gif">
</ng-template>
</div>
<h3 class="text-center">Article List</h3>
<table class="table-striped" style="margin:0 auto;" *ngIf="allArticles">
<tr><th> Id</th> <th>Title</th><th>Category</th><th></th><th></th></tr>
<tr *ngFor="let article of allArticles" >
<td>{{article.id}}</td> <td>{{article.title}}</td> <td>{{article.category}}</td>
  <td><button class="btn btn-default" type="button" (click)="loadArticleToEdit(article.id)">Edit</button> </td>
  <td><button class="btn btn-default" type="button" (click)="deleteArticle(article.id)">Delete</button></td>
</tr>
</table>

Since I have created two separate folders, server and client, for the Node.js and Angular parts, we will run both apps with npm start in two tabs of the terminal.

In the browser, go to http://localhost:4200 to see the app in action.

 

That’s all for now. Thank you for reading, and I hope this post will be very helpful for creating CRUD operations with Angular 7, Node.js & MySQL.




Thanks for reading! If you liked this post, share it with all of your programming buddies! Follow me on Facebook | Twitter

Learn More

☞ Angular 8 (formerly Angular 2) - The Complete Guide

☞ Learn and Understand AngularJS

☞ The Complete Angular Course: Beginner to Advanced

☞ Angular Crash Course for Busy Developers

☞ Angular Essentials (Angular 2+ with TypeScript)

☞ Angular (Full App) with Angular Material, Angularfire & NgRx

☞ Angular & NodeJS - The MEAN Stack Guide



Docker + Jupyter for Machine Learning

The best practice for setting up such a container is using a Dockerfile, which I have written following the established best practices. I hope this helps anyone engaging in data science applications with Docker.
The project is structured as follows:

├── Project_folder
│ ├── Dockerfile           # Primary image build instructions
│ ├── docker-compose.yml   # Describes the files to mount etc.
│ ├── requirements.txt     # Required python packages for the image

Step 1:

Download the following 3 files to your project directory on your computer. The GitHub repo can be found here.

FROM "ubuntu:bionic"

MAINTAINER [email protected]

RUN useradd -ms /bin/bash docker
RUN su docker

ENV LOG_DIR_DOCKER="/root/dockerLogs"
ENV LOG_INSTALL_DOCKER="/root/dockerLogs/install-logs.log"

RUN mkdir -p ${LOG_DIR_DOCKER} \
 && touch ${LOG_INSTALL_DOCKER}  \
 && echo "Logs directory and file created"  | sed -e "s/^/$(date +%Y%m%d-%H%M%S) :  /" 2>&1 | tee -a ${LOG_INSTALL_DOCKER}

RUN apt-get update | sed -e "s/^/$(date +%Y%m%d-%H%M%S) :  /" 2>&1 | tee -a ${LOG_INSTALL_DOCKER} \
  && apt-get install -y python3-pip python3-dev | sed -e "s/^/$(date +%Y%m%d-%H%M%S) :  /" 2>&1 | tee -a ${LOG_INSTALL_DOCKER} \
  && ln -s /usr/bin/python3 /usr/local/bin/python | sed -e "s/^/$(date +%Y%m%d-%H%M%S) :  /" 2>&1 | tee -a ${LOG_INSTALL_DOCKER} \
  && pip3 install --upgrade pip | sed -e "s/^/$(date +%Y%m%d-%H%M%S) :  /" 2>&1 | tee -a ${LOG_INSTALL_DOCKER}

COPY requirements.txt /root/datascience/requirements.txt
WORKDIR /root/datascience
RUN pip3 install -r requirements.txt
 
RUN jupyter notebook --generate-config --allow-root
RUN echo "c.NotebookApp.password = u'sha1:6a3f528eec40:6e896b6e4828f525a6e20e5411cd1c8075d68619'" >> /root/.jupyter/jupyter_notebook_config.py

CMD ["jupyter", "notebook", "--allow-root", "--notebook-dir=.", "--ip=0.0.0.0", "--port=8888", "--no-browser"]

Dockerfile

object_detection:
  image: datascience
  container_name: datascience
  restart: always
  environment:
     - TERM=xterm
  hostname: '127.0.0.1'
  ports:
     - "8888:8888"         #JupyterNB

  volumes:
     - /Users/chamalgomes/Documents/Python/GitHub/medium/JupyterWithDocker/test.txt:/root/datascience/test.txt

docker-compose.yml

Update it as {absolutePATH_to_yourFile}/{fileName}:/root/datascience/{fileName} to mount your preferred files.

tensorflow==1.14.0 
jupyter==1.0.0

requirements.txt

Update it to include the packages you require, e.g. matplotlib==1.0.0.

Step 2:

Install Docker on your computer: https://docs.docker.com/install/

Step 3:

docker build -t datascience .

Step 4:

Update the docker-compose.yml file to mount files you require from your host to the container following the example I have given already with the test.txt file.

Step 5:

docker-compose up 

Done! Now visit http://localhost:8888. The default password is set as “root”; feel free to change it.

Addendum

If you run into trouble, run the following command to download the log file to your current directory. Post it as a response to this article and I will get back to you with a solution as soon as possible.

docker cp <container-name>:/root/dockerLogs/install-logs.log .

If you want to start an interactive session inside the container, type in the following command:

docker exec -it <container name> /bin/bash

If you want to delete the image completely from your computer:

Step 1: Stop the container

docker container stop <container-id>

Step 2: Remove the container

docker container rm <container-id>

Step 3: Delete the image

docker image rm datascience

Thanks for reading !

Originally published by Chamal Gomes at towardsdatascience.com

How to build a Python Data Science Container using Docker

Artificial Intelligence(AI) and Machine Learning(ML) are literally on fire these days. Powering a wide spectrum of use-cases ranging from self-driving cars to drug discovery and to God knows what. AI and ML have a bright and thriving future ahead of them.

On the other hand, Docker revolutionized the computing world through the introduction of ephemeral lightweight containers. Containers basically package all the software required to run inside an image(a bunch of read-only layers) with a COW(Copy On Write) layer to persist the data.

Python Data Science Packages

Our Python data science container makes use of the following super cool python packages:

  1. NumPy: NumPy or Numeric Python supports large, multi-dimensional arrays and matrices. It provides fast precompiled functions for mathematical and numerical routines. In addition, NumPy optimizes Python programming with powerful data structures for efficient computation of multi-dimensional arrays and matrices.

  2. SciPy: SciPy provides useful functions for regression, minimization, Fourier-transformation, and many more. Based on NumPy, SciPy extends its capabilities. SciPy’s main data structure is again a multidimensional array, implemented by Numpy. The package contains tools that help with solving linear algebra, probability theory, integral calculus, and many more tasks.

  3. Pandas: Pandas offer versatile and powerful tools for manipulating data structures and performing extensive data analysis. It works well with incomplete, unstructured, and unordered real-world data — and comes with tools for shaping, aggregating, analyzing, and visualizing datasets.

  4. SciKit-Learn: Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems. It is one of the best-known machine-learning libraries for python. The Scikit-learn package focuses on bringing machine learning to non-specialists using a general-purpose high-level language. The primary emphasis is upon ease of use, performance, documentation, and API consistency. With minimal dependencies and easy distribution under the simplified BSD license, SciKit-Learn is widely used in academic and commercial settings. Scikit-learn exposes a concise and consistent interface to the common machine learning algorithms, making it simple to bring ML into production systems.

  5. Matplotlib: Matplotlib is a Python 2D plotting library, capable of producing publication quality figures in a wide variety of hardcopy formats and interactive environments across platforms. Matplotlib can be used in Python scripts, the Python and IPython shell, the Jupyter notebook, web application servers, and four graphical user interface toolkits.

  6. NLTK: NLTK is a leading platform for building Python programs to work with human language data. It provides easy-to-use interfaces to over 50 corpora and lexical resources such as WordNet, along with a suite of text processing libraries for classification, tokenization, stemming, tagging, parsing, and semantic reasoning.

Building the Data Science Container

Python is fast becoming the go-to language for data scientists and for this reason we are going to use Python as the language of choice for building our data science container.

The Base Alpine Linux Image

Alpine Linux is a tiny Linux distribution designed for power users who appreciate security, simplicity and resource efficiency.

As claimed by Alpine:

Small. Simple. Secure. Alpine Linux is a security-oriented, lightweight Linux distribution based on musl libc and busybox.

The Alpine image is surprisingly tiny with a size of no more than 8MB for containers. With minimal packages installed to reduce the attack surface on the underlying container. This makes Alpine an image of choice for our data science container.

Downloading and Running an Alpine Linux container is as simple as:

$ docker container run --rm alpine:latest cat /etc/os-release

In our Dockerfile, we can simply use the Alpine base image as:

FROM alpine:latest

Talk is cheap let’s build the Dockerfile

Now let’s work our way through the Dockerfile.

The FROM directive is used to set alpine:latest as the base image. Using the WORKDIR directive we set /var/www as the working directory for our container. The ENV PACKAGES line lists the system packages required for our container, like git, blas, and libgfortran. The Python packages for our data science container are defined in a separate ENV line (PYTHON_PACKAGES in the sketch below).

We have combined all the commands under a single Dockerfile RUN directive to reduce the number of layers which in turn helps in reducing the resultant image size.
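The Dockerfile itself was embedded as a gist that didn't survive here; a minimal sketch consistent with the description above (the exact package lists are assumptions) looks like:

FROM alpine:latest

WORKDIR /var/www

# System packages required to build the scientific stack (names are assumptions)
ENV PACKAGES="build-base python python-dev py-pip freetype-dev libgfortran openblas-dev git bash"

# Python packages for the data science container
ENV PYTHON_PACKAGES="numpy scipy pandas scikit-learn matplotlib nltk"

# A single RUN directive keeps the layer count, and thus the image size, down
RUN apk add --no-cache --update $PACKAGES \
    && pip install --no-cache-dir $PYTHON_PACKAGES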

Building and tagging the image

Now that we have our Dockerfile defined, navigate to the folder with the Dockerfile using the terminal and build the image using the following command:

$ docker build -t faizanbashir/python-datascience:2.7 -f Dockerfile .

The -t flag is used to name and tag the image in the 'name:tag' format. The -f flag is used to define the name of the Dockerfile (default is 'PATH/Dockerfile').

Running the container

We have successfully built and tagged the docker image, now we can run the container using the following command:

$ docker container run --rm -it faizanbashir/python-datascience:2.7 python

Voila, we are greeted by the sight of a python shell ready to perform all kinds of cool data science stuff.

Python 2.7.15 (default, Aug 16 2018, 14:17:09) [GCC 6.4.0] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>>

Our container comes with Python 2.7, but don’t be sad if you wanna work with Python 3.6. Lo, behold the Dockerfile for Python 3.6:
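That Dockerfile was also embedded as a gist that didn't survive here; a reasonable sketch is the same file as above with the Python 3 package equivalents swapped in (again, package names are assumptions):

ENV PACKAGES="build-base python3 python3-dev freetype-dev libgfortran openblas-dev git bash"

RUN apk add --no-cache --update $PACKAGES \
    && pip3 install --no-cache-dir $PYTHON_PACKAGES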

Build and tag the image like so:

$ docker build -t faizanbashir/python-datascience:3.6 -f Dockerfile .

Run the container like so:

$ docker container run --rm -it faizanbashir/python-datascience:3.6 python

With this, you have a ready to use container for doing all kinds of cool data science stuff.

Serving Puddin’

Figures, you have the time and resources to set up all this stuff. In case you don’t, you can pull the existing images that I have already built and pushed to Docker’s registry Docker Hub using:

# For Python 2.7 pull
$ docker pull faizanbashir/python-datascience:2.7

# For Python 3.6 pull
$ docker pull faizanbashir/python-datascience:3.6

After pulling the images, you can use the image as-is, extend it in your own Dockerfile, or use it as an image in your docker-compose or stack file.

Aftermath

The world of AI, ML is getting pretty exciting these days and will continue to become even more exciting. Big players are investing heavily in these domains. About time you start to harness the power of data, who knows it might lead to something wonderful.

You can check out the code here.

The perfect architecture flow for your next Node.js project

A good start is half the battle, said someone wiser than me. And I can’t think of any quote that would better describe the situation every developer gets into whenever starting a new project. Laying out a project’s structure in a practical way is one of the hardest points of the development process and, indeed, a delicate one.

We have already discussed Node.js technologies and how to choose a front-end framework; now we can dig deeper into how to structure our web apps once we have decided on the tech stack to use.

The importance of good architecture

Having a good starting point when it comes to our project architecture is vital for the life of the project itself and how you will be able to tackle changing needs in the future. A bad, messy project architecture often leads to:

  • Unreadable and messy code, making the development process longer and the product itself harder to test

  • Useless repetition, making code harder to maintain and manage

  • Difficulty implementing new features. Since the structure can become a total mess, adding a new feature without messing up existing code can become a real problem

With these points in mind, we can all agree that our project architecture is extremely important, and we can also declare a few points that can help us determine what this architecture must help us do:

  • Achieve clean and readable code

  • Achieve reusable pieces of code across our application

  • Help us to avoid repetitions

  • Make life easier when adding a new feature into our application

Establishing a flow

Now we can discuss what I usually refer to as the application structure flow. The application structure flow is a set of rules and common practices to adopt while developing our applications. These are the results of years of experience working with a technology and understanding what works properly and what doesn’t.

The goal of this article is to create a quick reference guide to establishing the perfect flow structure when developing Node.js applications. Let’s start to define our rules:

Rule #1: Correctly organize our files into folders

Everything has to have its place in our application, and a folder is the perfect place to group common elements. In particular, we want to define a very important separation, which brings us to rule number #2:

Rule #2: Keep a clear separation between the business logic and the API routes

See, frameworks like Express.js are amazing. They provide us with incredible features for managing requests, views, and routes. With such support, it might be tempting for us to put our business logic into our API routes. But this will quickly make them into giant, monolithic blocks that will reveal themselves to be unmanageable, hard to read, and prone to decomposition.

Please also don’t forget about how the testability of our application will decrease, with consequently longer development times. At this point, you might be wondering, “How do we solve this problem, then? Where can I put my business logic in a clear and intelligent way?” The answer is revealed in rule number #3.

Rule #3: Use a service layer

This is the place where all our business logic should live. It’s basically a collection of classes, each with its methods, that will be implementing our app’s core logic. The only part you should ignore in this layer is the one that accesses the database; that should be managed by the data access layer.

Now that we have defined these three initial rules, we can represent the result graphically as a diagram separating our business logic from our API routes.

And the subsequent folder structure, sending us back to rule #1, can then look like the sketch below.
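A plausible layout along these lines (the original shows it as an image, so the exact folder names here are assumptions):

src
├── api            # Express route controllers for the endpoints of the app
├── config         # configuration files (rule #4)
├── scripts        # long npm scripts (rule #5)
├── services       # all the business logic (rule #3)
├── models         # database models and the data access layer
└── app.js         # the app entry point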

By looking at this layout, we can also establish two other rules when thinking about our structure.

Rule #4: Use a config folder for configuration files

Rule #5: Have a scripts folder for long npm scripts

Rule #6: Use dependency injection

Node.js is literally packed with amazing features and tools to make our lives easier. However, as we know, working with dependencies can be quite troublesome most of the time due to problems that can arise with testability and code manageability.

There is a solution for that, and it’s called dependency injection.

Dependency injection is a software design pattern in which one or more dependencies (or services) are injected, or passed by reference, into a dependent object.

By using this inside our Node applications, we:

  • Have an easier unit testing process, passing dependencies directly to the modules we would like to use instead of hardcoding them

  • Avoid useless modules coupling, making maintenance much easier

  • Provide a faster git flow. After we defined our interfaces, they will stay like that, so we can avoid any merge conflicts.

Without dependency injection, the approach is simple but not very flexible. What happens if we want to alter a test to use an example database? We would have to alter our code to adapt it to this new need. Why not pass the database directly as a dependency instead? A sketch of both versions follows.
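The original illustrates this with screenshots; here is a minimal sketch of the same idea (module and function names are assumptions, not the author's code):

// userService.js
// Without dependency injection the database would be hardcoded,
// so every consumer and every test hits the same real module:
// const db = require('./productionDb');

// With dependency injection the database is passed in,
// so a test can hand in an in-memory fake instead:
function makeUserService(db) {
    return {
        getUser: (id) => db.findUserById(id),
    };
}

module.exports = { makeUserService };

A caller can now inject whatever it wants, for example const service = makeUserService(fakeDb); in a test.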

Rule #7: Use unit testing

Now that we know we have got dependency injection under our belt, we can also implement unit testing for our project. Testing is an incredibly important stage in developing our applications. The whole flow of the project — not just the final result — depends on it since buggy code would slow down the development process and cause other problems.

A common way to test our applications is to test them by units, the goal of which is to isolate a section of code and verify its correctness. When it comes to procedural programming, a unit may be an individual function or procedure. This process is usually performed by the developers who write the code.

Benefits of this approach include:

Improved code quality

Unit testing improves the quality of your code, helping you to identify problems you might have missed before the code goes on to other stages of development. It will expose the edge cases and makes you write better overall code

Bugs are found earlier

Issues here are found at a very early stage. Since the tests are going to be performed by the developer who wrote the code, bugs will be found earlier, and you will be able to avoid the extremely time-consuming process of debugging

Cost reduction

Fewer flaws in the application means less time spent debugging it, and less time spent debugging it means less money spent on the project. Time here is an especially critical factor since this precious unit can now be allocated to develop new features for our product
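Continuing the dependency injection sketch above (and assuming Jest as the test runner, which the article doesn't prescribe), a unit test becomes trivial:

// userService.test.js
const { makeUserService } = require('./userService');

test('getUser returns the user from the injected db', () => {
    // an in-memory fake stands in for the real database
    const fakeDb = { findUserById: (id) => ({ id, name: 'Ada' }) };
    const service = makeUserService(fakeDb);
    expect(service.getUser(1)).toEqual({ id: 1, name: 'Ada' });
});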

Rule #8: Use another layer for third-party services calls

Often, in our application, we may want to call a third-party service to retrieve certain data or perform some operations. And still, very often, if we don’t separate this call into another specific layer, we might run into an out-of-control piece of code that has become too big to manage.

A common way to solve this problem is to use the pub/sub pattern. This mechanism is a messaging pattern where we have entities sending messages called publishers, and entities receiving them called subscribers.

Publishers won’t program the messages to be sent directly to specific receivers. Instead, they will categorize published messages into specific classes without knowledge of which subscribers, if any, may be dealing with them.

In a similar way, the subscribers will express interest in dealing with one or more classes and only receive messages that are of interest to them — all without knowledge of which publishers are out there.

The publish-subscribe model enables event-driven architectures and asynchronous parallel processing while improving performance, reliability, and scalability.
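As a minimal in-process illustration (a production setup would normally use a broker such as Redis or RabbitMQ, and the event name here is an assumption), Node's built-in EventEmitter captures the shape of the pattern:

// bus.js: publishers and subscribers only know about the bus
const { EventEmitter } = require('events');
const bus = new EventEmitter();

// subscriber: reacts to the event without knowing who published it
bus.on('user.signed_up', (user) => {
    // the third-party mail service call would live here
    console.log(`sending welcome email to ${user.email}`);
});

// publisher: emits the event without knowing who handles it
function signUp(user) {
    // ...persist the user...
    bus.emit('user.signed_up', user);
}

signUp({ email: 'ada@example.com' });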

Rule #9: Use a linter

This simple tool will help you perform a faster and overall better development process, helping you keep an eye on small errors while keeping the entire application code uniform. A minimal configuration sketch follows.
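For ESLint, a minimal .eslintrc.json sketch (the specific rule choices are assumptions) would be:

{
    "extends": "eslint:recommended",
    "env": { "node": true, "es2021": true },
    "rules": { "no-unused-vars": "warn" }
}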

Rule #10: Use a style guide

Still thinking about how to properly format your code in a consistent way? Why not adapt one of the amazing style guides that Google or Airbnb have provided to us? Reading code will become incredibly easier, and you won’t get frustrated trying to understand how to correctly position that curly brace.

Rule #11: Always comment your code

Are you writing a difficult piece of code where it's hard to understand what you are doing and, most of all, why? Never forget to comment it. This will become extremely useful for your fellow developers and for your future self, all of whom will be wondering why exactly you did something six months after you first wrote it.

Rule #12: Keep an eye on your file sizes

Files that are too long are extremely hard to manage and maintain. Always keep an eye on your file length, and if they become too long, try to split them into modules packed in a folder as files that are related together.

Rule #13: Always use gzip compression

The server can use gzip compression to reduce file sizes before sending them to a web browser. This will reduce latency and lag.
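The original shows this as a screenshot; with Express, the widely used compression middleware does it in one line:

// npm install compression
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression()); // gzip-compress all responses

app.get('/', (req, res) => res.send('hello, compressed world'));
app.listen(3000);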

Rule #14: Use promises

Using callbacks is the simplest possible mechanism for handling your asynchronous code in JavaScript. However, raw callbacks often sacrifice the application control flow, error handling, and semantics that were so familiar to us when using synchronous code. A solution for that is using promises in Node.js.

Promises bring in more pros than cons by making our code easier to read and test while still providing functional programming semantics together with a better error-handling platform.
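A basic example, using the promise-based fs API that ships with Node:

const fs = require('fs').promises;

fs.readFile('./config.json', 'utf8')
    .then((data) => console.log('config:', data))
    .catch((err) => console.error('failed to read config:', err));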

Rule #15: Use promises’ error handling support

Finding yourself in a situation where you have an unexpected error or behavior in your app is not at all pleasant, I can guarantee. Errors are impossible to avoid when writing our code. That’s simply part of being human.

Dealing with them is our responsibility, and we should always not only use promises in our applications, but also make use of their error handling support provided by the catch keyword.
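For example, the same file read with async/await and try/catch, which is equivalent to the .catch shown earlier:

async function loadConfig() {
    try {
        const data = await fs.readFile('./config.json', 'utf8');
        return JSON.parse(data);
    } catch (err) {
        console.error('failed to load config:', err);
        throw err;
    }
}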

Conclusion

Creating a Node.js application can be challenging. I hope this set of rules helps you point yourself in the right direction when establishing what type of architecture you are going to use, and what practices will support that architecture.

Originally published by Piero Borrelli at blog.logrocket.com