Bulletproof node.js project architecture

This article is the result of extensive research, drawn from my years of experience dealing with poorly structured node.js projects, bad patterns, and countless hours of refactoring code and moving things around.

Introduction

Express.js is a great framework for building node.js REST APIs; however, it doesn’t give you any clue about how to organize your node.js project.

While it may sound silly, this is a real problem.

The correct organization of your node.js project structure will avoid duplication of code, will improve stability, and, if done correctly, will help you scale your services.


Table of contents
  • The folder structure 🏢
  • 3 Layer architecture 🥪
  • Service Layer 💼
  • Pub/Sub Layer 🎙️
  • Dependency Injection 💉
  • Unit Testing 🕵🏻
  • Cron Jobs and recurring tasks ⚡
  • Configurations and secrets 🤫
  • Loaders 🏗️
  • Example Repository



The folder structure 🏢

Here is the node.js project structure that I’m talking about.

I use this in every node.js REST API service that I build; let’s see in detail what each component does.

  src
  │   app.js          # App entry point
  └───api             # Express route controllers for all the endpoints of the app
  └───config          # Environment variables and configuration related stuff
  └───jobs            # Job definitions for agenda.js
  └───loaders         # Split the startup process into modules
  └───models          # Database models
  └───services        # All the business logic is here
  └───subscribers     # Event handlers for async tasks
  └───types           # Type declaration files (d.ts) for TypeScript

It is more than just a way of ordering JavaScript files…



3 Layer architecture 🥪

The idea is to use the principle of separation of concerns to move the business logic away from the node.js API Routes.

Because someday you will want to use your business logic in a CLI tool or, without going that far, in a recurring task.

And making an API call from the node.js server to itself is not a good idea…


☠️ Don’t put your business logic inside the controllers!! ☠️

You may be tempted to just use the express.js controllers to store the business logic of your application, but this quickly becomes spaghetti code; as soon as you need to write unit tests, you will end up dealing with complex mocks for the express.js req or res objects.

It’s complicated to distinguish when a response should be sent and when processing should continue in the ‘background’, say after the response is sent to the client.

Here is an example of what not to do.

  route.post('/', async (req, res, next) => {

    // This should be a middleware or should be handled by a library like Joi.
    const userDTO = req.body;
    const isUserValid = validators.user(userDTO)
    if(!isUserValid) {
      return res.status(400).end();
    }

    // Lots of business logic here...
    const userRecord = await UserModel.create(userDTO);
    delete userRecord.password;
    delete userRecord.salt;
    const companyRecord = await CompanyModel.create(userRecord);
    const companyDashboard = await CompanyDashboard.create(userRecord, companyRecord);

    // ...whatever...

    // And here is the 'optimization' that messes up everything.
    // The response is sent to the client...
    res.json({ user: userRecord, company: companyRecord });

    // But code execution continues :(
    const salaryRecord = await SalaryModel.create(userRecord, companyRecord);
    eventTracker.track('user_signup', userRecord, companyRecord, salaryRecord);
    intercom.createUser(userRecord);
    gaAnalytics.event('user_signup', userRecord);
    await EmailService.startSignupSequence(userRecord)
  });



Use a service layer for your business logic 💼

This layer is where your business logic should live.

It’s just a collection of classes with clear purposes, following the SOLID principles applied to node.js.

In this layer there should not exist any form of SQL query; use the data access layer for that.

  • Move your code away from the express.js router
  • Don’t pass the req or res object to the service layer
  • Don’t return anything related to the HTTP transport layer like a status code or headers from the service layer.

Example

  route.post('/',
    validators.userSignup, // this middleware takes care of validation
    async (req, res, next) => {
      // The actual responsibility of the route layer.
      const userDTO = req.body;

      // Call to service layer.
      // Abstraction on how to access the data layer and the business logic.
      const { user, company } = await UserService.Signup(userDTO);

      // Return a response to the client.
      return res.json({ user, company });
    });
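The validators.userSignup middleware above is not shown in this article; as a rough sketch (the schema fields here are hypothetical), it could be implemented with Joi along these lines:

  import Joi from 'joi';

  const signupSchema = Joi.object({
    fullname: Joi.string().required(),
    email: Joi.string().email().required(),
    password: Joi.string().min(8).required(),
  });

  export default {
    userSignup(req, res, next) {
      // Validate the body before it ever reaches the route handler or the service layer.
      const { error } = signupSchema.validate(req.body);
      if (error) {
        return res.status(400).json({ error: error.message });
      }
      return next();
    },
  };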

Here is how your service works behind the scenes.

  import UserModel from '../models/user';
  import CompanyModel from '../models/company';
  import SalaryModel from '../models/salary';

  export default class UserService {

    async Signup(user) {
      const userRecord = await UserModel.create(user);
      const companyRecord = await CompanyModel.create(userRecord); // needs userRecord to have the database id
      const salaryRecord = await SalaryModel.create(userRecord, companyRecord); // depends on user and company being created

      // ...whatever

      await EmailService.startSignupSequence(userRecord)

      // ...do more stuff

      return { user: userRecord, company: companyRecord };
    }

  }

Visit the example repository



Use a Pub/Sub layer too 🎙️

The pub/sub pattern goes beyond the classic 3 layer architecture proposed here but it’s extremely useful.

The simple node.js API endpoint that creates a user right now may eventually need to call third-party services, maybe an analytics service, or maybe start an email sequence.

Sooner rather than later, that simple “create” operation will be doing several things, and you will end up with 1000 lines of code, all in a single function.

That violates the principle of single responsibility.

So, it’s better to separate responsibilities from the start, so your code remains maintainable.

  import UserModel from '../models/user';
  import CompanyModel from '../models/company';
  import SalaryModel from '../models/salary';

  export default class UserService {

    async Signup(user) {
      const userRecord = await UserModel.create(user);
      const companyRecord = await CompanyModel.create(userRecord);
      const salaryRecord = await SalaryModel.create(userRecord, companyRecord);

      eventTracker.track(
        'user_signup',
        userRecord,
        companyRecord,
        salaryRecord
      );

      intercom.createUser(
        userRecord
      );

      gaAnalytics.event(
        'user_signup',
        userRecord
      );

      await EmailService.startSignupSequence(userRecord)

      // ...more stuff

      return { user: userRecord, company: companyRecord };
    }

  }

An imperative call to a dependent service is not the best way of doing it.

A better approach is to emit an event, i.e. ‘a user signed up with this email’.

And you are done, now it’s the responsibility of the listeners to do their job.

  export default class UserService {

    async Signup(user) {
      const userRecord = await this.userModel.create(user);
      const companyRecord = await this.companyModel.create(user);

      this.eventEmitter.emit('user_signup', { user: userRecord, company: companyRecord })

      return userRecord
    }

  }

Now you can split the event handlers/listeners into multiple files.

  eventEmitter.on('user_signup', ({ user, company }) => {

    eventTracker.track(
      'user_signup',
      user,
      company,
    );

    intercom.createUser(
      user
    );

    gaAnalytics.event(
      'user_signup',
      user
    );

  })

  eventEmitter.on('user_signup', async ({ user, company }) => {
    const salaryRecord = await SalaryModel.create(user, company);
  })

  eventEmitter.on('user_signup', async ({ user, company }) => {
    await EmailService.startSignupSequence(user)
  })

You can wrap the await statements in a try-catch block, or you can just let them fail and handle the rejections with process.on('unhandledRejection', cb).
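As a minimal sketch of that second option (the handler body is just illustrative logging):

  // Catch promise rejections that were never handled anywhere else.
  process.on('unhandledRejection', (reason, promise) => {
    console.error('Unhandled rejection:', reason);
  });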



Dependency Injection 💉

D.I. or inversion of control (IoC) is a common pattern that helps with the organization of your code, by ‘injecting’ or passing the dependencies of your class or function through the constructor.

This way you gain the flexibility to inject a ‘compatible dependency’ when, for example, you write the unit tests for the service, or when the service is used in another context.

Code with no D.I

  import UserModel from '../models/user';
  import CompanyModel from '../models/company';
  import SalaryModel from '../models/salary';

  class UserService {
    constructor(){}

    Signup(){
      // Calling UserModel, CompanyModel, etc.
      // ...
    }
  }

Code with manual dependency injection

  export default class UserService {
    constructor(userModel, companyModel, salaryModel){
      this.userModel = userModel;
      this.companyModel = companyModel;
      this.salaryModel = salaryModel;
    }

    getMyUser(userId){
      // models available through 'this'
      const user = this.userModel.findById(userId);
      return user;
    }
  }

Now you can inject custom dependencies.

  import UserService from '../services/user';
  import UserModel from '../models/user';
  import CompanyModel from '../models/company';

  const salaryModelMock = {
    calculateNetSalary(){
      return 42;
    }
  }

  const userServiceInstance = new UserService(UserModel, CompanyModel, salaryModelMock);
  const user = await userServiceInstance.getMyUser('12346');

The number of dependencies a service can have is unbounded, and refactoring every instantiation of it when you add a new one is a tedious and error-prone task.

That’s why dependency injection frameworks were created.

The idea is you declare your dependencies in the class, and when you need an instance of that class, you just call the ‘Service Locator’.

Let’s see an example using typedi, an npm library that brings D.I. to node.js.

You can read more on how to use typedi in the official documentation.

WARNING: TypeScript example

  import { Service } from 'typedi';

  @Service()
  export default class UserService {
    constructor(
      private userModel,
      private companyModel,
      private salaryModel
    ){}

    getMyUser(userId){
      const user = this.userModel.findById(userId);
      return user;
    }
  }

services/user.ts

Now typedi will take care of resolving any dependency the UserService requires.

  import { Container } from 'typedi';
  import UserService from '../services/user';

  const userServiceInstance = Container.get(UserService);
  const user = await userServiceInstance.getMyUser('12346');

Abusing service locator calls is an anti-pattern


Using Dependency Injection with Express.js in Node.js

Using D.I. in express.js is the final piece of the puzzle for this node.js project architecture.

Routing layer

  route.post('/',
    async (req, res, next) => {
      const userDTO = req.body;

      const userServiceInstance = Container.get(UserService) // Service locator

      const { user, company } = await userServiceInstance.Signup(userDTO);

      return res.json({ user, company });
    });

Awesome, the project is looking great!

It’s so organized that it makes me want to code something right now.

Visit the example repository



A unit test example 🕵🏻

By using dependency injection and these organization patterns, unit testing becomes really simple.

You don’t have to mock req/res objects or require(…) calls.

Example: Unit test for signup user method

tests/unit/services/user.js

  import UserService from '../../../src/services/user';

  describe('User service unit tests', () => {
    describe('Signup', () => {
      test('Should create user record and emit user_signup event', async () => {
        const eventEmitterService = {
          emit: jest.fn(),
        };

        const userModel = {
          create: (user) => {
            return {
              ...user,
              _id: 'mock-user-id'
            }
          },
        };

        const companyModel = {
          create: (user) => {
            return {
              owner: user._id,
              companyTaxId: '12345',
            }
          },
        };

        const userInput = {
          fullname: 'User Unit Test',
          email: '[email protected]',
        };

        const userService = new UserService(userModel, companyModel, eventEmitterService);
        const userRecord = await userService.Signup(userInput);

        expect(userRecord).toBeDefined();
        expect(userRecord._id).toBeDefined();
        expect(eventEmitterService.emit).toBeCalled();
      });
    })
  })



Cron Jobs and recurring tasks ⚡

So, now that the business logic is encapsulated in the service layer, it’s easier to use it from a Cron job.

You should never rely on node.js setTimeout or other primitive ways of delaying code execution, but rather on a framework that persists your jobs, and their execution, in a database.

This way you will have control over failed jobs and feedback about those that succeed.

I already wrote about good practices for this, so check my guide on using agenda.js, the best task manager for node.js.
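As a hedged sketch of how a job definition can reuse the service layer (the job name and the SendDigestEmail method are hypothetical, and the agenda instance is assumed to be created by a loader):

  import { Container } from 'typedi';
  import UserService from '../services/user';

  export default (agenda) => {
    agenda.define('send-digest-email', async (job) => {
      const { userId } = job.attrs.data;
      // The job only orchestrates; the business logic stays in the service layer.
      const userServiceInstance = Container.get(UserService);
      await userServiceInstance.SendDigestEmail(userId);
    });
  };

Scheduling it elsewhere with something like agenda.every('24 hours', 'send-digest-email', { userId }) lets agenda persist the recurring task in MongoDB.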



Configurations and secrets 🤫

Following the battle-tested concepts of the Twelve-Factor App for node.js, the best approach to store API keys and database connection strings is to use dotenv.

Put a .env file, which must never be committed (but it has to exist with default values in your repository); then the npm package dotenv loads the .env file and inserts the vars into the process.env object of node.js.

That could be enough, but I like to add an extra step.

Have a config/index.ts file where the dotenv npm package loads the .env file, and then use an object to store the variables, so we have structure and code autocompletion.

config/index.js

  const dotenv = require('dotenv');
  // config() will read your .env file, parse the contents and assign it to process.env.
  dotenv.config();

  export default {
    port: process.env.PORT,
    databaseURL: process.env.DATABASE_URI,
    paypal: {
      publicKey: process.env.PAYPAL_PUBLIC_KEY,
      secretKey: process.env.PAYPAL_SECRET_KEY,
    },
    mailchimp: {
      apiKey: process.env.MAILCHIMP_API_KEY,
      sender: process.env.MAILCHIMP_SENDER,
    }
  }

This way you avoid flooding your code with process.env.MY_RANDOM_VAR instructions, and with autocompletion you don’t have to remember the exact name of the env var.
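For example, consuming the config object from another module could look like this (just an illustration; the mongoose call stands in for any consumer):

  import mongoose from 'mongoose';
  import config from '../config';

  // No raw process.env access here, and config.* gets autocompletion.
  await mongoose.connect(config.databaseURL, { useNewUrlParser: true });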

Visit the example repository



Loaders 🏗️

I took this pattern from the W3Tech microframework, but without depending on their package.

The idea is that you split the startup process of your node.js service into testable modules.

Let’s see a classic express.js app initialization.

  const mongoose = require('mongoose');
  const express = require('express');
  const bodyParser = require('body-parser');
  const session = require('express-session');
  const cors = require('cors');
  const errorhandler = require('errorhandler');

  const app = express();

  app.get('/status', (req, res) => { res.status(200).end(); });
  app.head('/status', (req, res) => { res.status(200).end(); });
  app.use(cors());
  app.use(require('morgan')('dev'));
  app.use(bodyParser.urlencoded({ extended: false }));
  app.use(bodyParser.json(setupForStripeWebhooks));
  app.use(require('method-override')());
  app.use(express.static(__dirname + '/public'));
  app.use(session({ secret: process.env.SECRET, cookie: { maxAge: 60000 }, resave: false, saveUninitialized: false }));
  mongoose.connect(process.env.DATABASE_URL, { useNewUrlParser: true });

  require('./config/passport');
  require('./models/user');
  require('./models/company');

  app.use(require('./routes'));

  app.use((req, res, next) => {
    var err = new Error('Not Found');
    err.status = 404;
    next(err);
  });
  app.use((err, req, res) => {
    res.status(err.status || 500);
    res.json({ errors: {
      message: err.message,
      error: {}
    }});
  });

  // ... more stuff

  // ... maybe start up Redis

  // ... maybe add more middlewares

  async function startServer() {
    app.listen(process.env.PORT, err => {
      if (err) {
        console.log(err);
        return;
      }
      console.log(`Your server is ready!`);
    });
  }

  // Run the async function to start our server
  startServer();

As you see, this part of your application can be a real mess.

Here is an effective way to deal with it.

  const loaders = require('./loaders');
  const express = require('express');

  async function startServer() {

    const app = express();

    await loaders.init({ expressApp: app });

    app.listen(process.env.PORT, err => {
      if (err) {
        console.log(err);
        return;
      }
      console.log(`Your server is ready!`);
    });

  }

  startServer();

Now the loaders are just tiny files with a concise purpose.

loaders/index.js

  import expressLoader from './express';
  import mongooseLoader from './mongoose';

  export default async ({ expressApp }) => {
    const mongoConnection = await mongooseLoader();
    console.log('MongoDB Initialized');

    await expressLoader({ app: expressApp });
    console.log('Express Initialized');

    // ... more loaders can be here

    // ... Initialize agenda
    // ... or Redis, or whatever you want
  }

The express loader

loaders/express.js

  import * as express from 'express';
  import * as bodyParser from 'body-parser';
  import * as cors from 'cors';

  export default async ({ app }: { app: express.Application }) => {

    app.get('/status', (req, res) => { res.status(200).end(); });
    app.head('/status', (req, res) => { res.status(200).end(); });
    app.enable('trust proxy');

    app.use(cors());
    app.use(require('morgan')('dev'));
    app.use(bodyParser.urlencoded({ extended: false }));

    // ...More middlewares

    // Return the express app
    return app;
  };

The mongo loader

loaders/mongoose.js

  import * as mongoose from 'mongoose';

  export default async (): Promise<any> => {
    const connection = await mongoose.connect(process.env.DATABASE_URL, { useNewUrlParser: true });
    return connection.connection.db;
  }

See a complete example of loaders here
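One more loader that fits this pattern, sketched here as an assumption rather than taken from the example repository, is a dependency injector loader that registers the models with typedi's Container so services can receive them by name (e.g. @Inject('userModel')):

  import { Container } from 'typedi';
  import UserModel from '../models/user';
  import CompanyModel from '../models/company';

  export default () => {
    // Register shared instances so services can declare them as injected dependencies.
    Container.set('userModel', UserModel);
    Container.set('companyModel', CompanyModel);
  };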



Conclusion

We took a deep dive into a production-tested node.js project structure; here are some summarized tips:

  • Use a 3 layer architecture.
  • Don’t put your business logic into the express.js controllers.
  • Use PubSub pattern and emit events for background tasks.
  • Have dependency injection for your peace of mind.
  • Never leak your passwords, secrets and API keys, use a configuration manager.
  • Split your node.js server configurations into small modules that can be loaded independently.

Thanks for reading ❤

If you liked this post, share it with all of your programming buddies!

Follow me on Facebook | Twitter


Originally posted on softwareontheroad.com

Top 7 Most Popular Node.js Frameworks You Should Know

Node.js is an open-source, cross-platform runtime environment that allows developers to run JavaScript outside of a browser.

One of the main advantages of Node is that it enables developers to use JavaScript on both the front-end and the back-end of an application. This not only makes the source code of any app cleaner and more consistent, but it significantly speeds up app development too, as developers only need to use one language.

Node is fast, scalable, and easy to get started with. Its default package manager is npm, which means it also sports the largest ecosystem of open-source libraries. Node is used by companies such as NASA, Uber, Netflix, and Walmart.

But Node doesn't come alone. It comes with a plethora of frameworks. A Node framework can be pictured as the external scaffolding that you can build your app in. These frameworks are built on top of Node and extend the technology's functionality, mostly by making apps easier to prototype and develop, while also making them faster and more scalable.

Below are 7 of the most popular Node frameworks at this point in time (ranked from high to low by GitHub stars).

Express

With over 43,000 GitHub stars, Express is the most popular Node framework. It brands itself as a fast, unopinionated, and minimalist framework. Express acts as middleware: it helps set up and configure routes to send and receive requests between the front-end and the database of an app.

Express provides lightweight, powerful tools for HTTP servers. It's a great framework for single-page apps, websites, hybrids, or public HTTP APIs. It supports over fourteen different template engines, so developers aren't forced into any specific one.

Meteor

Meteor is a full-stack JavaScript platform. It allows developers to build real-time web apps, i.e. apps where code changes are pushed to all browsers and devices in real-time. Additionally, servers send data over the wire, instead of HTML. The client renders the data.

The project has over 41,000 GitHub stars and is built to power large projects. Meteor is used by companies such as Mazda, Honeywell, Qualcomm, and IKEA. It has excellent documentation and a strong community behind it.

Koa

Koa is built by the same team that built Express. It uses ES6 methods that allow developers to work without callbacks. Developers also have more control over error-handling. Koa has no middleware within its core, which means that developers have more control over configuration, but which means that traditional Node middleware (e.g. req, res, next) won't work with Koa.

Koa already has over 26,000 GitHub stars. The Express developers built Koa because they wanted a lighter framework that was more expressive and more robust than Express. You can find out more about the differences between Koa and Express here.
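As a quick illustration of that callback-free style, a minimal Koa app looks roughly like this:

const Koa = require('koa');
const app = new Koa();

// Koa middleware are async functions; there are no (req, res, next) callbacks.
app.use(async (ctx) => {
  ctx.body = 'Hello from Koa';
});

app.listen(3000);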

Sails

Sails is a real-time, MVC framework for Node that's built on Express. It supports auto-generated REST APIs and comes with an easy WebSocket integration.

The project has over 20,000 stars on GitHub and is compatible with almost all databases (MySQL, MongoDB, PostgreSQL, Redis). It's also compatible with most front-end technologies (Angular, iOS, Android, React, and even Windows Phone).

Nest

Nest has over 15,000 GitHub stars. It uses progressive JavaScript and is built with TypeScript, which means it comes with strong typing. It combines elements of object-oriented programming, functional programming, and functional reactive programming.

Nest is packaged in such a way it serves as a complete development kit for writing enterprise-level apps. The framework uses Express, but is compatible with a wide range of other libraries.

LoopBack

LoopBack is a framework that allows developers to quickly create REST APIs. It has an easy-to-use CLI wizard and allows developers to create models either on their schema or dynamically. It also has a built-in API explorer.

LoopBack has over 12,000 GitHub stars and is used by companies such as GoDaddy, Symantec, and the Bank of America. It's compatible with many REST services and a wide variety of databases (MongoDB, Oracle, MySQL, PostgreSQL).

Hapi

Similar to Express, hapi serves data by intermediating between the server side and the client side. As such, it can serve as a substitute for Express. Hapi allows developers to focus on writing reusable app logic in a modular and prescriptive fashion.

The project has over 11,000 GitHub stars. It has built-in support for input validation, caching, authentication, and more. Hapi was originally developed to handle all of Walmart's mobile traffic during Black Friday.

Node.js for Beginners - Learn Node.js from Scratch (Step by Step)


Welcome to my course "Node.js for Beginners - Learn Node.js from Scratch". This course will guide you step by step so that you will learn the basics and theory of every part. It contains hands-on examples so that you can understand coding in Node.js better. If you have no previous knowledge or experience in Node.js, you will like that the course begins with Node.js basics; otherwise, if you already have some experience programming in Node.js, this course can still teach you something new. The course contains hands-on practical examples without neglecting theory and basics. Learn to use Node.js like a professional. This comprehensive course will allow you to work in the real world as an expert!
What you’ll learn:

  • Basic Of Node
  • Modules
  • NPM In Node
  • Event
  • Email
  • Uploading File
  • Advance Of Node

How to Use Express.js, Node.js and MongoDB.js

In this post, I will show you how to use Express.js, Node.js and MongoDB.js. We will be creating a very simple Node application, that will allow users to input data that they want to store in a MongoDB database. It will also show all items that have been entered into the database.

Creating a Node Application

To get started I would recommend creating a new directory that will contain our application. For this demo I am creating a directory called node-demo. After creating the directory you will need to change into that directory.

mkdir node-demo
cd node-demo

Once we are in the directory we will need to create an application and we can do this by running the command
npm init

This will ask you a series of questions (package name, version, entry point, and so on); the default answers are fine for this demo.

The first step is to create a file that will contain our code for our Node.js server.

touch app.js

In our app.js we are going to add the following code to build a very simple Node.js Application.

var express = require("express");
var app = express();
var port = 3000;

app.get("/", (req, res) => {
  res.send("Hello World");
});

app.listen(port, () => {
  console.log("Server listening on port " + port);
});

What the code does is require the express.js module. It then creates the app by calling express(). We define our port to be 3000.

The app.get line will listen to requests from the browser and will return the text “Hello World” back to the browser.

The last line actually starts the server and tells it to listen on port 3000.

Installing Express

Our app.js required the Express.js module. We need to install express in order for this to work properly. Go to your terminal and enter this command.

npm install express --save

This command will install the express module and add it as a dependency in our package.json.

To test our application you can go to the terminal and enter the command

node app.js

Open up a browser and navigate to the url http://localhost:3000

You will see the text “Hello World” in your browser.

Creating Website to Save Data to MongoDB Database

Instead of showing the text “Hello World” when people view your application, what we want to do is to show a place for user to save data to the database.

We are going to allow users to enter a first name and a last name that we will be saving in the database.

To do this we will need to create a basic HTML file. In your terminal enter the following command to create an index.html file.

touch index.html

In our index.html file we will be creating an input field where users can input data that they want to have stored in the database. We will also need a button for users to click on that will add the data to the database.

Here is what our index.html file looks like.

<!DOCTYPE html>
<html>
  <head>
    <title>Intro to Node and MongoDB</title>
  </head>

  <body>
    <h1>Intro to Node and MongoDB</h1>
    <form method="post" action="/addname">
      <label>Enter Your Name</label><br>
      <input type="text" name="firstName" placeholder="Enter first name..." required>
      <input type="text" name="lastName" placeholder="Enter last name..." required>
      <input type="submit" value="Add Name">
    </form>
  </body>
</html>

If you are familiar with HTML, you will not find anything unusual in our code for our index.html file. We are creating a form where users can input their first name and last name and then click an “Add Name” button.

The form will do a post call to the /addname endpoint. We will be talking about endpoints and post later in this tutorial.

Displaying our Website to Users

We were previously displaying the text “Hello World” to users when they visited our website. Now we want to display the html file that we created. To do this we will need to change the route handler in our app.js file.

We will be using the sendFile command to show the index.html file. We will need to tell the server exactly where to find the index.html file. We can do that by using the node global __dirname, which provides the directory of the currently running file. We will then append the path to our index.html file.

The route handler will need to be changed to:

app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html");
});

Once you have saved your app.js file, we can test it by going to terminal and running node app.js

Open your browser and navigate to “http://localhost:3000”. You will see the form we just created.

Connecting to the Database

Now we need to add our database to the application. We will be connecting to a MongoDB database. I am assuming that you already have MongoDB installed and running on your computer.

To connect to the MongoDB database we are going to use a module called Mongoose. We will need to install the mongoose module just like we did with express. Go to your terminal and enter the following command.
npm install mongoose --save

This will install the mongoose module and add it as a dependency in our package.json.

Connecting to the Database

Now that we have the mongoose module installed, we need to connect to the database in our app.js file. MongoDB, by default, runs on port 27017. You connect to the database by telling it the location of the database and the name of the database.

In our app.js file, after the line for the port and before the route handler, enter the following lines to get access to mongoose and to connect to the database. For the database, I am going to use “node-demo”.

var mongoose = require("mongoose");
mongoose.Promise = global.Promise;
mongoose.connect("mongodb://localhost:27017/node-demo");

Creating a Database Schema

Once the user enters data in the input field and clicks the add button, we want the contents of the input field to be stored in the database. In order to know the format of the data in the database, we need to have a Schema.

For this tutorial, we will need a very simple Schema that has only two fields. I am going to call the fields firstName and lastName. The data stored in both fields will be a String.

After connecting to the database in our app.js we need to define our Schema. Here are the lines you need to add to the app.js.
var nameSchema = new mongoose.Schema({
  firstName: String,
  lastName: String
});

Once we have built our Schema, we need to create a model from it. I am going to call my model “User”. Here is the line you will add next to create our model.
var User = mongoose.model("User", nameSchema);

Creating RESTful API

Now that we have a connection to our database, we need to create the mechanism by which data will be added to the database. This is done through our REST API. We will need to create an endpoint that will be used to send data to our server. Once the server receives this data then it will store the data in the database.

An endpoint is a route that our server will be listening to to get data from the browser. We already have one route that we have created already in the application and that is the route that is listening at the endpoint “/” which is the homepage of our application.

HTTP Verbs in a REST API

The communication between the client (the browser) and the server is done through an HTTP verb. The most common HTTP verbs are
GET, POST, PUT, and DELETE.

The following table explains what each HTTP verb does.

HTTP Verb    Operation
GET          Read
POST         Create
PUT          Update
DELETE       Delete

As you can see from these verbs, they form the basis of CRUD operations that I talked about previously.

Building a CRUD endpoint

If you remember, the form in our index.html file used a post method to call this endpoint. We will now create this endpoint.

In our previous endpoint we used a “GET” http verb to display the index.html file. We are going to do something very similar but instead of using “GET”, we are going to use “POST”. To get started this is what the framework of our endpoint will look like.

app.post("/addname", (req, res) => {
 
});
Express Middleware

To fill out the contents of our endpoint, we want to store the firstName and lastName entered by the user into the database. The values for firstName and lastName are in the body of the request that we send to the server. We want to capture that data, convert it to JSON and store it into the database.

Express.js version 4 removed all bundled middleware from its core. To parse the data in the body we will need to add middleware to our application to provide this functionality. We will be using the body-parser module. We need to install it, so in your terminal window enter the following command.

npm install body-parser --save

Once it is installed, we will need to require this module and configure it. The configuration will allow us to pass the data for firstName and lastName in the body to the server. It can also convert that data into JSON format. This will be handy because we can take this formatted data and save it directly into our database.

To add the body-parser middleware to our application and configure it, we can add the following lines directly after the line that sets our port.

var bodyParser = require('body-parser');
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
Saving data to database

Mongoose provides a save function that will take a JSON object and store it in the database. Our body-parser middleware will convert the user’s input into the JSON format for us.

To save the data into the database, we need to create a new instance of the model that we created earlier. We will pass the user’s input into this instance. Once we have it, we just need to call the save method.

Mongoose will return a promise on a save to the database. The promise settles when the save to the database completes: it will either finish successfully or it will fail. A promise provides two methods that will handle both of these scenarios.

If the save to the database was successful, it will return to the .then segment of the promise. In this case we want to send text back to the user letting them know the data was saved to the database.

If it fails, it will return to the .catch segment of the promise. In this case, we want to send text back to the user telling them the data was not saved to the database. It is best practice to also change the statusCode that is returned from the default 200 to a 400. A 400 statusCode signifies that the operation failed.

Now putting all of this together here is what our final endpoint will look like.

app.post("/addname", (req, res) => {
  var myData = new User(req.body);
  myData.save()
    .then(item => {
      res.send("item saved to database");
    })
    .catch(err => {
      res.status(400).send("unable to save to database");
    });
});
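For reference, here is roughly what the assembled app.js looks like at this point; it is just the snippets above combined, with the form served via app.get so it doesn't swallow the POST to /addname:

var express = require("express");
var bodyParser = require("body-parser");
var mongoose = require("mongoose");

var app = express();
var port = 3000;

// body-parser middleware so req.body is populated from the form post.
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// Connect to the local MongoDB database.
mongoose.Promise = global.Promise;
mongoose.connect("mongodb://localhost:27017/node-demo");

// Schema and model for the names we store.
var nameSchema = new mongoose.Schema({
  firstName: String,
  lastName: String
});
var User = mongoose.model("User", nameSchema);

// Serve the form.
app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html");
});

// Save a submitted name.
app.post("/addname", (req, res) => {
  var myData = new User(req.body);
  myData.save()
    .then(item => {
      res.send("item saved to database");
    })
    .catch(err => {
      res.status(400).send("unable to save to database");
    });
});

app.listen(port, () => {
  console.log("Server listening on port " + port);
});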
Testing our code

Save your code. Go to your terminal and enter the command node app.js to start our server. Open up your browser and navigate to the URL “http://localhost:3000”. You will see our index.html file displayed to you.

Make sure you have mongo running.

Enter your first name and last name in the input fields and then click the “Add Name” button. You should get back text saying “item saved to database”.

Access to Code

The final version of the code is available in my GitHub repo. To access the code click here. Thank you for reading!