How to create API authentication using JWT's and Passport ✌️

Every web application and API uses some form of authentication to protect resources and restrict them to verified users. We'll go through how to create authentication for an API using JWTs (JSON Web Tokens) and the Passport package. Let's start with a brief introduction to how they work.

Table of Contents

  • JSON Web Token
  • Passport
  • Setting up the Project
  • Setting up the database
  • Registration and log-in middleware
  • Creating the routes
  • Signing the JWT
  • Verifying the user token
  • Creating secure routes
  • Testing with Postman
  • Conclusion
JSON Web Tokens (JWTs)

According to JWT.IO

JSON Web Token (JWT) is an open standard (RFC 7519) that defines a compact and self-contained way for securely transmitting information between parties as a JSON object.

JWTs are secure because they are digitally signed: if the information contained within is tampered with in any way, the token becomes invalid. We'll look at how this is made possible later on. JWTs consist of three parts separated by dots.

Header: contains the algorithm used to sign the token and the type of token, e.g.

     {
        "typ" : "JWT",
        "alg" : "HS256"
      }

Payload: contains the claims. Claims are statements about the user together with additional metadata, e.g.

     {
        "id" : 1,
        "name" : "devgson",
        "iat" : 1421211952
      }

Note: iat is a registered claim indicating the time at which the token was issued. More information about registered claims can be found here.

Signature: the header and payload are base64url-encoded, combined with a secret key, and signed using the algorithm specified in the header, e.g. HMACSHA256. The signature is what verifies that the message wasn't tampered with along the way.

     HMACSHA256(
        base64UrlEncode(header) + "." +
        base64UrlEncode(payload),
        secret
      )
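In Node.js terms, that pseudo-code corresponds roughly to the sketch below, using the built-in crypto module. The encoded header and payload are placeholders here, and real JWT libraries also encode the final signature with padding-free base64url:

const crypto = require('crypto');

//Hypothetical header and payload, assumed to already be base64url-encoded
const encodedHeader = 'xxxxx';
const encodedPayload = 'yyyyy';
const secret = 'top_secret';

//HMAC-SHA256 over "header.payload", as in the pseudo-code above
const signature = crypto
  .createHmac('sha256', secret)
  .update(`${encodedHeader}.${encodedPayload}`)
  .digest('base64');

console.log(signature);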

Note: JWTs should not be used to transfer or store sensitive information, because anyone who manages to intercept the token can easily decode the header and payload; they're just base64-encoded after all. The signature only verifies that the token hasn't been tampered with, it doesn't stop the token from being read or altered. However, extra security measures can be put in place to achieve a higher level of security. For a broad and in-depth explanation of JWTs, read this book.
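To see for yourself that the payload is only encoded, not encrypted, a few lines of Node.js are enough. This snippet is purely illustrative and not part of the project we'll build; the middle segment of any real token (token.split('.')[1]) can be decoded the same way, with no secret involved:

//Encode a sample payload the way a JWT payload segment is encoded,
//then decode it again, exactly as anyone intercepting a token could do.
const payload = { id: 1, name: 'devgson', iat: 1421211952 };

const encoded = Buffer.from(JSON.stringify(payload)).toString('base64');
const decoded = JSON.parse(Buffer.from(encoded, 'base64').toString('utf8'));

console.log(encoded); //the kind of string you see in the middle segment of a token
console.log(decoded); //{ id: 1, name: 'devgson', iat: 1421211952 }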

Passport

Passport is authentication middleware used to authenticate requests. It makes use of strategies, e.g. a local strategy or, with the rise of social networking, single sign-on using an OAuth provider such as Facebook or Twitter. Applications choose which strategies they want to employ, and there is a separate package for each strategy. We'll be using the local (email/password) strategy in this tutorial. More info about Passport and its available strategies can be found here. Now let's get started.

Setting up the Project

Let's create a folder structure for the files we'll be using:

    -model
    ---model.js
    -routes
    ---routes.js
    ---secure-routes.js
    -auth
    ---auth.js
    -app.js
    -package.json

Install the necessary packages

$ npm install --save bcrypt body-parser express jsonwebtoken mongoose passport passport-local passport-jwt

  • bcrypt: for hashing user passwords
  • jsonwebtoken: for signing tokens
  • passport-local: for implementing the local strategy
  • passport-jwt: middleware for extracting and verifying JWTs

Here's how our application is going to work:

  • The user signs up and then logs in, after the user logs in, a JSON web token would be given to the user.
  • The user is expected to store this token locally.
  • This token is to be sent by the user when trying to access certain secure routes; once the token has been verified, the user is allowed to access the route. Now let's get to coding.
Setting up the database

First of all, let's create the user schema. A user only needs to provide an email and a password; that's enough information.

model/model.js

const mongoose = require('mongoose')
const bcrypt = require('bcrypt');
const Schema = mongoose.Schema;

const UserSchema = new Schema({
  email : {
    type : String,
    required : true,
    unique : true
  },
  password : {
    type : String,
    required : true 
  }
});
  ...

Now we don't want to store passwords in plain text, because if an attacker manages to get access to the database the passwords can be read directly. We'll use the 'bcrypt' package to hash user passwords and store them safely.

model/model.js
 ....
//This is called a pre-hook, before the user information is saved in the database
//this function will be called, we'll get the plain text password, hash it and store it.
UserSchema.pre('save', async function(next){
  //'this' refers to the current document about to be saved
  const user = this;
  //Hash the password with a salt round of 10, the higher the rounds the more secure, but the slower
  //your application becomes.
  const hash = await bcrypt.hash(this.password, 10);
  //Replace the plain text password with the hash and then store it
  this.password = hash;
  //Indicates we're done and moves on to the next middleware
  next();
});

//We'll use this later on to make sure that the user trying to log in has the correct credentials
UserSchema.methods.isValidPassword = async function(password){
  const user = this;
  //Hashes the password sent by the user for login and checks if the hashed password stored in the 
  //database matches the one sent. Returns true if it does else false.
  const compare = await bcrypt.compare(password, user.password);
  return compare;
}

const UserModel = mongoose.model('user',UserSchema);

module.exports = UserModel;
Registration and log-in middleware

We'll use the passport local strategy to create middleware that will handle user registration and login. This will then be plugged into certain routes and be used for authentication.

auth/auth.js

const passport = require('passport');
const localStrategy = require('passport-local').Strategy;
const UserModel = require('../model/model');

//Create a passport middleware to handle user registration
passport.use('signup', new localStrategy({
  usernameField : 'email',
  passwordField : 'password'
}, async (email, password, done) => {
    try {
      //Save the information provided by the user to the database
      const user = await UserModel.create({ email, password });
      //Send the user information to the next middleware
      return done(null, user);
    } catch (error) {
      done(error);
    }
}));

//Create a passport middleware to handle User login
passport.use('login', new localStrategy({
  usernameField : 'email',
  passwordField : 'password'
}, async (email, password, done) => {
  try {
    //Find the user associated with the email provided by the user
    const user = await UserModel.findOne({ email });
    if( !user ){
      //If the user isn't found in the database, return a message
      return done(null, false, { message : 'User not found'});
    }
    //Validate password and make sure it matches with the corresponding hash stored in the database
    //If the passwords match, it returns a value of true.
    const validate = await user.isValidPassword(password);
    if( !validate ){
      return done(null, false, { message : 'Wrong Password'});
    }
    //Send the user information to the next middleware
    return done(null, user, { message : 'Logged in Successfully'});
  } catch (error) {
    return done(error);
  }
}));
    ....
Creating the routes

Now that we have middleware for handling registration and login, let's create routes that'll use this middleware.

routes/routes.js
const express = require('express');
const passport = require('passport');
const jwt = require('jsonwebtoken');

const router = express.Router();

//When the user sends a post request to this route, passport authenticates the user based on the
//middleware created previously
router.post('/signup', passport.authenticate('signup', { session : false }) , async (req, res, next) => {
  res.json({ 
    message : 'Signup successful',
    user : req.user 
  });
});

    ...
Signing the JWT

When the user logs in, the user information is passed to our custom callback, which in turn creates a signed token with that information. This token is then required to be passed along as a query parameter when accessing secure routes (which we'll create later).

routes/routes.js
    ....
router.post('/login', async (req, res, next) => {
  passport.authenticate('login', async (err, user, info) => {
    try {
      if( err || !user ){
        const error = new Error('An Error occurred');
        return next(error);
      }
      req.login(user, { session : false }, async (error) => {
        if( error ) return next(error);
        //We don't want to store sensitive information such as the
        //user password in the token, so we pick only the email and id
        const body = { _id : user._id, email : user.email };
        //Sign the JWT token and populate the payload with the user email and id
        const token = jwt.sign({ user : body }, 'top_secret');
        //Send back the token to the user
        return res.json({ token });
      });
    } catch (error) {
      return next(error);
    }
  })(req, res, next);
});

module.exports = router;

Note: We set { session : false } because we don't want to store the user details in a session; we expect the user to send the token on each request to the secure routes. This is especially useful for APIs, where the token can be used to track users, block them, etc. If you plan on using sessions together with JWTs to secure a web application, though, that may not be a great idea performance-wise; more details about this here.

Verifying the user token

So now we've handled user signup and login. The next step is allowing users with tokens to access certain secure routes. But how do we verify that the token sent by the user is valid and hasn't been manipulated, or isn't simply invalid? Let's do that next.

auth/auth.js
  ....
const JWTstrategy = require('passport-jwt').Strategy;
//We use this to extract the JWT sent by the user
const ExtractJWT = require('passport-jwt').ExtractJwt;

//This verifies that the token sent by the user is valid
passport.use(new JWTstrategy({
  //secret we used to sign our JWT
  secretOrKey : 'top_secret',
  //we expect the user to send the token as a query parameter with the name 'secret_token'
  jwtFromRequest : ExtractJWT.fromUrlQueryParameter('secret_token')
}, async (token, done) => {
  try {
    //Pass the user details to the next middleware
    return done(null, token.user);
  } catch (error) {
    done(error);
  }
}));

Note: If you need extra or sensitive details about the user that are not available in the token, you can use the _id stored in the token to retrieve them from the database.
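For example, here's a minimal sketch of the same JWT strategy that also loads the full user document from MongoDB. This variation isn't part of the tutorial's final code; it simply reuses the UserModel already required at the top of auth/auth.js:

passport.use(new JWTstrategy({
  secretOrKey : 'top_secret',
  jwtFromRequest : ExtractJWT.fromUrlQueryParameter('secret_token')
}, async (token, done) => {
  try {
    //Look up the full user document using the _id stored in the token
    const user = await UserModel.findById(token.user._id);
    //If the user no longer exists, fail the authentication
    if( !user ) return done(null, false);
    //Pass the full user document to the next middleware instead of just the token payload
    return done(null, user);
  } catch (error) {
    done(error);
  }
}));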

Creating secure routes

Now let's create some secure routes that only users with verified tokens can access.

routes/secure-routes.js
const express = require('express');

const router = express.Router();

//Lets say the route below is very sensitive and we want only authorized users to have access

//Displays information tailored according to the logged in user
router.get('/profile', (req, res, next) => {
  //We'll just send back the user details and the token
  res.json({
    message : 'You made it to the secure route',
    user : req.user,
    token : req.query.secret_token
  })
});

module.exports = router;

So now we're done creating the routes and authentication middleware. Let's put everything together and test it out.

app.js

const express = require('express');
const mongoose = require('mongoose');
const bodyParser = require('body-parser');
const passport = require('passport');
const app = express();
const UserModel = require('./model/model');

mongoose.connect('mongodb://127.0.0.1:27017/passport-jwt', { useMongoClient : true });
mongoose.connection.on('error', error => console.log(error) );
mongoose.Promise = global.Promise;

require('./auth/auth');

app.use( bodyParser.urlencoded({ extended : false }) );

const routes = require('./routes/routes');
const secureRoute = require('./routes/secure-routes');

app.use('/', routes);
//We plugin our jwt strategy as a middleware so only verified users can access this route
app.use('/user', passport.authenticate('jwt', { session : false }), secureRoute );

//Handle errors
app.use(function(err, req, res, next) {
  res.status(err.status || 500);
  res.json({ error : err });
});

app.listen(3000, () => {
  console.log('Server started')
});
Testing with Postman

Now that we've put everything together, let's use Postman to test our API authentication. First, we'll sign up with an email and password. We can send these details through the Body of the request. When that's done, click the Send button to initiate the POST request.

We can see the password is hashed, so anyone with access to the database will only see the hashed password. We used ten (10) salt rounds to increase security; you can read more about this here. Now let's log in with those credentials and get our token. Visit the /login route, passing the email and password you used previously, and then initiate the request.

Now that we have our token, we'll send it whenever we want to access a secure route. Let's try this by accessing the secure route /user/profile. We'll pass our token in a query parameter called secret_token; the token will be collected and verified, and we'll be given access to the route if it's valid.
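If you prefer the command line to Postman, the same three requests can be made with curl, assuming the server from app.js is running locally on port 3000:

# Sign up with an email and password (sent as form data, which body-parser's urlencoded middleware parses)
curl -X POST -d "email=test@example.com&password=secret" http://localhost:3000/signup

# Log in with the same credentials; the response contains the signed token
curl -X POST -d "email=test@example.com&password=secret" http://localhost:3000/login

# Access the secure route, passing the token in the secret_token query parameter
curl "http://localhost:3000/user/profile?secret_token=<YOUR_TOKEN>"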

As you can see, a valid token gives us access to the secure route. If you try accessing this route with an invalid token, the request will return an Unauthorized error.

Conclusion

JSON Web Tokens provide a secure way to implement authentication. An extra layer of security can be added by encrypting all the information within the token, making it even more secure.

Creating a RESTful Web API with Node.js and Express.js from scratch

In this article, I’ll show you step by step how to create a RESTful Web API with Node.js and Express.js by building a simple and useful Todo API. This article assumes you have basic JavaScript knowledge and can use a terminal.

You can also build a Web API in Node.js using frameworks other than Express.js, but Express.js is one of the most popular web frameworks for Node.js.

You can find the final source code of this Web API in this GitHub repository.

Let’s start to create our mentioned Web API.

Before start

If you have never used Node.js or npm package manager you should install them.

To check whether Node.js is already installed on your computer, open your terminal and run the node -v command. If you see your Node.js version, it's installed. Otherwise, use the link below.

Click here to download and install Node.js (You can choose LTS version)

And if you don’t have an IDE or text editor for writing JavaScript, I recommend Visual Studio Code.

Click here to download VS Code (Optional)

About express-generator

In fact, we could use the express-generator tool (https://expressjs.com/en/starter/generator.html), which is designed to scaffold an Express Web API quickly, but I want to create this API from scratch because that tool adds extra files and folder structures we don't need right now. You can use this handy tool next time you create a new Web API; I'm skipping it here to keep the article simple.

Creating Project

Go to your workspace root folder and create a new folder there named "todo-api".

Then create "package.json" and "server.js" files in the "todo-api" folder as shown below.

package.json

{
    "name": "todo-api",
    "version": "1.0.0",
    "scripts": {
        "start": "node server.js"
    },
    "dependencies": {
        "express": "^4.16.4"
    }
}

server.js

const http = require('http');
const express = require('express');
const app = express();
app.use(express.json());
app.use('/', function(req, res) {
    res.send('todo api works');
});
const server = http.createServer(app);
const port = 3000;
server.listen(port);
console.debug('Server listening on port ' + port);

After creating the above files, open your terminal in the "todo-api" folder and run the npm install command.

This command will install the project dependencies listed in the "package.json" file.

When the download finishes, the dependencies will be installed into the "node_modules" folder at the root of the "todo-api" folder.

After the packages are installed, run npm start to start our Web API.

Now our Web API is listening. To see the result, open your web browser, enter localhost:3000 in the address bar, and press Enter.

As a result, you’ll see our request handler's response in your browser: “todo api works”.

This is a dead simple Express.js Web API, and it needs some more development. For example, we need an API endpoint to get todo items, so let’s add a new endpoint for this.

Create a new folder named "routes" in the root of the "todo-api" folder.

Then create an "items.js" file inside the "routes" folder and put the following code in it.

Your final folder structure should look like this:

/todo-api
/node_modules
/routes
    items.js
package.json
server.js

items.js

const express = require('express');
const router = express.Router();
const data = [
    {id: 1, title: 'Finalize project', order: 1, completed: false, createdOn: new Date()},
    {id: 2, title: 'Book ticket to London', order: 2, completed: false, createdOn: new Date()},
    {id: 3, title: 'Finish last article', order: 3, completed: false, createdOn: new Date()},
    {id: 4, title: 'Get a new t-shirt', order: 4, completed: false, createdOn: new Date()},
    {id: 5, title: 'Create dinner reservation', order: 5, completed: false, createdOn: new Date()},
];
router.get('/', function (req, res) {
    res.status(200).json(data);
});
router.get('/:id', function (req, res) {
    let found = data.find(function (item) {
        return item.id === parseInt(req.params.id);
    });
    if (found) {
        res.status(200).json(found);
    } else {
        res.sendStatus(404);
    }
});
module.exports = router;

The initial code of the "items.js" file contains two endpoints. The first one gets all todo items and the second one gets the item that matches the given id parameter.

Before testing the items routes, we should register them in the "server.js" file.

Modify the "server.js" file as shown below to register the new item routes.

server.js

const http = require('http');
const express = require('express');
const itemsRouter = require('./routes/items');
const app = express();
app.use(express.json());
app.use('/items', itemsRouter);
app.use('/', function(req, res) {
    res.send('todo api works');
});
const server = http.createServer(app);
const port = 3000;
server.listen(port);
console.debug('Server listening on port ' + port);

Now run npm start to start our Web API.

Then open your web browser, enter localhost:3000/items in the address bar, and press Enter.

You’ll see todo items json array in the response body.

Then enter localhost:3000/items/3 in the address bar and press Enter.

You’ll see the todo item which has id 3 in the response body.

But we're not finished yet.

CRUD Operations and HTTP methods

I think we’ll need CRUD operations to Create, Read, Update and Delete todo items.

We have already two endpoints for getting items. So we need Create, Update and Delete endpoints.

Let’s also add these endpoints to the items.js file.

Our final "items.js" file and its endpoints should look like this.

const express = require('express');
const router = express.Router();

const data = [
  {id: 1, title: 'Finalize project',          order: 1, completed: false, createdOn: new Date()},
  {id: 2, title: 'Book ticket to London',     order: 2, completed: false, createdOn: new Date()},
  {id: 3, title: 'Finish last article',       order: 3, completed: false, createdOn: new Date()},
  {id: 4, title: 'Get a new t-shirt',         order: 4, completed: false, createdOn: new Date()},
  {id: 5, title: 'Create dinner reservation', order: 5, completed: false, createdOn: new Date()},
];

router.get('/', function (req, res) {
  res.status(200).json(data);
});

router.get('/:id', function (req, res) {
  let found = data.find(function (item) {
    return item.id === parseInt(req.params.id);
  });

  if (found) {
    res.status(200).json(found);
  } else {
    res.sendStatus(404);
  }
});

router.post('/', function (req, res) {
  let itemIds = data.map(item => item.id);
  let orderNums = data.map(item => item.order);

  let newId = itemIds.length > 0 ? Math.max.apply(Math, itemIds) + 1 : 1;
  let newOrderNum = orderNums.length > 0 ? Math.max.apply(Math, orderNums) + 1 : 1;

  let newItem = {
    id: newId,
    title: req.body.title,
    order: newOrderNum,
    completed: false,
    createdOn: new Date()
  };

  data.push(newItem);

  res.status(201).json(newItem);
});

router.put('/:id', function (req, res) {
  let found = data.find(function (item) {
    return item.id === parseInt(req.params.id);
  });

  if (found) {
    let updated = {
      id: found.id,
      title: req.body.title,
      order: req.body.order,
      completed: req.body.completed
    };

    let targetIndex = data.indexOf(found);

    data.splice(targetIndex, 1, updated);

    res.sendStatus(204);
  } else {
    res.sendStatus(404);
  }
});

router.delete('/:id', function (req, res) {
  let found = data.find(function (item) {
    return item.id === parseInt(req.params.id);
  });

  if (found) {
    let targetIndex = data.indexOf(found);

    data.splice(targetIndex, 1);
  }

  res.sendStatus(204);
});

module.exports = router;

Short Explanation

I want to briefly explain a few points about the code above.

First of all, you must have noticed that our API works on static data and keeps it in memory. All of our GET, POST, PUT and DELETE HTTP methods just manipulate a JSON array. The purpose of this is to keep the article simple and draw attention to the Web API structure.

Because of this, our POST method has some extra logic, such as calculating the next item id and order number.

You can modify the logic and data structures in these HTTP methods to use a database or whatever you want.
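For example, here's a sketch of the single-item GET endpoint rewritten against an asynchronous data layer. The db object below is a hypothetical stand-in (faked here with the same in-memory array) so you can see where a real database client or ORM would slot in:

// "db" stands in for your database client or ORM of choice;
// it's faked with the in-memory array so the sketch stays self-contained.
const db = {
  items: {
    findById: async (id) => data.find((item) => item.id === id),
  },
};

router.get('/:id', async function (req, res, next) {
  try {
    let found = await db.items.findById(parseInt(req.params.id));

    if (found) {
      res.status(200).json(found);
    } else {
      res.sendStatus(404);
    }
  } catch (err) {
    // Let the Express error handler deal with failures from the data layer
    next(err);
  }
});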

Testing API with Postman

We have tested the GET methods of our Web API in the browser and seen the responses, but we can’t directly test the POST, PUT and DELETE HTTP methods in a web browser.

If you want to test the other HTTP methods as well, you should use Postman or another HTTP utility.

Now I’ll show you how to test the Web API with Postman.

Before we start click here and install Postman.

When you first launch Postman after installing it, you’ll see a start window. Close it by clicking the close button in the top right corner. Then you should see the following screen.

An empty Postman request

Sending GET Request

Before sending a request to the API, we should start it by running the npm start command as we did before.

After starting the Web API and seeing the “Server listening on…” message, enter localhost:3000/items in the address bar as seen below and click the Send button. You'll see the todo items array as the API response, as shown below.

Sending a GET request with Postman

You can try it similarly by adding an item id to the request URL, like this: localhost:3000/items/3

Sending POST Request

To send a POST request and create a new todo item, enter localhost:3000/items in the address bar and change the HTTP verb to POST by clicking the arrow in front of the address bar, as seen below.

Sending a POST request with Postman

Before sending the POST request, add the request data to the body of the request by clicking the Body tab and selecting raw and JSON, as seen below.

Attaching a JSON body to POST request in Postman

Now click the Send button to send the POST request to the Web API. You should get a “201 Created” HTTP response code and see the created item in the response body.

To see the latest state of the todo items, send a GET request to localhost:3000/items. You should see the newly created item at the end of the list.

Sending PUT Request

Sending a PUT request is very similar to sending a POST request.

The most obvious difference is that the request URL should point to a specific item, like this: localhost:3000/items/3

You should also choose PUT as the HTTP verb instead of POST and, unlike POST, send all of the required data in the request body.

For example, you could send a JSON body in the PUT request as below.

An example JSON body for PUT request

{
    "title": "New title of todo item",
    "order": 3,
    "completed": false
}

When you click the Send button, you should get a “204 No Content” HTTP response code. You can check the item you updated by sending a GET request.

Sending DELETE Request

To send a DELETE request, change the request URL to point to a specific item id, like this: localhost:3000/items/3

Then select DELETE as the HTTP verb and click the Send button.

You should get a “204 No Content” HTTP response code as the result of the DELETE operation.

Send a GET request to see the latest state of the list.

About the DELETE Http Request

I want to say a few words about the DELETE HTTP request. You may have noticed something in our delete code: the DELETE request returns “204 No Content” in every situation.

HTTP DELETE requests are idempotent. What does that mean? If you delete a resource on the server by sending a DELETE request, it’s removed from the collection, and every subsequent DELETE request on the same resource won’t change the outcome. So you won’t get a “404 Not Found” on the second request; each request returns the same response whether the resource still exists or not. That's what an idempotent operation means.
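You can see this for yourself with two identical requests, shown here with curl and assuming the API is running locally:

# Delete item 3 (it exists, so it's removed); returns 204
curl -i -X DELETE http://localhost:3000/items/3

# Send the exact same request again; item 3 is already gone, but the response is still 204
curl -i -X DELETE http://localhost:3000/items/3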

Conclusion

Finally, we’ve tested all the HTTP methods of our Web API.

As you can see, it works just fine.

Thanks for reading ❤

If you liked this post, share it with all of your programming buddies!

Developing Modern APIs with Hapi.js, Node.js, and Redis

Learn how to develop modern backend APIs with Hapi.js, Node.js, and Redis.

In this article, you are going to learn how to develop modern APIs with Hapi.js and Node.js, while using Redis as the persistence layer. As it is not possible to release an API without a security layer, you will also learn how to secure your application with Auth0. If needed, you can find the final code developed throughout this article in this GitHub repository.

What is Hapi.js?

Hapi.js is a framework for creating backend APIs. What is nice about Hapi.js, when compared to other solutions like Express, is the coding-by-configuration architecture. As you will see, most of the "coding" is actually done by tweaking the vast configuration interface that Hapi.js provides to developers. This approach helps to split the common aspects of HTTP from the handler.

What Is Redis and What Will You Build?

Redis is an open-source, in-memory data store that provides an interface so applications can manipulate data based on a key-value approach. As everything in a Redis database is simply a value accessible through a key, fetching data from it is extremely fast. This characteristic of Redis makes this database perfect for applications like to-do lists.

So, in this article, you will use Redis to act as the persistence layer of a backend API that supports a to-do list application. You won't develop the frontend application in this article, but you will soon, in an upcoming one.

Note: In this article, you are going to use Hapi.js 17. This version has breaking changes from version 16.

What Is Docker and Why Do You Care?

To keep your machine clean, you are not going to install Redis directly on your operating system. Instead, you are going to run Redis inside a Docker container. Docker, if you don't know, is a solution that enables users to run programs completely isolated from each other. Docker achieves this by packaging these programs into containers that work similarly to virtual machines.

However, containers are far less expensive (i.e., more lightweight) than traditional virtual machines. For example, you can easily bootstrap a container that uses NGINX in front of a Node.js instance to serve a web app with 16MB or less. Also, Docker uses a file called a Dockerfile that makes it easy to share a container's configuration with others.

In this article, you are going to download and use a pre-built Redis container that allows you to use Redis fresh out of the box, with no setup.

Bootstrapping a Hapi.js API

Your API will contain the main server setup and individual files for each route you will need to define. Basically, you will create a project that contains the following structure:

  • src/: A directory that will hold code related to the server setup.
  • src/routes: A directory where you will define the endpoints of your API.

So, open a terminal, locate the directory where you want to store your project in, and run the following commands:

# e.g., move to your home dir (or anywhere else)
cd ~

# create a directory for your project
mkdir nodejs-hapijs-redis

# move into it
cd nodejs-hapijs-redis

# and create both subdirectories
mkdir -p src/routes

After that, you can initialize your main directory as an NPM project and install some dependencies on it:

# initialize this directory as an NPM project
npm init -y

# install your project's dependencies
npm install --save boom good good-console good-squeeze hapi hapi-auth-jwt2 hapi-require-https inert joi jwks-rsa lout node-env-file redis uuid vision

As you can see, you will need to install a considerable number of dependencies. Throughout this article, you will see how each one fits in. However, the following list gives a brief introduction to them:

  • boom: This is a library that tightly integrates with Hapi.js to throw HTTP-friendly error objects.
  • good: This is a library that you will plug into Hapi.js to monitor and report on a variety of server events.
  • good-console: This library is useful for turning good server events into formatted strings.
  • good-squeeze: This library is useful for filtering events based on the good event options.
  • hapi: This is the main package of Hapi.js itself.
  • hapi-auth-jwt2: This is an authentication scheme/plugin for Hapi.js apps using JSON Web Tokens.
  • hapi-require-https: This is a library that will help you force secure connections (i.e., HTTPS).
  • inert: This is a library that helps you serve static file and directory handlers in your Hapi.js API.
  • joi: This library introduces an object schema description language and a validator for JavaScript objects.
  • jwks-rsa: This library retrieves RSA public keys from a JWKS (JSON Web Key Set) endpoint.
  • lout: This library helps you create the API documentation for your Hapi.js backend.
  • node-env-file: This library parses and loads environment files into a Node.js environment (i.e., into the process.env object).
  • redis: This is a Redis client for Node.js applications.
  • uuid: This library generates RFC-compliant UUIDs in JavaScript.
  • vision: This library enables templates rendering for Hapi.js.

Now that you know what you just installed, open the package.json file that NPM created for you and replace its scripts property with this:

"scripts": {
  "start": "node index.js"
}

Note: You might also want to start Git (or any other version control system) now and start committing your work. It's always a good idea to use tools like Git to manage your source code.

Initializing Redis with Docker

As mentioned, you will bootstrap a Redis instance in your local machine with the help of Docker. Therefore, before proceeding you will have to install Docker locally. After installing it, you can test the installation by running the following command:

docker --version

If everything goes fine, you can issue this command to run Redis locally (in a Docker container, of course):

docker run --name nodejs-hapijs-redis \
    -p 6379:6379 \
    -d redis

If this is the first time you are running Redis locally with the help of Docker, this command will output Unable to find image 'redis:latest' locally in your terminal and will start downloading a Redis image from Docker Hub. For this article, you don't need to learn how Docker works. Issuing the command above suffices for you to move along. However, after you finish with this article, make sure you learn more about Docker. The tool is amazing.

Signing Up to Auth0

To start with a secure backend from scratch, you will sign up for a free Auth0 account now (i.e., if you don't have one yet) and you will configure your project to use this identity provider.

If you don't know, Auth0 is a global leader in Identity-as-a-Service (IDaaS) that provides thousands of enterprise customers with modern identity solutions. Alongside with the classic username and password authentication process, Auth0 allows you to add features like Social Login, Multi-factor Authentication, and much more with just a few clicks.

So, after you sign up for Auth0, you can head to the APIs section of your dashboard and click on Create API. Then, on the dialog that Auth0 shows, you will have to provide a Name for your API (e.g., "Hapi.js Tutorial") and an Identifier (e.g., http://localhost:3000). The name of your API is just a label so you can easily remember what the API is about. The identifier is a string that you will use while configuring your backend. This identifier doesn't really have to be an URL, as Auth0 won't call it in any moment, but it's advised to use one.

After filling out the form, click on Create so Auth0 finishes the creation for you.

Creating an environment file

As you will have the configuration for your Auth0 account, you will keep it in a separate file so you can easily switch between a production and testing environment. As such, create a file called .env in your project root and put the following contents in it:

AUTH0_AUDIENCE=http://localhost:3000
AUTH0_DOMAIN=<YOUR_AUTH0_DOMAIN>
HOST=localhost
PORT=3000
REDIS_HOST=localhost
REDIS_PORT=6379
SSL=false

Replace <YOUR_AUTH0_DOMAIN> with the domain you chose while creating your Auth0 account (e.g., blog-samples.auth0.com). The other configuration variables will work in your local environment, unless you chose another identifier for your API. If that is the case, you will have to set the correct value to the AUTH0_AUDIENCE variable.

Note: The SSL variable above defines if your API will accept only requests through a secure channel (i.e., HTTPS) or not. This variable will be used by the hapi-require-https library that you installed before.

Creating the Hapi.js Server

With the environment variables properly defined, you will have to create a script to start your Hapi.js server. To do so, create a file called index.js in the project root (i.e., in the nodejs-hapijs-redis directory) and add the following code into it:

require('node-env-file')(`${__dirname}/.env`);

const redis = require('redis');
const createServer = require('./src/server');
const {promisify} = require('util');

const start = async () => {
  const server = await createServer(
    {
      port: process.env.PORT,
      host: process.env.HOST,
    },
    {
      enableSSL: process.env.SSL === 'true',
    }
  );

  const redisClient = redis.createClient(
    {
      host: process.env.REDIS_HOST,
      port: process.env.REDIS_PORT,
    }
  );

  redisClient.lpushAsync = promisify(redisClient.lpush).bind(redisClient);
  redisClient.lrangeAsync = promisify(redisClient.lrange).bind(redisClient);
  redisClient.llenAsync = promisify(redisClient.llen).bind(redisClient);
  redisClient.lremAsync = promisify(redisClient.lrem).bind(redisClient);
  redisClient.lsetAsync = promisify(redisClient.lset).bind(redisClient);

  redisClient.on("error", function (err) {
    console.error("Redis error.", err);
  });

  server.app.redis = redisClient;

  await server.start();

  console.log(`Server running at: ${server.info.uri}`);
  console.log(`Server docs running at: ${server.info.uri}/docs`);
};

process.on('unhandledRejection', (err) => {
  console.error(err);
  process.exit(1);
});

start();

As you can see, the first thing your script does is to load the environment variables you just defined. Then, it uses a function called createServer to, well, create a server. After that, the script creates a client to Redis and uses the promisify function provided by Node.js to make the functions provided by the client return JavaScript Promises (using promises, and the new async/await syntax, will make your life much easier). Also, you bind the Redis object to server.app.redis so you have access to it in the routes to store and retrieve data.

Perhaps you didn't realize (or perhaps you did), but the createServer function used in the script above doesn't exist yet. This function, as stated on line #4, is expected to be defined on a module called server in the src directory.

Therefore, you can create the src/server.js file and add the following code to it:

const Hapi = require('hapi');
const jwksRsa = require('jwks-rsa');

const validateFunc = async (decoded) => {
  return {
    isValid: true,
    credentials: decoded,
  };
};

module.exports = async (serverOptions, options) => {
  const server = Hapi.server(
    Object.assign({
      port: 3001,
      host: 'localhost',
      routes: {
        cors: {
          origin: ['*'],
        },
      },
    }, serverOptions),
  );

  // Redirect to SSL
  if (options.enableSSL) {
    console.log('Setting SSL');
    await server.register({plugin: require('hapi-require-https')});
  } else {
    console.log('Not setting SSL');
  }

  await server.register([
    require('vision'),
    require('inert'),
    {
      plugin: require('lout'),
      options: {
        endpoint: '/docs',
      },
    },
    {
      plugin: require('good'),
      options: {
        ops: {
          interval: 1000,
        },
        reporters: {
          consoleReporter: [
            {
              module: 'good-squeeze',
              name: 'Squeeze',
              args: [{response: '*'}],
            },
            {
              module: 'good-console',
            },
            'stdout',
          ],
        },
      },
    },
  ]);

  await server.register(require('hapi-auth-jwt2'));

  server.auth.strategy('jwt', 'jwt', {
    complete: true,
    key: jwksRsa.hapiJwt2KeyAsync({
      cache: true,
      rateLimit: true,
      jwksRequestsPerMinute: 5,
      jwksUri: `https://${process.env.AUTH0_DOMAIN}/.well-known/jwks.json`,
    }),
    verifyOptions: {
      audience: process.env.AUTH0_AUDIENCE,
      issuer: `https://${process.env.AUTH0_DOMAIN}/`,
      algorithms: ['RS256'],
    },
    validate: validateFunc,
  });

  server.auth.default('jwt');

  server.route(require('./routes.js'));

  return server;
};

The main export from this code is a function that creates and returns a valid Hapi.js server. This function starts by accepting arguments from the index.js file and by creating the server. It provides some default configuration values like port and host so that everything works if the caller doesn't specify them, then overrides them with the ones provided by the caller (if any).

After creating the Hapi.js server (Hapi.server()), this script decides, based on the configuration passed, if it is going to use SSL or not. Then, the script configures the plugins you installed before (e.g., vision, inert, and lout) in your Hapi.js server.

Finally, the script secures the server by using the jwt strategy (server.auth.strategy('jwt', ...)) and by making it the default authentication method (server.auth.default('jwt')).

The function validateFunc (defined at the top of the script) is given users' credentials and returns an object telling Hapi.js whether these users have access to the current resource or not. In this simple example, you allow all users access if they have a valid token, but you can be more restrictive by refactoring this function.
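For instance, a more restrictive validateFunc might look like the sketch below. It assumes your access tokens carry a scope claim, which depends on how you configure authorization in Auth0 and is not something this article sets up:

const validateFunc = async (decoded) => {
  // Depending on the `complete` option, the claims may be nested under decoded.payload
  const claims = decoded.payload || decoded;
  // Scopes arrive as a single space-separated string on the token
  const scopes = (claims.scope || '').split(' ');

  // Reject tokens that were not granted the (hypothetical) "read:todos" scope
  if (!scopes.includes('read:todos')) {
    return {isValid: false};
  }

  return {
    isValid: true,
    credentials: claims,
  };
};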

The last thing this script does, besides returning the Hapi.js server instance, is define that it will load the endpoints (also known as routes) from a module called routes. You will define this module in the next section.

Defining Routes on Hapi.js

Now, it is time to learn how to define endpoints (i.e., routes) in your Hapi.js server. In the server module, you called the server.route function, which accepts an array of routes for your server. As such, you could simply define these routes directly into the server module. However, to make the code more readable and organized, you will put each route in a different file.

To do so, create a file called src/routes.js and copy the following into it:

module.exports = [

  './routes/todo_get',
  './routes/todo_post',
  './routes/todo_delete',

].map((elem) => require(elem));

This code maps over each filename and returns an array of imported routes. As you can imagine, you still have to define these files and routes.

Defining a Route to Post new Items

For your first route, you will create an endpoint that enables users to add new items to their to-do lists. To do so, make a file called src/routes/todo_post.js with the following contents:

const Joi = require('joi');
const Boom = require('boom');

module.exports = {
  method: 'POST',
  path: '/todo',
  options: {
    auth: 'jwt',
    validate: {
      payload: {
        item: Joi.string().required().notes('Text to store in list')
      },
    },
    description: 'Add item',
    notes: 'Add an item to the list',
    tags: ['api'],
  },
  handler: async (request, h) => {
    let {sub: redispath} = request.auth.credentials;
    let {item: redisvalue} = request.payload;
    let {redis} = request.server.app;

    try {

      let count = await redis.lpushAsync(redispath, redisvalue);

      return h.response({
        count
      }).code(201);

    } catch (e) {
      return Boom.badImplementation(e);
    }
  }
};

The export from this file is a JSON object that represents a route for Hapi.js. The method and path properties tell Hapi.js what HTTP method and what route is required to call the handler code. In the options, you specify jwt as the authentication required to access this route. The description, notes, and tags document the route for others using it.

The validate object is an extremely useful courtesy of the joi library. This allows you to specify what inputs are required for the route and, if not met, Hapi.js will automatically throw an error for you. All that is required for this route is an item that comes as the payload of requests. This item must be a string and is required (string().required()).

Finally, the handler runs your route and returns a value to Hapi.js. You use the JWT subject as the key for the Redis key-value pair, and the value of this key is the string sent by the user. You use the new promisified Redis functions to add the item to Redis, and you return the number of items in the array (with a 201 response code).

If anything goes wrong, your Hapi.js server will send an HTTP error code back using the Boom library.

Defining a Route to Delete Items

To allow users to delete items, create a file called src/routes/todo_delete.js with the following contents:

const Joi = require('joi');
const Boom = require('boom');

module.exports = {
  method: 'DELETE',
  path: '/todo',
  options: {
    auth: 'jwt',
    validate: {
      payload: {
        index: Joi.number().min(0).required().notes('Index to delete'),
      },
    },
    description: 'Delete item',
    notes: 'Delete an item from the todo list',
    tags: ['api'],
  },
  handler: async (request, h) => {
    let {sub: redispath} = request.auth.credentials;
    let {index: redisindex} = request.payload;
    let {redis} = request.server.app;

    try {
      await redis.lsetAsync(redispath, redisindex, '__DELETE__');
      await redis.lremAsync(redispath, 1, '__DELETE__');

      return h.response({}).code(200);
    } catch (e) {
      return Boom.badImplementation(e);
    }
  }
};

The route is very similar to the POST route. You define the endpoint as an HTTP DELETE route with a required index parameter to delete a value from Redis. To delete the item from Redis by index, you first overwrite the value of that entry, and then delete entries with that new value.

What is Hypertext Application Language (HAL)?

When you define your final route for retrieving the todo items, you will borrow some features from the HAL specification. This spec is designed to make it easy to traverse APIs without having to guess endpoints.

For your case, you will page the results when retrieving items, so you will include a link to the next page of results in the response. This way, the client applications that use your API won't have to generate the links themselves.

Defining a Route to Get All Items

Finally, to define an endpoint where users will be able to get all their to-do items, create a file called src/routes/todo_get.js with the following contents:

const Joi = require('joi');
const Boom = require('boom');

module.exports = {
  method: 'GET',
  path: '/todo',
  options: {
    auth: 'jwt',
    validate: {
      query: {
        start: Joi.number().min(0).default(0).notes('Start index of results inclusive'),
        results: Joi.number().min(1).max(100).default(10).notes('Number of results to return'),
      },
    },
    description: 'Get items',
    notes: 'Get items from todo list paged',
    tags: ['api'],
  },
  handler: async (request, h) => {
    let {redis} = request.server.app;
    let {sub: redispath} = request.auth.credentials;
    let {start, results} = request.query;

    try {
      let value = await redis.lrangeAsync(redispath, start, start + (results - 1));
      let count = await redis.llenAsync(redispath);

      if (!value) value = [];

      return h.response({
        nextlink: `${request.url.pathname}?start=${start + results}&results=${results}`,
        value,
        count
      });
    } catch (e) {
      return Boom.badImplementation(e);
    }
  }
};

This module (or file) defines a GET HTTP route with two optional query string parameters (with default values set). By using these parameters, your client can specify the first element (start index) and the number of results they need. Note that this script gets the results from Redis and also the total number of results. This information is important so the client can display how many items the user has.

In the response, you add a nextlink property with the API URL to call for the next set of results.
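For example, after receiving the first page, a client could simply request the URL from nextlink. With curl and a placeholder access token, that would look something like this:

# Fetch the next page of results using the nextlink returned by the previous response
curl -H 'Authorization: Bearer <YOUR_ACCESS_TOKEN>' "http://localhost:3000/todo?start=10&results=10"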

Running and Using your Hapi.js API

That's it! You just finished creating your Node.js backend API with the help of Hapi.js and Redis. With all these files in place, you can take your API for a spin. To do so, issue the following command on the terminal (just make sure you are in the correct directory: nodejs-hapijs-redis):

npm start

Then, if you go to the /docs resource, you will see the documentation of your Hapi.js API:

Now, to test if your endpoints are really secured, you can issue the following curl commands:

curl http://localhost:3000/todo

curl -X POST -H 'Content-Type: application/json' -d '{
  "item": "It should not work."
}' http://localhost:3000/todo

Both commands above should return the following response:

{
  "statusCode": 401,
  "error": "Unauthorized",
  "message": "Missing authentication"
}

That is, your server is telling you that it is expecting you to be authenticated somehow. The server doesn't specify that it's expecting an access token from Auth0 because you shouldn't be adding details like that about your services. However, you know that this is what you need.

So, there are multiple ways to fetch a token from Auth0. The strategy that you will use will depend on what context you are in. For example, if you are on a Single Page Application (SPA), you will use what is called the Implicit Grant. If you are on a native, mobile application, you will use the Authorization Code Grant Flow with PKCE. However, for a simple test like this one, you can use your Auth0 dashboard to get one.

So, head back to the APIs section in your Auth0 dashboard, click on the API you created before, and then click on the Test section of this API. There, you will find a button called Copy Token. Click on this button to copy an access token to your clipboard.

Then, with this token in your clipboard, go back to your terminal and execute the following commands:

# set a variable with your access token
ACCESS_TOKEN=<YOUR_ACCESS_TOKEN>

# use the token to insert an item
curl -X POST -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer '$ACCESS_TOKEN -d '{
  "item": "Learn about more about Docker, Auth0, and Redis."
}' http://localhost:3000/todo

Note: You will have to replace <YOUR_ACCESS_TOKEN> with the token copied from Auth0.

The second command, the one that issues an HTTP request with your token, will create a new item in your to-do list so you can remember that you have to "learn more about Docker, Auth0, and Redis." As the response to this request, your API will send this to you:

{
  "count": 1
}

This answer tells you that you have a single record on your to-do list right now, as you would expect. Now, to see this item, you can issue the following command:

# in the same terminal because you need $ACCESS_TOKEN
curl -H 'Authorization: Bearer '$ACCESS_TOKEN http://localhost:3000/todo

This command will output the following response from the Hapi.js API:

{
  "nextlink": "/todo?start=10&results=10",
  "value": ["Learn about more about Docker, Auth0, and Redis."],
  "count": 1
}

As you can see, your to-do item was properly inserted. Now, to remove this item, you can issue this command:

curl -X DELETE -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer '$ACCESS_TOKEN -d '{
  "index": 0
}' http://localhost:3000/todo

In this case, you are issuing a DELETE request with index equal to 0 so your API removes the first element from your to-do list. Cool, you just used your API for the first time!

Conclusion and Next Steps

In this article, you learned how to create modern APIs with Hapi.js, Node.js, and Redis. Also, you learned how to integrate your API with Auth0 to take advantage of the state-of-the-art security provided by this company. All of that, without struggling too much.

However, you wouldn't expect end-users to use a REST API directly through the command-line interface or through generic HTTP clients like Postman, would you? As such, in the next article, you will learn how to create a Single Page Application to interact with your API. To create this application, you will use a modern approach based on web components and LitElement. Stay tuned!

Top 7 Most Popular Node.js Frameworks You Should Know

Node.js is an open-source, cross-platform, runtime environment that allows developers to run JavaScript outside of a browser.

One of the main advantages of Node is that it enables developers to use JavaScript on both the front-end and the back-end of an application. This not only makes the source code of any app cleaner and more consistent, but it significantly speeds up app development too, as developers only need to use one language.

Node is fast, scalable, and easy to get started with. Its default package manager is npm, which means it also sports the largest ecosystem of open-source libraries. Node is used by companies such as NASA, Uber, Netflix, and Walmart.

But Node doesn't come alone. It comes with a plethora of frameworks. A Node framework can be pictured as the external scaffolding that you can build your app in. These frameworks are built on top of Node and extend the technology's functionality, mostly by making apps easier to prototype and develop, while also making them faster and more scalable.

Below are 7 of the most popular Node frameworks at this point in time (ranked from high to low by GitHub stars).

Express

With over 43,000 GitHub stars, Express is the most popular Node framework. It brands itself as a fast, unopinionated, and minimalist framework. Express acts as middleware: it helps set up and configure routes to send and receive requests between the front-end and the database of an app.

Express provides lightweight, powerful tools for HTTP servers. It's a great framework for single-page apps, websites, hybrids, or public HTTP APIs. It supports over fourteen different template engines, so developers aren't forced into any specific one.

Meteor

Meteor is a full-stack JavaScript platform. It allows developers to build real-time web apps, i.e. apps where code changes are pushed to all browsers and devices in real-time. Additionally, servers send data over the wire, instead of HTML. The client renders the data.

The project has over 41,000 GitHub stars and is built to power large projects. Meteor is used by companies such as Mazda, Honeywell, Qualcomm, and IKEA. It has excellent documentation and a strong community behind it.

Koa

Koa is built by the same team that built Express. It uses ES6 methods that allow developers to work without callbacks, and it gives developers more control over error handling. Koa has no middleware bundled within its core, which means developers have more control over configuration, but also that traditional Node middleware (e.g. functions using req, res, next) won't work with Koa.

Koa already has over 26,000 GitHub stars. The Express developers built Koa because they wanted a lighter framework that was more expressive and more robust than Express. You can find out more about the differences between Koa and Express here.

Sails

Sails is a real-time, MVC framework for Node that's built on Express. It supports auto-generated REST APIs and comes with an easy WebSocket integration.

The project has over 20,000 stars on GitHub and is compatible with almost all databases (MySQL, MongoDB, PostgreSQL, Redis). It's also compatible with most front-end technologies (Angular, iOS, Android, React, and even Windows Phone).

Nest

Nest has over 15,000 GitHub stars. It uses progressive JavaScript and is built with TypeScript, which means it comes with strong typing. It combines elements of object-oriented programming, functional programming, and functional reactive programming.

Nest is packaged in such a way that it serves as a complete development kit for writing enterprise-level apps. The framework uses Express, but is compatible with a wide range of other libraries.

LoopBack

LoopBack is a framework that allows developers to quickly create REST APIs. It has an easy-to-use CLI wizard and allows developers to create models either on their schema or dynamically. It also has a built-in API explorer.

LoopBack has over 12,000 GitHub stars and is used by companies such as GoDaddy, Symantec, and the Bank of America. It's compatible with many REST services and a wide variety of databases (MongoDB, Oracle, MySQL, PostgreSQL).

Hapi

Similar to Express, hapi serves data by intermediating between the server side and the client side. As such, it can serve as a substitute for Express. Hapi allows developers to focus on writing reusable app logic in a modular and prescriptive fashion.

The project has over 11,000 GitHub stars. It has built-in support for input validation, caching, authentication, and more. Hapi was originally developed to handle all of Walmart's mobile traffic during Black Friday.