Hapi.js

This topic collects courses and tutorials on Hapi.js, a server framework for Node.js used to build powerful, scalable applications with minimal overhead and full out-of-the-box functionality.
Nandu Singh

How to Enable Cross Origin Access Control in Hapi.JS

cors - the Cross-Origin Resource Sharing protocol allows browsers to make cross-origin API calls. CORS is required by web applications running inside a browser which are loaded from a different domain than the API server. CORS headers are disabled by default (false). To enable, set cors to true, or to an object with the following options:

origin - a strings array of allowed origin servers ('Access-Control-Allow-Origin'). The array can contain any combination of fully qualified origins along with origin strings containing a wildcard '*' character, or a single '*' origin string. Defaults to any origin ['*'].

maxAge - number of seconds the browser should cache the CORS response (‘Access-Control-Max-Age’). The greater the value, the longer it will take before the browser checks for changes in policy. Defaults to 86400 (one day).

headers - a strings array of allowed headers (‘Access-Control-Allow-Headers’). Defaults to [‘Accept’, ‘Authorization’, ‘Content-Type’, ‘If-None-Match’].

additionalHeaders - a strings array of additional headers to headers. Use this to keep the default headers in place.

exposedHeaders - a strings array of exposed headers (‘Access-Control-Expose-Headers’). Defaults to [‘WWW-Authenticate’, ‘Server-Authorization’].

additionalExposedHeaders - a strings array of additional headers to exposedHeaders. Use this to keep the default headers in place.

credentials - if true, allows user credentials to be sent (‘Access-Control-Allow-Credentials’). Defaults to false.
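Taken together, these options can be combined into a single configuration object. Here is a hedged sketch; the origins, header names, and maxAge value below are illustrative examples, not the defaults:

```javascript
// Illustrative cors configuration object; all values here are examples.
const corsOptions = {
    origin: ['https://app.example.com', 'https://*.example.com'], // allowed origins
    maxAge: 600,                                 // cache preflight responses for 10 minutes
    additionalHeaders: ['X-Request-Id'],         // appended to the default allowed headers
    additionalExposedHeaders: ['X-Total-Count'], // appended to the default exposed headers
    credentials: true                            // sets Access-Control-Allow-Credentials
};
```

This object can then be used wherever a `cors` value is accepted, either per route or server-wide, as the examples below show.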

To enable CORS (Access-Control-Allow-Origin) for a single route we can add the cors property to route.options object:

server.route({
    method: 'GET',
    path: '/index',
    options: {
        cors: true,
        handler: async (req, h) => {
            return "OK";
        }
    }
});

To enable CORS (Access-Control-Allow-Origin) for all routes in Hapi server we can set the cors value to true:

const server = Hapi.server({
    port: 9000,
    host: 'localhost',
    routes: {
        cors: {
            origin: ['*'] // an array of origins or 'ignore'           
        }
    }
});

Access-Control-Allow-Credentials

With error: Credentials flag is ‘true’, but the ‘Access-Control-Allow-Credentials’ header is ‘’. It must be ‘true’ to allow credentials. Origin ‘http://localhost:9000’ is therefore not allowed access.

You need to set the credentials option on your route cors setting:

const server = Hapi.server({
    port: 9000,
    host: 'localhost',
    routes: {
        cors: {
            origin: ['*'], // an array of origins or 'ignore'    
            credentials: true // boolean - 'Access-Control-Allow-Credentials'
        }
    }
});

Or, to configure it per route:

server.route({
    method: 'GET',
    path: '/index',
    options: {
        cors: {
            credentials: true
        },
        handler: async (req, h) => {
            return "OK";
        }
    }
});
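On the client side, the request must also opt in to sending credentials. A hedged sketch of the browser-side call (the URL is hypothetical; `credentials: 'include'` is what makes the browser require `Access-Control-Allow-Credentials: true` in the response):

```javascript
// Request options for a credentialed cross-origin call.
const requestOptions = {
    method: 'GET',
    credentials: 'include' // send cookies/authorization headers across origins
};

// In a browser (or Node 18+ with global fetch):
// fetch('http://localhost:9000/index', requestOptions)
//     .then(res => res.text())
//     .then(console.log);
```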

Happy coding!

#hapi #hapi-js #javascript #nodejs


What is HAPI token?

HAPI: Onchain Cybersecurity Protocol for DeFi projects

Teaser

Launching any DeFi product is similar to launching a rocket: after the rocket takes off, you have a minimal toolset to influence its flight. You can send commands or even update the software. However, any unforeseen event could lead to a disaster, and you have no way of influencing it any further. You become a passive observer.

DeFi is similar to this in many ways. You create code, conduct a security audit, launch your smart contract into space (blockchain) and start praying that everything goes according to plan.

How do cybersecurity risks occur at DeFi?

Before we introduce HAPI, let's have a look at how most DeFi projects work and what kind of security issues might arise.

1. Blockchain:

A blockchain is a database stored on multiple computers at once, all of which verify that no one deceives anyone else and that all of the records within the database are correct. A smart contract is a program that can be run within this database.

Example #1: 0x1111 is Alex’s wallet. We can write a smart contract crediting 10 HAI tokens to Alex if he has 10 ETH in his wallet. Every time Alex runs this contract, 10 HAI tokens will be sent to his wallet (as long as there are enough tokens on the smart contract). In this case, the program will verify whether there are 10 ETH on Alex’s wallet every time.

Example #2: 0x1111 is Alex’s wallet. We can write a smart contract crediting 10 HAI tokens to Alex if the price of gold on stock exchange is higher than $2000.

However, where can the smart contract get the price of gold from?

This is one of the big challenges in building smart contracts — we can use only the on-chain data in smart contracts’ implementation (only those that are already in our distributed database).

So, how can we record this data into the blockchain?

2. Oracles:

This is how Oracles appeared — servers recording the data we need onto the blockchain. A smart contract defines what kind of data it needs on the blockchain. Oracles monitor these requests, take the information from the outside world (usually via an API), and record it onto the blockchain.

However, this is where security issues might arise. Smart contracts are not aware of where the information is coming from and how reliable it is.

3. API or Application Programming Interface:

An API is an interface we can use to interact with programs, apps or devices. You can log in to your bank's client app and it will show you your balance by connecting to the bank's server via an API. You can also launch Coingecko's mobile app and use its API to see cryptocurrency prices. In each case, the request is sent in a very precise form (if you want to receive the required information, learn to ask the right questions).
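To make that "precise form" concrete, here is a small sketch in JavaScript of how a client might phrase such a question to a public price API; the endpoint and parameters follow CoinGecko's public API and are used purely as an illustration:

```javascript
// Build a precise request: "what is the price of bitcoin in USD?"
const base = 'https://api.coingecko.com/api/v3/simple/price';
const query = new URLSearchParams({ ids: 'bitcoin', vs_currencies: 'usd' });
const url = base + '?' + query.toString();

// In a browser or recent Node, the call itself would be:
// fetch(url).then(res => res.json()).then(console.log);
// response shape: { bitcoin: { usd: <price> } }
```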

This is what we get — the user launches a smart contract, it contacts the Oracle’s smart contract and requests data. Oracles (servers) contact the required place (bank, exchange) via API, receive the necessary information and record it into the blockchain.

An onchain cybersecurity protocol to create trustless Oracles

HAPI is a set of cross-chain smart contracts that are embedded into DeFi products to bring them to a new security level. Also, HAPI's Oraclizing and DAO system delivers SaaS in the DeFi environment that prevents hack attempts.

How does HAPI work?


Who is a Data Provider?

The main Data Provider is selected by a voting process in HAPI. It analyzes and marks all of the suspicious addresses, and becomes the main provider of this information to the blockchain. Upon request from exchanges (via API), the service records all of the suspicious addresses into the blockchain; their ban period varies from 12 hours to a permanent ban.

HAPI example use case: blocking the movement of stolen coins between DeFi and exchanges

Let’s say a hacker breaks into an exchange’s hot wallet and begins to transfer funds out of the exchange.

The exchange sends the address and coin details immediately to HAPI.

Every exchange connected to HAPI receives this information almost instantly and can block these transactions and funds until the situation is resolved. DEXs use smart contracts, allowing them to reject requests from suspicious addresses using HAPI. The momentum of the attack is slowed, and a portion of the funds is blocked.

Key points

  • Will be built for the most popular blockchains (Ethereum, VeChain, Polkadot, etc.)
  • All DeFi projects that add the HAPI module will substantially increase their security
  • HAPI is to become a security standard for DEXs, lending protocols, derivatives protocols and other DeFi classes
  • The data provider is elected via a DAO vote
  • The cost of reputational loss to a Data Provider is significantly higher than the potential damage caused by false data
  • The data will be on-chain and publicly available
  • Requests to change or add data will incur a fee

HAPI token

The HAPI token is an ERC20 token minted on the Ethereum blockchain.

The key utility of HAPI is to circulate between data submitters and security oracles.

Utility

Users stake HAPI tokens to be able to participate in the project's governance. The governance is conducted by a voting procedure. The voting involves staking HAPI tokens to support or reject voting proposals.

HAPI holders in fact act as security arbiters for the whole DeFi industry. Selecting the trusted oracles defines the direction and the speed of crypto mass adoption.

HAI token holders are also able to stake their tokens to receive HAPI tokens as a reward. The total supply of HAPI token increases over time following an inflation model (see below). This supply is accumulated and distributed among HAI token holders as long as they have their tokens staked. One can vote by staking with his or her HAPI tokens.

Every transaction to be submitted to the Security Oracle database will require HAPI tokens, which are then sent to Oracles for their review work.

The Data Provider determines the final price according to the demand for the off-chain resource and similar information supply.

Key HAPI utilities

  • **Data submission fee.** Provides rights for the customer to submit any information connected with a hack or suspicious wallet.
  • **Governance.** Provides governance rights for users (DP election by DAO). Each HAPI token stakeholder can participate in governance conducted by a voting procedure. The voting involves staking HAPI tokens to support or reject voting proposals.
  • **Oracle rewards.** Serves as a payment method to Oracles for the review and audit work done on the submitted data.
  • **DeFi project audit report submission.** DeFi projects will legitimize their code by submitting it to a unified audit reports data centre.

HAPI token distribution

HAI community members are the main beneficiaries of the HAPI project entering HFoundation.

HAI holders have a unique opportunity to enter the HAPI project at the most beneficial price.

HAPI token will also become exclusively available for farming by staking HAI on the HFoundation cross-blockchain staking platform.

How will the tokens be unlocked?


HAPI Token Sale

Total Supply: 1,000,000 HAPI.

There will be three rounds of HAPI token sales.

The total number of tokens for sale is 480,000 HAPI.

Round #1 — HAI round

Our most exclusive and valuable round is reserved and held specifically for the Hacken community using HAI tokens. The price for the community is $5 per token. 240,000 HAPI tokens will be sold. The initial unlocking will represent 10% of the purchase amount and 10% monthly.

Joining the HAI round is simple. You send HAI (VIP180 VeChain) tokens from your address to the address that will be specified in the sale details. Tokens will be credited to a similar address on the ETH network. You will receive detailed instructions on how to open a wallet with the same address in the ETH network.

50% of all HAI tokens collected on the sale will be burned immediately. The rest will be locked up for two years.

Max amount per 1 VeChain address is 500,000 HAI.

Oversubscribed HAI would be returned to HAI holders proportionally.

Round #2 — Private round

The private round is held in ETH for strategic partners. The price for this round is $7.50, and 180,000 HAPI tokens will be sold. The initial unlocking will be 15% of the purchase amount, then 10% monthly. The minimum lot is the equivalent of $50k; the maximum is $100k.

Round #3 — Public round

The public round is held in ETH. The price for this round is $10. 60,000 HAPI tokens will be sold. The initial unlocking will be 20% of the purchase amount and 12% monthly.

Creating a Liquidity Pool

80,000 HAPI tokens and the corresponding amount of ETH at the public round price will be locked in the HAPI/ETH pool on Uniswap. Lock term is one year.

Farming HAPI token

200,000 HAPI tokens will be farmed over three years using HAI token staking. The accrual system is proportional. The farming system is uniform.

HAPI Opens Whitelisting for Poolz IDO Participation

Recently, we announced that the HAPI Initial DEX Offering (IDO) is scheduled for March 10, 2021. As we’re getting closer to our IDO date, we announce that we’re opening whitelisting for Poolz IDO participation.

Poolz relies on a whitelisting process to ensure that all investors get a fair opportunity to invest in the IDOs. We explained the whitelisting process in the previous announcement, and below are the **basic requirements for you to be eligible for HAPI IDO whitelisting**.

How the IDO whitelisting works

To participate in whitelisting for the HAPI IDO on Poolz:

1. Follow Poolz on Twitter

2. Follow HAPI on Twitter

3. Retweet the pinned Tweet on the HAPI Twitter account. Make sure you mention the cashtags $POOLZ and $HAPI

4. Join the HAPI Telegram groups

5. Join the Poolz Official Telegram group

6. Fill in the whitelist form

7. Stake your POOLZ tokens on the staking page for 7 days with 12% APY guaranteed

source: EverythingAltcoin


How and Where to Buy HAPI token?

HAPI has been listed on a number of crypto exchanges. Unlike other major cryptocurrencies, it cannot be purchased directly with fiat money. However, you can still easily buy this coin by first buying Bitcoin, ETH, or USDT on any large exchange and then transferring it to an exchange that offers HAPI trading. In this guide we will walk you through the steps to buy HAPI in detail.

You will have to first buy one of the major cryptocurrencies, usually either Bitcoin (BTC), Ethereum (ETH), Tether (USDT)…

We will use Binance Exchange here as it is one of the largest crypto exchanges that accept fiat deposits.

Binance is a popular cryptocurrency exchange which was started in China but then moved their headquarters to the crypto-friendly Island of Malta in the EU. Binance is popular for its crypto to crypto exchange services. Binance exploded onto the scene in the mania of 2017 and has since gone on to become the top crypto exchange in the world.

Once you have finished the KYC process, you will be asked to add a payment method. Here you can either provide a credit/debit card or use a bank transfer to buy one of the major cryptocurrencies, usually Bitcoin (BTC), Ethereum (ETH), or Tether (USDT).

SIGN UP ON BINANCE

Step by Step Guide : What is Binance | How to Create an account on Binance (Updated 2021)

Next step - Transfer your cryptos to an Altcoin Exchange

Since HAPI is an altcoin, we need to transfer our coins to an exchange where HAPI can be traded. Below is a list of exchanges that offer HAPI in various market pairs; head to their websites and register for an account.

Once finished you will then need to make a BTC/ETH/USDT deposit to the exchange from Binance depending on the available market pairs. After the deposit is confirmed you may then purchase HAPI from the exchange.

Exchange: Gate.io

SIGN UP ON GATE.IO

Apart from the exchange(s) above, there are a few popular crypto exchanges with decent daily trading volumes and a huge user base. This will ensure you can sell your coins at any time, and the fees will usually be lower. It is suggested that you also register on these exchanges, since once HAPI gets listed there it will attract a large amount of trading volume, which means great trading opportunities.

Top exchanges for token trading:

https://www.binance.com
https://www.bittrex.com
https://www.poloniex.com
https://www.bitfinex.com
https://www.huobi.com
https://www.mxc.ai
https://www.probit.com
https://www.gate.io
https://www.coinbase.com

Find more information HAPI

Website | Social Channel | Social Channel 2 | Social Channel 3 | Coinmarketcap

I hope this post will help you. If you liked this, please share it with others. Thank you!

#bitcoin #crypto #hapi


Using TypeScript with Hapi

What I learned about using TypeScript and hapi together.

I've been using hapi lately, and decided to start using TypeScript at the same time. When I looked, though, there didn't seem to be a lot out there on using them both together. Here's what I learned.

I’m going to assume you have some degree of familiarity with JavaScript, along with a basic understanding of what TypeScript is.

If that's not the case, then I would definitely recommend reading the MDN JavaScript tutorials, followed by the 5 minute TypeScript introduction.

I’ve chosen to use yarn in the examples below; if you’re using npm instead, just change yarn add to npm install.

I’ve included what you need to get the system up and running, and tried to explain as we go, but this post isn’t going to go in depth into any particular point. I’ve tried to include relevant links as we go, but if I’ve missed something out then definitely let me know and I’ll try to cover it further.

#hapijs #hapi #typescript #programming #javascript

Kieran Stroman

Node performance: Hapi, Express.js, Restify

We ran a simple test that closely matches our use case for our API. That means receiving a request and giving a response (in our test, this is just ‘hello world’). For our purposes, this makes sense.

If you're reading this post with a mind to what framework to choose for building a website, you're likely doing yourself a disservice. These frameworks offer a lot more to folks building rich web applications, and those features are likely more important than raw throughput.

We’ve tested Hapi, ExpressJS, Restify and naked Node with no middleware.

We also identified that Restify keeps connections alive, which removes the overhead of creating a new connection each time the same client calls. To be fair, we also tested Restify with the configuration flag that closes the connection; you'll see a substantial decrease in throughput in that scenario, for obvious reasons.

#node #hapi #express.js #restify #programming

NA SA

How to enable cross origin access control with Hapi.js

const Hapi = require("hapi");

const server = new Hapi.Server({
    host: "localhost",
    port: 3000,
    routes: {
        cors: {
            origin: ['*']
        }
    }
});

server.route({
    method: "GET",
    path: "/",
    handler: (request, h) => {
        return h.response("Hello World");
    }
});

server.start().then(success => {
    console.log("Listening at " + server.info.uri);
}, error => {
    throw error;
});

#javascript #nodejs #hapi #hapijs

Kieran Stroman

Hapi vs. Express in 2019: Node.js Framework Comparison

Here at Raygun, before we implement any new tool, we always run performance tests and like to share the results. This time we’re comparing Hapi vs. Express to help you make a more informed choice on Node.js frameworks.

Node has become a staple of the software development industry. The increased popularity of JavaScript over the past several years has propelled Node forward in terms of installed base. JavaScript and Node offer, perhaps for the first time, the opportunity to develop entire n-tier applications using a single language.

Node is fast, and JavaScript is everywhere. It’s a perfect match.

As it is with all web platforms, Node provides the essentials for an application: request and response objects, methods to manipulate HTTP requests, and more (but not much more.) “Pure” Node apps are wicked fast, but they lack in the supporting cast of middleware, routing, and plugins that reduce the amount of code needed for a modern web application. This is where web frameworks shine.

This is the story of the battle between two popular frameworks: Hapi and Express.

#hapi #express #node.js

Kieran Stroman

Node.js performance vs Hapi, Express, Restify, Koa & More

In 2015 and 2016, Raygun tested raw Node.js against popular frameworks including Hapi, Express.js, Restify and Koa. This year (2017), we've added some more frameworks by popular request: Sails.js and Adonis.js.

The aim of these performance tests is to help you benchmark popular frameworks so you can see which one best suits your project.

As always, we’ve broken the results down and compared them to last year. We’ve also included instructions on how to reproduce the test.

Node.js performance tests were performed on the Ubuntu subsystem on Windows 10, along with a VM provisioned from Digital Ocean.

The tests only utilize the most basic capabilities of the frameworks in question, therefore the main goal was to show the relative overhead these frameworks add to the handling of a request. This is not a test of the absolute performance as this will vary greatly depending on the environment and network conditions.

This test also doesn’t cover the utility each framework provides and how this enables complex applications to be built with them.

#node.js #hapi #express #restify #koa #programming

Shad Blanda

Setup a powerful API with Nodejs, GraphQL, MongoDB, Hapi, and Swagger

Set up a powerful API with Node.js, GraphQL, MongoDB, Hapi, and Swagger. Separating your frontend and backend has many advantages.

#javascript #nodejs #graphql #mongodb #hapi

Thomas Granger

How to build a RESTful API with Hapi.js

Hapi.js (also known as hapi) is an open-source framework for Web Applications. The most common use of Hapi is to build web services such as JSON API. You can build application programming interface (API) servers, websites, and HTTP proxy applications with hapi.js.

Table of Contents

  • Introduction
  • Introduction to the RESTful Architecture
  • Getting Started
  • Setting up package.json
  • Setting up Knex
  • Creating the Migrations
  • Starting the API
  • Conclusion

Introduction

As the century progresses, we see new technologies and architectures that companies, startups and developers alike are using to power their next big application; some of these architectures have been so thoroughly battle-tested that some companies are even scrapping old applications just so they can implement this modern, new (and scalable) approach.

One of these architectures is Representational State Transfer, or REST as people call it. In this architecture, the server talks in terms of resources, and then uses HTTP verbs to perform actions on those resources. A quick tour of this architecture follows in a later section.

With Node.js, there's no doubt that it's super easy to implement a quick RESTful API and be up and running in no time; however, there are quite a few considerations before implementing a scalable, high-efficiency RESTful API.

Why am I using Hapi.js?

Honestly, I am a HUGE fan of the Hapi.js framework, and the Hapi.js community. Cheers to all those who have contributed to this remarkable project.

Hapi.js makes things so simple, and so smooth to work with; all that without compromising on the reliability or efficiency of the framework. In case you haven’t read it yet, I have a small article here which, as mentioned earlier, covers just enough material to get your feet wet and kicking for this project.

Goals

Now, before we start work on this API, let's set a couple of goals. Here the word 'goals' refers to the success criteria of your application. Since this is a project to help you get up to speed with Hapi.js, we'll only add the features which are unique: ones that display or implement some of Hapi.js' philosophy or concepts. It'll also trim the dead wood from our experimental code-base.

Primary API Function

This API is the Developer API for a startup which returns the birds spotted in a particular area.

Logical Goals

  • All users can see every public bird in the database;
  • All users can log in and out;
  • Registered users can create birds;
  • Registered users can edit birds, provided they own that bird;
  • Registered users can delete their birds.

I presume everything in this section is quite self-explanatory. However, if you are confused about something, feel free to drop us a comment in the comments section of this post.

Introduction to the RESTful Architecture

In a RESTfully-architectured web application, you let the HTTP verbs (like GET, POST, etc.) do the work of telling your application what to do. For this tutorial, we’ll have routes something like the following:

  • http://api.app.com/birds (GET) - to get a list of all the public birds
  • http://api.app.com/birds (POST) - to create a new bird
  • http://api.app.com/birds/:id (GET) - to get a specific bird

This is something quite standard of any RESTful API. There is a pretty good post on Scotch about designing APIs with RESTful architecture in mind. Take a look.
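As a sketch, those three endpoints could map onto hapi route definitions like the following. The handlers here are placeholders for functions we build later; with hapi 16 (the version this tutorial installs), handlers receive `(request, reply)`:

```javascript
// Placeholder route table; real handlers come later in the tutorial.
const birdRoutes = [
    { method: 'GET',  path: '/birds',      handler: (request, reply) => reply('all public birds') },
    { method: 'POST', path: '/birds',      handler: (request, reply) => reply('create a bird') },
    { method: 'GET',  path: '/birds/{id}', handler: (request, reply) => reply('one bird') }
];

// Once a server exists, the whole table is registered in one call:
// server.route(birdRoutes);
```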

Getting Started

With everything said, let’s dig right into it. To test the API, I will be using this fantastic application called Paw. Alternatively, you can use ARC or Postman App.

Tools of Choice

To create this awesome API, we’ll be using a couple of very interesting Node.js packages.

Knex.js

Knex is a very simple to use, yet incredibly powerful query builder for MySQL and a plethora of other RDBMS. We’ll use this to directly communicate with our Authentication and Data servers running MySQL.

Hapi.js

Hapi (pronounced "happy") is a web framework for building web applications, APIs and services. It's extremely simple to get started with, and extremely powerful at the same time. The problem arises when you have to write performant, maintainable code.

Alright, perfect. Now we understand the nuances of this application and we can begin coding.

Setting up package.json

Initialize a new package.json file with npm init in your root folder, then fill in the values as required. The following is my package.json:

{
  "name": "birdbase",
  "version": "1.0.0"
}

Now, let’s install mysql, jsonwebtoken, hapi-auth-jwt, and knex with

npm i --save mysql jsonwebtoken hapi-auth-jwt knex

Note that this configuration is based on the configuration of my previous article. I haven’t included everything and this is just a build up on that. Be sure to follow that one before this.

Now, our package.json looks something like:

{
  "name": "birdbase",
  "version": "1.0.0",
  "devDependencies": {
    "babel-core": "^6.20.0",
    "babel-preset-es2015": "^6.18.0"
  },
  "dependencies": {
    "hapi": "^16.0.1",
    "hapi-auth-jwt": "^4.0.0",
    "jsonwebtoken": "^7.2.1",
    "knex": "^0.12.6",
    "mysql": "^2.12.0"
  },
  "scripts": {
    "start": "node bootstrap.js"
  }
}

And the following is the directory structure:

(screenshot: project directory structure)

Perfect. Now, let’s configure Knex so we can begin working with it.

Setting up Knex

Knex is just brilliant, and we will see the reasons for that brilliance in just a minute. Start by installing the knex CLI with sudo npm install -g knex. This tool allows us to programmatically define a MySQL table structure and then execute it. Generally, we call these migrations.

For the rest of the tutorial, the following is my MySQL configuration:

MySQL Host: 192.168.33.10
MySQL User: birdbase
MySQL Pass: password
MySQL DB Name: birdbase

Create a knexfile.js in the root of the directory with the following content:

module.exports = {

    development: {

        migrations: { tableName: 'knex_migrations' },
        seeds: { directory: './seeds' },

        client: 'mysql',
        connection: {

            host: '192.168.33.10',

            user: 'birdbase',
            password: 'password',

            database: 'birdbase',
            charset: 'utf8',

        }

    }

};

And create a new folder called seeds in the root directory. The knexfile.js is used by the Knex CLI to perform SQL operations. The seeds directory will contain our seeds, or initial data, which we can use for testing. Trust me when I say this: having this data at hand greatly simplifies development, as you already have the data you want to work with.
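A hedged sketch of what a seed file in that directory might look like. The filename (e.g. seeds/users.js) and the row values are made up for illustration; knex calls the exported function with a connected instance (older knex versions also pass a Promise argument):

```javascript
// Example rows to load into the users table for testing.
const seedRows = [
    { name: 'Alex', username: 'alex', email: 'alex@example.com',
      password: 'hashed-password', guid: 'user-guid-1', created_at: new Date() }
];

// Knex seed entry point: clear the table, then insert the test data.
// In a real seed file this function would be exported as `exports.seed`.
const seed = (knex) =>
    knex('users').del()
        .then(() => knex('users').insert(seedRows));
```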

The structure should look something like the following:

(screenshot: directory structure including knexfile.js and the seeds folder)

Creating the Migrations

Let’s create the actual migrations now. We’ll create two tables users and birds. The users table will contain the username, password, name and email of the users; and the birds table will contain the listings of birds.

Create a new migration with knex migrate:make Datastructure to create a new migration file. It’ll look something like 20161211185139_Datastructure.js. In your favorite text editor, open it and you’ll see something like:

exports.up = function(knex, Promise) {
};

exports.down = function(knex, Promise) {
};

The up function is executed when you run a migration, and the down function is executed when you roll one back.

Add the following to the up function:

exports.up = function(knex, Promise) {

    return knex
            .schema
            .createTable( 'users', function( usersTable ) {

                // Primary Key
                usersTable.increments();

                // Data
                usersTable.string( 'name', 50 ).notNullable();
                usersTable.string( 'username', 50 ).notNullable().unique();
                usersTable.string( 'email', 250 ).notNullable().unique();
                usersTable.string( 'password', 128 ).notNullable();
                usersTable.string( 'guid', 50 ).notNullable().unique();

                usersTable.timestamp( 'created_at' ).notNullable();

            } )

            .createTable( 'birds', function( birdsTable ) {

                // Primary Key
                birdsTable.increments();
                birdsTable.string( 'owner', 36 ).references( 'guid' ).inTable( 'users' );

                // Data
                // Each chainable method creates a column of the given type with the chained constraints. For example, in the line below, we create a column named `name` which has a maximum length of 250 characters, is of type string (VARCHAR) and is not nullable. 
                birdsTable.string( 'name', 250 ).notNullable();
                birdsTable.string( 'species', 250 ).notNullable();
                birdsTable.string( 'picture_url', 250 ).notNullable();
                birdsTable.string( 'guid', 36 ).notNullable().unique();
                birdsTable.boolean( 'isPublic' ).notNullable().defaultTo( true );

                birdsTable.timestamp( 'created_at' ).notNullable();

            } );

};

The chain .references(...) creates a foreign key constraint: the owner column in birds references the guid column in users. This is done to ensure that we know who owns which listing.

In the down function, add the following:

exports.down = function(knex, Promise) {

    // We use `...ifExists` because we're not sure if the table's there. Honestly, this is just a safety measure. 
    return knex
        .schema
            .dropTableIfExists( 'birds' )
            .dropTableIfExists( 'users' );

};

Remember to drop the referencing table first; i.e., the table which holds the foreign key. In this case, birds references guid in the users table, so we remove birds before we remove users; doing it the other way round will throw an error.

Let’s run this migration with knex migrate:latest.

If all goes well, you should see:

Making a RESTful API with Hapi.js

Now, if you head over to phpMyAdmin and check your database, you should see something like the following:

Making a RESTful API with Hapi.js

Looks great to me. We’ll now create some seed files by running knex seed:make 01_Users. Remember that the seed files are executed in the order of their file names, and so, if you execute the seed for the birds table first, the key constraint (reference) to guid in users will fail because the latter doesn’t exist yet.

Under the seeds folder, you should now see a new file titled 01_Users.js; open it, and replace the code with the following:

exports.seed = function seed( knex, Promise ) {

    var tableName = 'users';

    var rows = [

        // You are free to add as many rows as you feel like in this array. Make sure that they're an object containing the following fields:
        {
            name: 'Shreyansh Pandey',
            username: 'labsvisual',
            password: 'password',
            email: '[email protected]',
            guid: 'f03ede7c-b121-4112-bcc7-130a3e87988c',
        },

    ];

    return knex( tableName )
        // Empty the table (DELETE)
        .del()
        .then( function() {
            return knex.insert( rows ).into( tableName );
        });

};

The code is self-explanatory, so I won’t go deeper. Similarly, let’s create a sample bird seed with knex seed:make 02_Birds and replace the file’s contents with the following code:

exports.seed = function seed( knex, Promise ) {

    var tableName = 'birds';

    var rows = [

        {
            owner: 'f03ede7c-b121-4112-bcc7-130a3e87988c',
            species: 'Columbidae',
            name: 'Pigeon',
            picture_url: 'http://pngimg.com/upload/pigeon_PNG3423.png',
            guid: '4c8d84f1-9e41-4e78-a254-0a5680cd19d5',
            isPublic: true,
        },

        {
            owner: 'f03ede7c-b121-4112-bcc7-130a3e87988c',
            species: 'Zenaida',
            name: 'Mourning dove',
            picture_url: 'https://upload.wikimedia.org/wikipedia/commons/thumb/b/b7/Mourning_Dove_2006.jpg/220px-Mourning_Dove_2006.jpg',
            guid: 'ddb8a136-6df4-4cf3-98c6-d29b9da4fbc6',
            isPublic: false,
        },

    ];

    return knex( tableName )
        .del()
        .then( function() {
            return knex.insert( rows ).into( tableName );
        });

};

Execute these seeds with knex seed:run and then open phpMyAdmin to be amazed.

Making a RESTful API with Hapi.js

Making a RESTful API with Hapi.js

Making a RESTful API with Hapi.js

Beautiful. Now we can move onto creating the actual API.

Starting the API

JWT Authentication

The authentication provider, in this case, will be the super-simple and secure JSON Web Token strategy, which we’ll use for both authentication and authorization. Before moving forward, we need to do JWT 101.

A JWT is in the form xxxxx.yyyyy.zzzzz with each of the sections having a specific name. We’ll consider the token:

eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.
eyJ1c2VybmFtZSI6ImxhYnN2aXN1YWwiLCJzY29wZSI6ImFkbWluIiwiaWF0IjoxNDgxMzg0NDQ3LCJleHAiOjE0ODEzODgwNDd9.
Y7B8rvGNmkwrSWMlb5e1Bqz0qnLuDLxerZmmdtg8ouo

The first block (xxxxx) is the header of the token and contains metadata, such as the algorithm used for the signature. On decoding our token’s header with UTF-8 encoding, we get the following content:

{
  "alg":"HS256",
  "typ":"JWT"
}

The next block (yyyyy) is the payload of the token and contains claims along with expiration and creation metadata. Technically speaking, you can put whatever you want in this section. For our example, we get the following decoded content:

{
  "username": "labsvisual",
  "scope": "admin",
  "iat": 1481384447,
  "exp": 1481388047
}

As you can make out, this token was created for the user labsvisual, who has the admin scope, and it expires in one hour. Simple.
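You can verify the one-hour lifetime yourself: iat (issued at) and exp (expiry) are Unix timestamps in seconds, so their difference is the token’s validity window.

```javascript
// `iat` and `exp` come straight from the decoded payload above.
const iat = 1481384447;
const exp = 1481388047;

console.log( ( exp - iat ) / 3600 ); // 1 (hour)
```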

The last block (zzzzz) is the signature of the token and is calculated as HMAC256( base64Encode( header ) + '.' + base64Encode( payload ), secretKey ). You define the secret on the server; the library (jsonwebtoken) signs and verifies the tokens using this secret. No matter what happens, make sure you do not leak this secret anywhere, as doing so compromises the entire authentication framework of your application.

A great tool for interactively debugging and learning about JWTs is the official jwt.io website, which includes a token debugger. Below is a screenshot of the interactive JWT debugger in action.

Making a RESTful API with Hapi.js

Setting Up JWT

Before we do anything, we need to tell Hapi that we’re going to use an authentication strategy (method), so it should load a couple of modules. Open up server.js and enter the following just after server.connection(...:

// .register(...) registers a module within the instance of the API. The callback is then used to tell that the loaded module will be used as an authentication strategy. 
server.register( require( 'hapi-auth-jwt' ), ( err ) => {
    server.auth.strategy( 'token', 'jwt', {

        key: 'vZiYpmTzqXMp8PpYXKwqc9ShQ1UhyAfy',

        verifyOptions: {
            algorithms: [ 'HS256' ],
        }

    } );

} );

Here, we ask the server object to register a new module from the package hapi-auth-jwt; after which, we register a new authentication strategy called token with the jwt scheme and the following options:

  • key - this is the private key which is used to sign and verify the JWT signatures;
  • verifyOptions - we tell the library which algorithm to use for signature and verification; HMAC256 in this case.

You can also add validateFunc to the block which is used to validate the token provided. This is optional and is used when you have to do some other sort of verification in addition to the cryptographic verification provided by the library.

src/server.js should look like the following now

import Hapi from 'hapi';

const server = new Hapi.Server();

server.connection( {
    port: 8080
} );

server.register( require( 'hapi-auth-jwt' ), ( err ) => {
    server.auth.strategy( 'token', 'jwt', {

        key: 'vZiYpmTzqXMp8PpYXKwqc9ShQ1UhyAfy',

        verifyOptions: {
            algorithms: [ 'HS256' ],
        }

    } );

} );

server.start( err => {

    if( err ) {

        // Fancy error handling here
        console.error( 'Error was handled!' );
        console.error( err );

    }

    console.log( `Server started at ${ server.info.uri }` );

} );

Routes

Now, we can add the routes. Let’s start by adding a simple route which gets all the public birds. Within src/server.js add the following route:

server.route( {

    path: '/birds',
    method: 'GET',
    handler: ( request, reply ) => {    
    }

} );

Now, let’s create a knex instance. Add a file knex.js within src and add the following code to it:

export default require( 'knex' )( {

    client: 'mysql',
    connection: {

        host: '192.168.33.10',

        user: 'birdbase',
        password: 'password',

        database: 'birdbase',
        charset: 'utf8',

    }

} );

Then import this file into your server.js by import Knex from './knex';. Now we are ready to utilize this awesome library.

Let’s select the name, picture_url and species for every public bird. In the handler for your route, add:

...
handler: ( request, reply ) => {

    // In general, the Knex operation is like Knex('TABLE_NAME').where(...).chainable(...).then(...)
    const getOperation = Knex( 'birds' ).where( {

        isPublic: true

    } ).select( 'name', 'species', 'picture_url' ).then( ( results ) => {

        if( !results || results.length === 0 ) {

            reply( {

                error: true,
                errMessage: 'no public bird found',

            } );

            // Return so we don't fall through and reply a second time.
            return;

        }

        reply( {

            dataCount: results.length,
            data: results,

        } );

    } ).catch( ( err ) => {

        reply( 'server-side error' );

    } );

}
...

The line const getOperation = Knex( 'birds' )... tells Knex to query the birds table, builds a query for rows where the field isPublic is set to true, fetches the listed columns with .select (which returns a promise), and then resolves that promise. The results parameter is an array of all the birds which match the criterion.
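Conceptually, the chain filters rows by a condition and projects a subset of columns. Here is a stdlib-only sketch of the same idea over a plain array; the sample rows are made up for illustration:

```javascript
// Two in-memory "rows" standing in for the birds table.
const birds = [
    { name: 'Pigeon', species: 'Columbidae', picture_url: 'pigeon.png', isPublic: true },
    { name: 'Mourning dove', species: 'Zenaida', picture_url: 'dove.jpg', isPublic: false },
];

const results = birds
    .filter( ( b ) => b.isPublic )               // ~ .where( { isPublic: true } )
    .map( ( { name, species, picture_url } ) =>  // ~ .select( 'name', 'species', 'picture_url' )
        ( { name, species, picture_url } ) );

console.log( results.length ); // 1 -- only the public bird survives
```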

Save the file and start the API server with npm start and then fire up your favourite API client. We’ll use Paw.

Making a RESTful API with Hapi.js

Pat yourself on the back if everything works as expected. Your src/server.js file should look like:

import Hapi from 'hapi';
import Knex from './knex';

const server = new Hapi.Server();

server.connection( {
    port: 8080
} );

server.register( require( 'hapi-auth-jwt' ), ( err ) => {
    server.auth.strategy( 'token', 'jwt', {

        key: 'vZiYpmTzqXMp8PpYXKwqc9ShQ1UhyAfy',

        verifyOptions: {
            algorithms: [ 'HS256' ],
        }

    } );

} );

// --------------
// Routes
// --------------

server.route( {

    path: '/birds',
    method: 'GET',
    handler: ( request, reply ) => {

        const getOperation = Knex( 'birds' ).where( {

            isPublic: true

        } ).select( 'name', 'species', 'picture_url' ).then( ( results ) => {

            // The second one is just a redundant check, but let's be sure of everything.
            if( !results || results.length === 0 ) {

                reply( {

                    error: true,
                    errMessage: 'no public bird found',

                } );

                // Return so we don't reply a second time below.
                return;

            }

            reply( {

                dataCount: results.length,
                data: results,

            } );

        } ).catch( ( err ) => {

            reply( 'server-side error' );

        } );

    }

} );

server.start( err => {

    if( err ) {

        // Fancy error handling here
        console.error( 'Error was handled!' );
        console.error( err );

    }

    console.log( `Server started at ${ server.info.uri }` );

} );

Now, we’ll continue by adding an auth route which will be used to authenticate the user. The logic here is very simple: check whether the password in the payload matches the one in the database, and if so, create a new JWT whose scope is the user’s GUID and which expires in 1h.

While updating a bird, we’ll use a route prerequisite (a pre handler) to check whether the current user owns the bird; if they do, we’ll allow the edit, otherwise we’ll throw a 403 error.

Auth Route

Let’s create a POST route with

server.route( {

    path: '/auth',
    method: 'POST',
    handler: ( request, reply ) => {

        // This is a ES6 standard
        const { username, password } = request.payload;

        const getOperation = Knex( 'users' ).where( {

            // Equiv. to `username: username`
            username,

        } ).select( 'guid', 'password' ).then( ( results ) => {

        } ).catch( ( err ) => {

            reply( 'server-side error' );

        } );

    }

} );

The line const { username... destructures the request.payload object and pulls out the named values (username and password in this case). This is the same as:

const username = request.payload.username;

Just another lovely example of why I love ES6.

Remember that the object request.payload contains all the content in a POST or a PUT request.
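Destructuring works on arrays too, which will come in handy in a moment for pulling the first row out of a result set. A quick sketch:

```javascript
// Object destructuring, as used on request.payload above:
const payload = { username: 'labsvisual', password: 'password' };
const { username } = payload;
console.log( username ); // labsvisual

// Array destructuring grabs elements by position instead of by name:
const [ firstRow ] = [ 'row-1', 'row-2' ];
console.log( firstRow ); // row-1
```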

Let’s continue.

First, we need to make sure that we selected exactly one user. Let’s use array destructuring and then check that the result is populated:

...
} ).select( 'guid', 'password' ).then( ( [ user ] ) => {
    if( !user ) {

        reply( {

            error: true,
            errMessage: 'the specified user was not found',

        } );

        // Force of habit. But most importantly, we don't want to wrap everything else in an `else` block; better is, just return the control.
        return;

    }
...

Simple enough. We check if the user exists and if not, we throw an error and exit out of the function. Let’s finish this route:

...
// Honestly, this is VERY insecure. Use some salted-hashing algorithm and then compare it.
if( user.password === password ) {

    const token = jwt.sign( {

        // You can have anything you want here. ANYTHING. As we'll see in a bit, this decoded token is passed onto a request handler.
        username,
        scope: user.guid,

    }, 'vZiYpmTzqXMp8PpYXKwqc9ShQ1UhyAfy', {

        algorithm: 'HS256',
        expiresIn: '1h',

    } );

    reply( {

        token,
        scope: user.guid,

    } );

} else {

    reply( 'incorrect password' );

}

The function jwt.sign( payload, key, [ options ] ) signs the payload and gives a JWT which we then transmit to the user. Save this file and start your server, and add a new request to your client.

Making a RESTful API with Hapi.js

Try changing the username and the password to see what kind of response you get.

Making a RESTful API with Hapi.js

Making a RESTful API with Hapi.js

So far, so good. Now we’ll add a method to create a bird, and a method to update a bird. Let’s start.

Create a Bird

Create a POST route at /birds and add the empty handler function.

server.route( {

    path: '/birds',
    method: 'POST',
    handler: ( request, reply ) => {

        const { bird } = request.payload;

    }

} );

We expect to have a payload as bird which is an object containing all the information about the bird. Let’s add the code to insert this into our database.

Before anything, we need to tell Hapi.js that this route is protected by authentication. To do so, add the following after method in the route:

...
method: 'POST',
config: {

    auth: {

        strategy: 'token',

    }

},
...

This tells Hapi.js that we’ll be using a registered authentication strategy for our route.

But, for this to work properly, we need to refactor the code a little bit. Let’s add a new file routes.js containing all the routes. Something like:

import Knex from './knex';
import jwt from 'jsonwebtoken';

// The idea here is simple: export an array which can be then iterated over and each route can be attached. 
const routes = [

    {

        path: '/birds',
        method: 'GET',
        handler: ( request, reply ) => {

            const getOperation = Knex( 'birds' ).where( {

                isPublic: true
...
export default routes;

Then, in src/server.js, we’ll import the routes array as import routes from './routes'; and then within server.register(... we’ll add the following bit to register all the routes:

...
    routes.forEach( ( route ) => {

        console.log( `attaching ${ route.path }` );
        server.route( route );

    } );

The src/server.js file becomes something like:

import Hapi from 'hapi';
import routes from './routes';

const server = new Hapi.Server();

server.connection( {
    port: 8080
} );

server.register( require( 'hapi-auth-jwt' ), ( err ) => {

    if( !err ) {
        console.log( 'registered authentication provider' );
    }

    server.auth.strategy( 'token', 'jwt', {

        key: 'vZiYpmTzqXMp8PpYXKwqc9ShQ1UhyAfy',

        verifyOptions: {
            algorithms: [ 'HS256' ]
        }

    } );

    // We move this in the callback because we want to make sure that the authentication module has loaded before we attach the routes. It will throw an error, otherwise. 
    routes.forEach( ( route ) => {

        console.log( `attaching ${ route.path }` );
        server.route( route );

    } );

} );

server.start( err => {

    if( err ) {

        // Fancy error handling here
        console.error( 'Error was handled!' );
        console.error( err );

    }

    console.log( `Server started at ${ server.info.uri }` );

} );

With that done, let’s move on.

Now, we need to add a bird to the birds database. We’ll do this using Knex. Add the following bit immediately after const { bird } = request.payload; in your route within src/routes.js.

We’ll install a package to generate GUIDs called node-uuid with npm i --save node-uuid and then import it into our routes file as import GUID from 'node-uuid';.

const guid = GUID.v4();

const insertOperation = Knex( 'birds' ).insert( {

    owner: request.auth.credentials.scope,
    name: bird.name,
    species: bird.species,
    picture_url: bird.picture_url,
    guid,

} ).then( ( res ) => {

    reply( {

        data: guid,
        message: 'successfully created bird'

    } );

} ).catch( ( err ) => {

    reply( 'server-side error' );

} );

The code is pretty self-explanatory apart from this interesting object request.auth.credentials. Well, after the verification is done, the authentication handler passes on the decoded token to this credentials object. If you do a console.log( request.auth.credentials ); you’ll see something like:

{
    username: 'labsvisual',
    scope: 'f03ede7c-b121-4112-bcc7-130a3e87988c',
    iat: 1481546651,
    exp: 1481550251
}

From here, we can grab on to the GUID for the user and pass it on as the owner in the database. Simple.

Fire up the server and then add a couple of requests. Remember to add the Authorization header in the following format: Authorization: Bearer <JWT> where <JWT> is your generated JWT in the /auth route.

Making a RESTful API with Hapi.js

Making a RESTful API with Hapi.js

Let’s check the database:

Making a RESTful API with Hapi.js

And now let’s get a listing of all public birds:

Making a RESTful API with Hapi.js

Update Birds

With that in place, we can create our last route: PUT at /birds/{guid}, where we can update the bird with the GUID guid. I’ll just copy and paste the POST route and make the changes as and when required.

For starters, let’s change the method to PUT and the path to /birds/{birdGuid}. We can access this birdGuid from request.params. After the changes, it should look something like:

{

        path: '/birds/{birdGuid}',
        method: 'PUT',
...

Now, we want to verify that the current user has rights to the bird they’re trying to edit. For that, we need to check whether the bird associated with birdGuid has the same owner as the scope of the authorization token. As illustrated before, we can access the token’s GUID via the scope property of request.auth.credentials. We’ll do this in a route prerequisite: a function with the same signature as a route handler that is executed before control is passed to the handler. It’s really useful in cases like this where you need to do some sort of verification. Also note that we can safely assume that by the time this function runs, the user has passed a valid JWT in the Authorization header; had they not, the hapi-auth-jwt library would already have thrown a 401 error.

The route should be something like:

config: {
    ...
    pre: [
        {
            method: ( request, reply ) => {

            }
        }
    ]
...

The pre configuration block is an array which contains objects with a key method which is linked to a function.

We’ll first pull out all the values from the objects and store them:

...
const { birdGuid } = request.params
         , { scope }    = request.auth.credentials;
...

Next, let’s do a select operation on the database where we select the bird from the GUID provided; we’ll just select the owner column of the bird as that’s all we need for verification:

...
const getOperation = Knex( 'birds' ).where( {

    guid: birdGuid,

} ).select( 'owner' ).then( ( [ result ] ) => {

} );
...

Brilliant. Now, we have selected just one bird with [ result ] and we can work on that.

Let’s start by verifying that we actually have a bird with the specified GUID; if we do not, then we’ll take over the request and send a custom reply. reply().takeover() ends the reply chain with the response you give, and hence, does not let the handler get invoked.

...
if( !result ) {

    reply( {

        error: true,
        errMessage: `the bird with id ${ birdGuid } was not found`

    } ).takeover();

    // takeover() ends the reply chain, but not this function; return so
    // we don't fall through to the checks below.
    return;

}
...

Next, let’s check if the scope of the current token allows the user to modify the bird with the guid.

...
if( result.owner !== scope ) {

    reply( {

        error: true,
        errMessage: `the bird with id ${ birdGuid } is not in the current scope`

    } ).takeover();

    // Again, return so we don't also hit reply.continue() below.
    return;

}
...

If not, we’ll just let the reply chain continue:

...
return reply.continue();
...

After you’re done, this method should be something like:

...
method: ( request, reply ) => {

    const { birdGuid } = request.params
        , { scope }    = request.auth.credentials;

    const getOperation = Knex( 'birds' ).where( {

        guid: birdGuid,

    } ).select( 'owner' ).then( ( [ result ] ) => {

        if( !result ) {

            reply( {

                error: true,
                errMessage: `the bird with id ${ birdGuid } was not found`

            } ).takeover();

            return;

        }

        if( result.owner !== scope ) {

            reply( {

                error: true,
                errMessage: `the bird with id ${ birdGuid } is not in the current scope`

            } ).takeover();

            return;

        }

        return reply.continue();

    } );

}
...

Lastly, let’s add the update chain to the handler so we can UPDATE the current bird with the information.

...
handler: ( request, reply ) => {

    const { birdGuid } = request.params
        , { bird }     = request.payload;

    const insertOperation = Knex( 'birds' ).where( {

        guid: birdGuid,

    } ).update( {

        name: bird.name,
        species: bird.species,
        picture_url: bird.picture_url,
        isPublic: bird.isPublic,

    } ).then( ( res ) => {

        reply( {

            message: 'successfully updated bird'

        } );

    } ).catch( ( err ) => {

        reply( 'server-side error' );

    } );

}
...

The code is self-explanatory. So, we’ll just continue. Now, fire up the server and add a new request.

Making a RESTful API with Hapi.js

Making a RESTful API with Hapi.js

Let’s change the isPublic parameter:

Making a RESTful API with Hapi.js

Making a RESTful API with Hapi.js

You can also try giving incorrect birdGuid and seeing what happens:

Making a RESTful API with Hapi.js

Conclusion

In the end, we learned quite a few things here. We went from being beginners in Hapi.js to creating a full-blown API in Hapi with authentication and a MySQL database. The code is available here; if you find errors or have suggestions, please be sure to tell me. If you have any questions, throw them in the comment box below. Until next time! Cheers!

#hapi.js #api #rest #webdev #javascript

How to build a RESTful API with Hapi.js
NA SA

1583746610

How to make redirect 301 in Hapi.js

// A 301 (permanent) redirect can be made in two ways in Hapi >= v17,
// where the second handler argument is the response toolkit `h`:
server.route({
    method: 'GET',
    path: '/redirect',
    handler: (request, h) => {
        return h.redirect('https://morioh.com').code(301);
    }
});

// or

server.route({
    method: 'GET',
    path: '/permanent',
    handler: (request, h) => {
        return h.redirect('https://morioh.com').permanent();
    }
});

#javascript #nodejs #node-js #hapi-js #hapijs

Dylan Iqbal

1570765619

How to set-up a powerful API with Nodejs, GraphQL, MongoDB, Hapi, and Swagger

Separating your frontend and backend has many advantages:

  • The biggest reason why reusable APIs are popular — APIs allow you to consume data from a web client, mobile app, desktop app — any client really.
  • Separation of concerns. Long gone are the days where you have one monolithic-like app where everything is bundled together. Imagine you have an extremely convoluted application. Your only option is to hire extremely experienced/senior developers due to the natural complexity.

I’m all for hiring juniors and training your staff, and that’s exactly why you should separate concerns. With separation of concerns, you can reduce the complexity of your application by splitting responsibilities into “micro-services” where each team is specialized in their micro-service.

As mentioned above, the on-boarding/ramp-up process is much quicker thanks to splitting up responsibilities (backend team, frontend team, dev ops team, and so on)


Forward thinking and getting started

We will be building a very powerful, yet flexible, GraphQL API based on Nodejs with Swagger documentation powered by MongoDB.

The main backbone of our API will be Hapi.js. We will go over all the technology in substantial detail.

At the very end, we will have a very powerful GraphQL API with great documentation.

The cherry on top will be our integration with the client (React, Vue, Angular)


Prerequisites

  • NodeJS installed
  • Basic JavaScript
  • Terminal (any will do, preferably bash-based)
  • Text editor (any will do)
  • MongoDB (install instructions here) — Mac: brew install mongodb

Let’s goo!

Open the terminal and create the project. Inside the project directory we initialize a Node project.

Creating our project

Next, we want to setup our Hapi server, so let’s install the dependencies. You can either use Yarn or NPM.

yarn add hapi nodemon

Before we go on, let’s talk about what hapi.js is and what it can do for us.

hapi enables developers to focus on writing reusable application logic instead of spending time building infrastructure.

Instead of going with Express, we are going with Hapi. In a nutshell, Hapi is a Node framework. The reason why I chose Hapi is rather simple: simplicity and flexibility over boilerplate code.

Hapi enables us to build our API in a very rapid manner.

Optional: check out this quick crash course on hapi.js:

The second dependency we installed was the good-ole nodemon. Nodemon restarts our server automatically whenever we make changes. It speeds up our development by a big factor.

Let’s open our project with a text editor. I chose Visual Studio Code.

Setting up a Hapi server is very straightforward. Create a index.js file at the root directory with the contents of the following:

  • We require the hapi dependency
  • Secondly, we make a constant called server which creates a new instance of our Hapi server — as the arguments, we pass an object with the port and host options.
  • Third and finally, we create an asynchronous function called init. Inside the init method, we have another asynchronous method which starts the server; see server.start(). At the bottom, we call the init() function.

If you’re unsure about async await — watch this:

Now, if we head over to http://localhost:4000 we should see the following:

Which is perfectly fine, since the Hapi server expects a route and a handler. More on that in a second.

Let’s quickly add the script to run our server with nodemon. Open package.json and edit the scripts section.

Now we can do the following 😎


Routing

Routing is very intuitive with Hapi. Let’s say you hit / — what would you expect to happen? There are three main components in play here.

  • What’s the path? — path
  • What’s the HTTP method? Is it a GET — POST or something else? — method
  • What will happen if that route is reached? — handler

Inside the init method we attached a new method to our server called route with options passed as our argument.

If we refresh our page we should see return value of our root handler

Well done, but there is so much more we can do!


Setting up our database

Right, next up we are going to setup our database. We’re going to use mongodb with mongoose.

Let’s face it, writing MongoDB validation, casting and business logic boilerplate is a drag. That’s why we wrote Mongoose.

The next final ingredient related to our database is mlab. Instead of running mongo on our local computer, we are gonna use a cloud provider like mlab.

The reason why I chose mlab is because of the free plan (useful for prototyping) and how simple it is to use. There are more alternatives out there, and I encourage you to explore all of them ❤

Head over to https://mlab.com/ and signup.

Let’s create our database.

And finally create a user for the database. That will be all we will be editing on mlab.


Connecting mongoose with mlab

Open index.js and add the following lines and credentials. We are basically just telling mongoose which database we want to connect. Make sure to use your credentials.

If you want to brush up your MongoDB skills, here’s a solid series.

If everything went according to plan, we should see ‘connected to database’ in the console.


Wohoo!

Good job! Take a quick break and grab some coffee, we are almost ready to dive into the “cool parts”.


Creating Models

With MongoDB, we follow the convention of models; in other words, data modeling.

It’s a relatively simple concept which you will be able to grasp. Basically we just declare our schema for collections. Think of collections as tables in an SQL database.

Let’s create a directory called models. Inside we will create a file Painting.js

Painting.js is our painting model. It will hold all data related to paintings. Here’s how it will look:

  • We require the mongoose dependency.
  • We declare our PaintingSchema by calling the mongoose schema constructor and passing in the options. Notice how it’s strongly typed: for example the name field can consist of a string, and techniques consists of an array of strings.
  • We export the model and name it Painting

Let’s fetch all of our paintings from the database

First we need to import the Painting model into index.js.


Adding new routes

Ideally, we want URL endpoints that reflect our actions, such as /api/v1/paintings, /api/v1/paintings/{id}, and so on.

Let’s start off with a GET and POST route. GET fetches all the paintings and POST adds a new painting.

Notice we modified the route to be an array of objects instead of a single object. Also, arrow functions 😊

  • We created a GET route for the [/api/v1/paintings](http://localhost:4000/api/v1/paintings) path. Inside the handler we call the Mongoose model. Mongoose has built-in methods; the handy one we are using here is find(), which, since we’re not passing in any conditions to find by, returns all records.
  • We also created a POST route for the same path, because we’re following REST conventions. Let’s deconstruct (pun intended) the route handler: remember that in our Painting schema we declared three fields: name, url, and techniques.
  • Here we accept those arguments from the request (we will send them with Postman in a sec) and pass them to our Mongoose model. After that, we call the save() method on our new record, which saves it to the mLab database.
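The two routes described above can be sketched as follows, assuming a hapi v17-style server instance named server and the Painting model from models/Painting.js:

```javascript
// index.js (excerpt) -- a sketch of the GET and POST routes.
const Painting = require('./models/Painting');

server.route([
  {
    method: 'GET',
    path: '/api/v1/paintings',
    handler: () => {
      // find() with no conditions returns every painting in the collection.
      return Painting.find();
    },
  },
  {
    method: 'POST',
    path: '/api/v1/paintings',
    handler: (request) => {
      // Deconstruct the three fields declared in the Painting schema.
      const { name, url, techniques } = request.payload;
      const painting = new Painting({ name, url, techniques });
      // save() persists the new record to the mLab database.
      return painting.save();
    },
  },
]);
```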

If we head over to [http://localhost:4000/api/v1/paintings](http://localhost:4000/api/v1/paintings) we should see an empty array.

Why empty? Well, we haven’t added any paintings just yet. Let’s do that now!

Install Postman; it’s available for all platforms.

After installation, open Postman.

  • On the left you can see the method dropdown. Change it to POST.
  • Next to the method we have the URL bar. That’s the URL we want to send our request to.
  • On the right you can see a blue button which sends the request.
  • Below the URL bar we have the options. Click on Body and fill in the fields as in the example:
{
  "name": "Mona Lisa",
  "url": "https://en.wikipedia.org/wiki/Mona_Lisa#/media/File:Mona_Lisa,_by_Leonardo_da_Vinci,_from_C2RMF_retouched.jpg",
  "techniques": ["Portrait"]
}

POST paintings

Alright. Good to go! Let’s open [http://localhost:4000/api/v1/paintings](http://localhost:4000/api/v1/paintings)

Excellent! We still have some way to go! Next up — GraphQL!


Here’s the source code just in case anyone needs it :-)


#node-js #graphql #mongodb #api #hapi-js

Sofia Kelly


Developing Modern APIs with Hapi.js, Node.js, and Redis

In this article, you are going to learn how to develop modern APIs with Hapi.js and Node.js, while using Redis as the persistence layer. As it is not possible to release an API without a security layer, you will also learn how to secure your application with Auth0. If needed, you can find the final code developed throughout this article in this GitHub repository.

What is Hapi.js?

Hapi.js is a framework for creating backend APIs. What is nice about Hapi.js, when compared to other solutions like Express, is its coding-by-configuration architecture. As you will see, most of the “coding” is actually done by tweaking the vast configuration interface that Hapi.js provides to developers. This approach helps to separate the common aspects of HTTP from your handler logic.

What Is Redis and What Will You Build?

Redis is an open-source, in-memory data store that provides an interface so applications can manipulate data based on a key-value approach. As everything in a Redis database is simply a value accessible through a key, fetching data from it is extremely fast. This characteristic of Redis makes this database perfect for applications like to-do lists.
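The routes later in this article rely on two properties of Redis lists: LPUSH prepends to the head of the list, and LRANGE treats its end index as inclusive. Those semantics can be sketched in plain JavaScript (a toy in-memory illustration, not a Redis client):

```javascript
// A toy in-memory model of the two Redis list commands used later.
const store = new Map();

function lpush(key, value) {
  const list = store.get(key) || [];
  list.unshift(value); // LPUSH prepends: newest item ends up at index 0
  store.set(key, list);
  return list.length;  // like Redis, return the new list length
}

function lrange(key, start, stop) {
  const list = store.get(key) || [];
  return list.slice(start, stop + 1); // LRANGE's stop index is inclusive
}

lpush('todos', 'first');
lpush('todos', 'second');
console.log(lrange('todos', 0, 1)); // → [ 'second', 'first' ]
```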

So, in this article, you will use Redis as the persistence layer of a backend API that supports a to-do list application. You won’t develop the frontend application in this article, but you will soon, in an upcoming one.

Note: In this article, you are going to use Hapi.js 17. This version has breaking changes from version 16.

What Is Docker and Why Do You Care?

To keep your machine clean, you are not going to install Redis directly on your operating system. Instead, you are going to run Redis inside a Docker container. Docker, if you don’t know it, is a solution that enables users to run programs completely isolated from each other. Docker achieves this by packaging these programs into containers that work similarly to virtual machines.

However, containers are far more lightweight than traditional virtual machines. For example, you can easily bootstrap a container that uses NGINX in front of a Node.js instance to serve a web app in 16 MB or less. Also, Docker uses a file called a Dockerfile that makes it easy to share container configurations with others.

In this article, you are going to download and use a pre-built Redis container that allows you to use Redis fresh out of the box, with no setup.

Bootstrapping a Hapi.js API

Your API will contain the main server setup and individual files for each route you will need to define. Basically, you will create a project that contains the following structure:

  • src/: A directory that will hold code related to the server setup.
  • src/routes: A directory where you will define the endpoints of your API.

So, open a terminal, move to the directory where you want to store your project, and run the following commands:

# e.g., move to your home dir (or anywhere else)
cd ~

# create a directory for your project
mkdir nodejs-hapijs-redis

# move into it
cd nodejs-hapijs-redis

# and create both subdirectories
mkdir -p src/routes

After that, you can initialize your main directory as an NPM project and install some dependencies on it:

# initialize this directory as an NPM project
npm init -y

# install your project's dependencies
npm install --save boom good good-console good-squeeze hapi hapi-auth-jwt2 hapi-require-https inert joi jwks-rsa lout node-env-file redis uuid vision

As you can see, you will need to install a considerable number of dependencies. Throughout this article, you will see how each one fits in. However, the following list gives a brief introduction to them:

  • boom: This is a library that tightly integrates with Hapi.js to throw HTTP-friendly error objects.
  • good: This is a library that you will plug into Hapi.js to monitor and report on a variety of server events.
  • good-console: This library is useful for turning good server events into formatted strings.
  • good-squeeze: This library is useful for filtering events based on the good event options.
  • hapi: This is the main package of Hapi.js itself.
  • hapi-auth-jwt2: This is an authentication scheme/plugin for Hapi.js apps using JSON Web Tokens.
  • hapi-require-https: This is a library that will help you force secure connections (i.e., HTTPS).
  • inert: This is a library that helps you serve static files and directories from your Hapi.js API.
  • joi: This library introduces an object schema description language and a validator for JavaScript objects.
  • jwks-rsa: This library retrieves RSA public keys from a JWKS (JSON Web Key Set) endpoint.
  • lout: This library helps you create the API documentation for your Hapi.js backend.
  • node-env-file: This library parses and loads environment files into a Node.js environment (i.e., into the process.env object).
  • redis: This is a Redis client for Node.js applications.
  • uuid: This library generates RFC-compliant UUIDs in JavaScript.
  • vision: This library enables template rendering for Hapi.js.

Now that you know what you just installed, open the package.json file that NPM created for you and replace its scripts property with this:

"scripts": {
  "start": "node index.js"
}

Note: You might also want to initialize Git (or any other version control system) now and start committing your work. It’s always a good idea to use tools like Git to manage your source code.

Initializing Redis with Docker

As mentioned, you will bootstrap a Redis instance in your local machine with the help of Docker. Therefore, before proceeding you will have to install Docker locally. After installing it, you can test the installation by running the following command:

docker --version

If everything goes fine, you can issue this command to run Redis locally (in a Docker container, of course):

docker run --name nodejs-hapijs-redis \
    -p 6379:6379 \
    -d redis

If this is the first time you are running Redis locally with the help of Docker, this command will output Unable to find image 'redis:latest' locally in your terminal and will start downloading a Redis image from Docker Hub. For this article, you don’t need to learn how Docker works. Issuing the command above suffices for you to move along. However, after you finish with this article, make sure you learn more about Docker. The tool is amazing.

Signing Up to Auth0

To start with a secure backend from scratch, you will sign up for a free Auth0 account now (i.e., if you don’t have one yet) and you will configure your project to use this identity provider.

If you don’t know it, Auth0 is a global leader in Identity-as-a-Service (IDaaS) that provides thousands of enterprise customers with modern identity solutions. Alongside the classic username and password authentication process, Auth0 allows you to add features like Social Login, Multi-factor Authentication, and much more with just a few clicks.

So, after you sign up for Auth0, you can head to the APIs section of your dashboard and click on Create API. Then, on the dialog that Auth0 shows, you will have to provide a Name for your API (e.g., “Hapi.js Tutorial”) and an Identifier (e.g., http://localhost:3000). The name of your API is just a label so you can easily remember what the API is about. The identifier is a string that you will use while configuring your backend. This identifier doesn’t really have to be a URL, as Auth0 won’t ever call it, but it’s advisable to use one.

Creating a new Auth0 API for your Hapi.js backend.

After filling out the form, click on Create so Auth0 finishes the creation for you.

Creating an environment file

As this configuration is specific to your Auth0 account, you will keep it in a separate file so you can easily switch between production and testing environments. So, create a file called .env in your project root and put the following contents in it:

AUTH0_AUDIENCE=http://localhost:3000
AUTH0_DOMAIN=<YOUR_AUTH0_DOMAIN>
HOST=localhost
PORT=3000
REDIS_HOST=localhost
REDIS_PORT=6379
SSL=false

Replace <YOUR_AUTH0_DOMAIN> with the domain you chose while creating your Auth0 account (e.g., blog-samples.auth0.com). The other configuration variables will work in your local environment, unless you chose another identifier for your API. If that is the case, you will have to set the correct value to the AUTH0_AUDIENCE variable.

Note: The SSL variable above defines if your API will accept only requests through a secure channel (i.e., HTTPS) or not. This variable will be used by the hapi-require-https library that you installed before.

Creating the Hapi.js Server

With the environment variables properly defined, you will have to create a script to start your Hapi.js server. To do so, create a file called index.js in the project root (i.e., in the nodejs-hapijs-redis directory) and add the following code into it:

require('node-env-file')(`${__dirname}/.env`);

const redis = require('redis');
const createServer = require('./src/server');
const {promisify} = require('util');

const start = async () => {
  const server = await createServer(
    {
      port: process.env.PORT,
      host: process.env.HOST,
    },
    {
      enableSSL: process.env.SSL === 'true',
    }
  );

  const redisClient = redis.createClient(
    {
      host: process.env.REDIS_HOST,
      port: process.env.REDIS_PORT,
    }
  );

  redisClient.lpushAsync = promisify(redisClient.lpush).bind(redisClient);
  redisClient.lrangeAsync = promisify(redisClient.lrange).bind(redisClient);
  redisClient.llenAsync = promisify(redisClient.llen).bind(redisClient);
  redisClient.lremAsync = promisify(redisClient.lrem).bind(redisClient);
  redisClient.lsetAsync = promisify(redisClient.lset).bind(redisClient);

  redisClient.on("error", function (err) {
    console.error("Redis error.", err);
  });

  server.app.redis = redisClient;

  await server.start();

  console.log(`Server running at: ${server.info.uri}`);
  console.log(`Server docs running at: ${server.info.uri}/docs`);
};

process.on('unhandledRejection', (err) => {
  console.error(err);
  process.exit(1);
});

start();

As you can see, the first thing your script does is to load the environment variables you just defined. Then, it uses a function called createServer to, well, create a server. After that, the script creates a client to Redis and uses the promisify function provided by Node.js to make the functions provided by the client return JavaScript Promises (using promises, and the new async/await syntax, will make your life much easier). Also, you bind the Redis object to server.app.redis so you have access to it in the routes to store and retrieve data.

Perhaps you didn’t realize (or perhaps you did), but the createServer function used in the script above doesn’t exist yet. This function, as stated on line #4, is expected to be defined on a module called server in the src directory.

Therefore, you can create the src/server.js file and add the following code to it:

const Hapi = require('hapi');
const jwksRsa = require('jwks-rsa');

const validateFunc = async (decoded) => {
  return {
    isValid: true,
    credentials: decoded,
  };
};

module.exports = async (serverOptions, options) => {
  const server = Hapi.server(
    Object.assign({
      port: 3001,
      host: 'localhost',
      routes: {
        cors: {
          origin: ['*'],
        },
      },
    }, serverOptions),
  );

  // Redirect to SSL
  if (options.enableSSL) {
    console.log('Setting SSL');
    await server.register({plugin: require('hapi-require-https')});
  } else {
    console.log('Not setting SSL');
  }

  await server.register([
    require('vision'),
    require('inert'),
    {
      plugin: require('lout'),
      options: {
        endpoint: '/docs',
      },
    },
    {
      plugin: require('good'),
      options: {
        ops: {
          interval: 1000,
        },
        reporters: {
          consoleReporter: [
            {
              module: 'good-squeeze',
              name: 'Squeeze',
              args: [{response: '*'}],
            },
            {
              module: 'good-console',
            },
            'stdout',
          ],
        },
      },
    },
  ]);

  await server.register(require('hapi-auth-jwt2'));

  server.auth.strategy('jwt', 'jwt', {
    complete: true,
    key: jwksRsa.hapiJwt2KeyAsync({
      cache: true,
      rateLimit: true,
      jwksRequestsPerMinute: 5,
      jwksUri: `https://${process.env.AUTH0_DOMAIN}/.well-known/jwks.json`,
    }),
    verifyOptions: {
      audience: process.env.AUTH0_AUDIENCE,
      issuer: `https://${process.env.AUTH0_DOMAIN}/`,
      algorithms: ['RS256'],
    },
    validate: validateFunc,
  });

  server.auth.default('jwt');

  server.route(require('./routes.js'));

  return server;
};

The main export from this code is a function that creates and returns a valid Hapi.js server. This function starts by accepting arguments from the index.js file and creating the server. It provides some default configuration, like port and host, to make sure that everything works if the caller doesn’t specify these values, but it immediately overrides them with the ones provided by the caller (if any).

After creating the Hapi.js server (Hapi.server()), this script decides, based on the configuration passed, if it is going to use SSL or not. Then, the script configures the plugins you installed before (e.g., vision, inert, and lout) in your Hapi.js server.

Finally, the script secures the server by using the jwt strategy (server.auth.strategy('jwt', ...)) and by making it the default authentication method (server.auth.default('jwt')).

The function validateFunc (defined at the top of the script) is given users’ credentials and returns an object telling Hapi.js whether these users have access to the current resource or not. In this simple example, you allow all users access if they have a valid token, but you can be more restrictive by refactoring this function.
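As one hypothetical example of a more restrictive validateFunc, you could accept only tokens that carry a particular scope. The "read:todos" scope name below is made up for illustration; Auth0 access tokens expose granted scopes as a space-delimited string in the scope claim:

```javascript
// A hypothetical, stricter validateFunc: only tokens carrying the
// (made-up) "read:todos" scope are considered valid. The decoded
// argument is the already-verified JWT payload.
const validateFunc = async (decoded) => {
  const scopes = (decoded.scope || '').split(' ');
  return {
    isValid: scopes.includes('read:todos'),
    credentials: decoded,
  };
};
```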

The last thing this script does, besides returning the Hapi.js server instance, is define that the server will load its endpoints (also known as routes) from a module called routes. You will define this module in the next section.

Defining Routes on Hapi.js

Now, it is time to learn how to define endpoints (i.e., routes) in your Hapi.js server. In the server module, you called the server.route function, which accepts an array of routes for your server. As such, you could simply define these routes directly into the server module. However, to make the code more readable and organized, you will put each route in a different file.

To do so, create a file called src/routes.js and copy the following into it:

module.exports = [

  './routes/todo_get',
  './routes/todo_post',
  './routes/todo_delete',

].map((elem) => require(elem));

This code maps over each filename and returns an array of imported routes. As you can imagine, you still have to define these files and routes.

Defining a Route to Post new Items

For your first route, you will create an endpoint that enables users to add new items to their to-do lists. To do so, make a file called src/routes/todo_post.js with the following contents:

const Joi = require('joi');
const Boom = require('boom');

module.exports = {
  method: 'POST',
  path: '/todo',
  options: {
    auth: 'jwt',
    validate: {
      payload: {
        item: Joi.string().required().notes('Text to store in list')
      },
    },
    description: 'Add item',
    notes: 'Add an item to the list',
    tags: ['api'],
  },
  handler: async (request, h) => {
    let {sub: redispath} = request.auth.credentials;
    let {item: redisvalue} = request.payload;
    let {redis} = request.server.app;

    try {

      let count = await redis.lpushAsync(redispath, redisvalue);

      return h.response({
        count
      }).code(201);

    } catch (e) {
      return Boom.badImplementation(e);
    }
  }
};

The export from this file is a JSON object that represents a route for Hapi.js. The method and path properties tell Hapi.js what HTTP method and what route is required to call the handler code. In the options, you specify jwt as the authentication required to access this route. The description, notes, and tags document the route for others using it.

The validate object is an extremely useful courtesy of the joi library. It allows you to specify what inputs are required for the route; if these requirements are not met, Hapi.js automatically throws an error for you. All that is required for this route is an item that comes in the payload of requests. This item must be a string and is required (string().required()).

Finally, the handler runs your route and returns a value to Hapi.js. You use the JWT subject as the key for the Redis key-value pair, and the value of this key is the string sent by the user. You use the new promisified Redis functions to add the item to Redis, and you return the number of items in the array (with a 201 response code).

If anything goes wrong, your Hapi.js server will send an HTTP error code back using the Boom library.

Defining a Route to Delete Items

To allow users to delete items, create a file called src/routes/todo_delete.js with the following contents:

const Joi = require('joi');
const Boom = require('boom');

module.exports = {
  method: 'DELETE',
  path: '/todo',
  options: {
    auth: 'jwt',
    validate: {
      payload: {
        index: Joi.number().min(0).required().notes('Index to delete'),
      },
    },
    description: 'Delete item',
    notes: 'Delete an item from the todo list',
    tags: ['api'],
  },
  handler: async (request, h) => {
    let {sub: redispath} = request.auth.credentials;
    let {index: redisindex} = request.payload;
    let {redis} = request.server.app;

    try {
      await redis.lsetAsync(redispath, redisindex, '__DELETE__');
      await redis.lremAsync(redispath, 1, '__DELETE__');

      return h.response({}).code(200);
    } catch (e) {
      return Boom.badImplementation(e);
    }
  }
};

The route is very similar to the POST route. You define an HTTP DELETE endpoint with a required index parameter that identifies which value to delete from Redis. Since Redis has no command to delete a list item by its index, you first overwrite the value at that index with a sentinel, and then delete all entries holding that sentinel value.
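The overwrite-then-remove trick can be sketched on a plain JavaScript array (an illustration of the idea only, not a Redis call):

```javascript
// Why LSET + LREM deletes by index: the entry is first overwritten with a
// sentinel value, then every occurrence of that sentinel is removed.
function deleteAtIndex(list, index) {
  list[index] = '__DELETE__';                    // like LSET key index __DELETE__
  return list.filter((v) => v !== '__DELETE__'); // like LREM on the sentinel
}

console.log(deleteAtIndex(['a', 'b', 'c'], 1)); // → [ 'a', 'c' ]
```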

What is Hypertext Application Language (HAL)?

When you define your final route for retrieving the todo items, you will borrow some features from the HAL specification. This spec is designed to make it easy to traverse APIs without having to guess endpoints.

For your case, you will page the results when retrieving items, so you will include a link to the next page of results in the response. This way, the client applications that use your API won’t have to generate the links themselves.

Defining a Route to Get All Items

Finally, to define an endpoint where users will be able to get all their to-do items, create a file called src/routes/todo_get.js with the following contents:

const Joi = require('joi');
const Boom = require('boom');

module.exports = {
  method: 'GET',
  path: '/todo',
  options: {
    auth: 'jwt',
    validate: {
      query: {
        start: Joi.number().min(0).default(0).notes('Start index of results inclusive'),
        results: Joi.number().min(1).max(100).default(10).notes('Number of results to return'),
      },
    },
    description: 'Get items',
    notes: 'Get items from todo list paged',
    tags: ['api'],
  },
  handler: async (request, h) => {
    let {redis} = request.server.app;
    let {sub: redispath} = request.auth.credentials;
    let {start, results} = request.query;

    try {
      let value = await redis.lrangeAsync(redispath, start, start + (results - 1));
      let count = await redis.llenAsync(redispath);

      if (!value) value = [];

      return h.response({
        nextlink: `${request.url.pathname}?start=${start + results}&results=${results}`,
        value,
        count
      });
    } catch (e) {
      return Boom.badImplementation(e);
    }
  }
};

This module (or file) defines a GET HTTP route with two optional query string parameters (with default values set). By using these parameters, your client can specify the first element (start index) and the number of results they need. Note that this script gets the results from Redis and also the total number of results. This information is important so the client can display how many items the user has.

In the response, you add a nextlink property with the API URL to call for the next set of results.

Running and Using your Hapi.js API

That’s it! You just finished creating your Node.js backend API with the help of Hapi.js and Redis. With all these files in place, you can take your API for a spin. To do so, issue the following command on the terminal (just make sure you are in the correct directory: nodejs-hapijs-redis):

npm start

Then, if you go to the /docs resource, you will see the documentation of your Hapi.js API:

Hapi.js showing the documentation of the endpoints.

Now, to test if your endpoints are really secured, you can issue the following curl commands:

curl http://localhost:3000/todo

curl -X POST -H 'Content-Type: application/json' -d '{
  "item": "It should not work."
}' http://localhost:3000/todo

Both commands above should return the following response:

{
  "statusCode": 401,
  "error": "Unauthorized",
  "message": "Missing authentication"
}

That is, your server is telling you that it expects you to be authenticated somehow. The server doesn’t specify that it’s expecting an access token from Auth0 because you shouldn’t reveal details like that about your services. However, you know that this is what you need.

So, there are multiple ways to fetch a token from Auth0. The strategy that you will use will depend on what context you are in. For example, if you are on a Single Page Application (SPA), you will use what is called the Implicit Grant. If you are on a native, mobile application, you will use the Authorization Code Grant Flow with PKCE. However, for a simple test like this one, you can use your Auth0 dashboard to get one.

So, head back to the APIs section in your Auth0 dashboard, click on the API you created before, and then click on the Test section of this API. There, you will find a button called Copy Token. Click on this button to copy an access token to your clipboard.

Copying a test token from the Auth0 dashboard.

Then, with this token in your clipboard, go back to your terminal and execute the following commands:

# set a variable with your access token
ACCESS_TOKEN=<YOUR_ACCESS_TOKEN>

# use the token to insert an item
curl -X POST -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer '$ACCESS_TOKEN -d '{
  "item": "Learn about more about Docker, Auth0, and Redis."
}' http://localhost:3000/todo

Note: You will have to replace <YOUR_ACCESS_TOKEN> with the token copied from Auth0.

The second command, the one that issues an HTTP request with your token, will create a new item in your to-do list so you can remember that you have to “learn about more about Docker, Auth0, and Redis.” As the response to this request, your API will send this to you:

{
  "count": 1
}

This answer tells you that you have a single record on your to-do list right now, as you would expect. Now, to see this item, you can issue the following command:

# in the same terminal because you need $ACCESS_TOKEN
curl -H 'Authorization: Bearer '$ACCESS_TOKEN http://localhost:3000/todo

This command will output the following response from the Hapi.js API:

{
  "nextlink": "/todo?start=10&results=10",
  "value": ["Learn about more about Docker, Auth0, and Redis."],
  "count": 1
}

As you can see, your to-do item was properly inserted. Now, to remove this item, you can issue this command:

curl -X DELETE -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer '$ACCESS_TOKEN -d '{
  "index": 0
}' http://localhost:3000/todo

In this case, you are issuing a DELETE request with index equal to 0 so your API removes the first element from your to-do list. Cool, you just used your API for the first time!

Conclusion and Next Steps

In this article, you learned how to create modern APIs with Hapi.js, Node.js, and Redis. Also, you learned how to integrate your API with Auth0 to take advantage of the state-of-the-art security provided by this company. All of that, without struggling too much.

However, you wouldn’t expect end-users to use a REST API directly through the command-line interface or through generic HTTP clients like Postman, would you? As such, in the next article, you will learn how to create a Single Page Application to interact with your API. To create this application, you will use a modern approach based on web components and LitElement. Stay tuned!


#node-js #javascript #hapi-js
