PostgreSQL with Nodejs

Most applications need to persist data at some point, whether through files, local storage, cloud services or, very often, databases.

If you do not have particular requirements, relational database systems are usually a good default choice, and particularly PostgreSQL, a very powerful open source SQL server. Moreover, if you develop for the Nodejs platform you will not run short of tools and solutions.

So what would be the ideal library or tool for your next application?

If you are lucky enough to work for a company which has the resources to have its own dedicated database team, you will probably only have to use the abstractions they provide in your language of choice (Javascript in our case) to call the stored procedures they will have carefully crafted. In that case, you may see no SQL (Structured Query Language, the language used in many relational database systems) at all and won't leave your comfort zone. In practice, this case does not occur very often, and the truth is that many Nodejs developers, or even so-called full stack developers, know very little about SQL and will look for various abstractions in their favorite package repository.

Lately, I came across a very interesting article written by Thomas Hunter which describes the different levels of abstraction a database-related library may have. The idea was to dissuade you from using frameworks called ORMs (Object Relational Mapping) and to convince you to learn SQL. Of course, as often in software development, each kind of abstraction has its pros and cons, so let's quickly review what I learned in Thomas' article.

If you want more details you should probably read Thomas’ article and skip the sections below.

Drivers

Drivers provide almost no abstraction at all. They usually only handle connections and the communication protocol with the database. They may also parse the responses into adapted data structures (most of the time in Javascript these will be JSON objects and arrays). The de facto PostgreSQL driver for Nodejs is pg.

Drivers are normally very fast and the most flexible solution, as they let you write SQL queries yourself. On the other hand, it can quickly become tedious and dangerous to build dynamic queries by hand, as it involves a lot of string manipulation.
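
For instance, a minimal sketch with the pg driver might look like this (assuming a local database containing the posts table from the annexes; the database name here is a placeholder):

const { Pool } = require('pg');

const pool = new Pool({ database: 'blog' }); // hypothetical database name

async function findPostsByTitle(keyword) {
    // Parameterized queries ($1) let the driver handle escaping,
    // avoiding SQL injection and manual string concatenation.
    const { rows } = await pool.query(
        'SELECT * FROM posts WHERE title ILIKE $1',
        ['%' + keyword + '%']
    );
    return rows;
}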

Query builders

With query builders, instead of manipulating strings, you use functions whose API is usually close to the SQL grammar. In Javascript, functions compose very well with patterns such as partial application, decoration, composition, recursion, higher-order functions, etc.

They often let you create your own abstractions and solve many problems in an elegant way. They also provide a framework with more explicit constraints than strings can (especially if you use some sort of types on top of the normal signatures), which usually helps you avoid mistakes and fits better with your everyday tools (IDE and so on).

A very popular query builder library is knex.
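
As a hedged illustration (assuming the same posts and users tables as in the annexes, and placeholder connection settings), a knex query might look like this:

const knex = require('knex')({
    client: 'pg',
    connection: { database: 'blog' } // hypothetical connection settings
});

// The API mirrors the SQL grammar, and the builder is a regular value
// you can pass around, extend and compose.
const tenLatestPosts = () => knex('posts')
    .select('posts.*', 'users.first_name', 'users.last_name')
    .join('users', 'posts.user_id', 'users.user_id')
    .orderBy('published_at', 'desc')
    .limit(10);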

ORM

ORMs provide the highest level of abstraction, and such frameworks usually come with a lot of features. The promise is that you create a domain data model with idioms you know, such as classes and objects, and the framework translates it into a database structure along with convenient ways to query it through the models you have defined. The obvious advantage is that you can handle everything in your language of choice and with concepts familiar to you; the underlying database entities and queries stay very far from your application code. However, the way we think about an application data model sometimes does not quite fit the way we would design a database for that model: this is known as the object-relational impedance mismatch. Such libraries have generated passionate debates between their fans and the others. Of course, they definitely have their uses and the problem is not black and white, but there are recurrent criticisms:

  • When you learn an ORM (and this already takes many hours of work) you learn syntax and concepts you probably will not use outside of this particular ORM.
  • The SQL queries generated by an ORM may not be very efficient, in the sense that the ORM has to tackle specialized queries in a more generalist way. Moreover, performance can also degrade as the ORM will often not return raw data but model instances of it (in order to provide features like validation, lazy loading, etc.). You often trade performance for features.
  • ORMs cannot generate some (sometimes basic) queries, or get them wrong when used through their models. This problem is, however, easily worked around as they usually provide a low-level query interface to write the queries you want directly with the underlying driver.
  • You may not design the database yourself: the database schema can be generated for you, which may sometimes lead to very poor database design.

The most popular ORM for Nodejs is Sequelize.
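
To give an idea, a minimal Sequelize sketch for the users table from the annexes might look like this (recent versions of the library; the connection string is a placeholder):

const { Sequelize, DataTypes } = require('sequelize');
const sequelize = new Sequelize('postgres://localhost/blog');

// The model is declared with Javascript idioms; the ORM maps it to the table.
const User = sequelize.define('User', {
    user_id: { type: DataTypes.INTEGER, primaryKey: true, autoIncrement: true },
    email: { type: DataTypes.STRING, unique: true, allowNull: false },
    first_name: DataTypes.STRING,
    last_name: DataTypes.STRING
}, { tableName: 'users', timestamps: false });

// Somewhere in an async function: the ORM generates the SELECT for you
// and returns model instances rather than raw rows.
const renards = await User.findAll({ where: { last_name: 'Renard' } });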

I think we can do better…

Introducing Ship-Hold

Note: all the examples will use the database generated by the script in the annexes at the end of this article.

In the 1970s (yes, almost 50 years ago!), Edgar F. Codd created relational algebra. SQL became the most common implementation of the algebra and has evolved a lot to become one of the most powerful and battle-tested query languages. It now includes procedural capabilities and extensions (especially with PostgreSQL) which go beyond the scope of the algebra. Clearly, SQL is often neglected by application developers even though its capabilities are enormous.

Through products (joins) you can cross data from different tables and organize your result data in infinite ways, whereas with SQL functions you can aggregate data to format it the way you want. A single query, although quite verbose, is probably way more flexible than anything you could do in Javascript, even by composing small functions together.

You want to fetch all the posts, including a field author with the first name and the last name of the author:

SELECT 
    "posts".*, 
    concat_ws(' ', "users"."first_name", "users"."last_name") AS "author" 
FROM "posts" JOIN "users" USING ("user_id");

Actually, you would prefer to include all the author's data as a JSON object:

SELECT 
    "posts".*, 
    to_json("users".*) AS "author" 
FROM "posts" JOIN "users" USING ("user_id");

Or you want to know which five users have published the most comments:

WITH 
    "commenters" AS (SELECT COUNT(*), "user_id" FROM "comments" GROUP BY "user_id" ORDER BY "count" DESC LIMIT 5)
SELECT "users".* FROM "commenters" JOIN "users" USING ("user_id");

Doing the same things in Javascript, treating your tables as arrays, would require a great deal of code.

You are convinced and decide to learn SQL: great! What if you could write SQL queries in Javascript, with function calls, the way you would write them in pure SQL? Good news: with ship-hold you can.

Let’s say you want to find all the posts about SQL published after 2010 including their author’s data ordered by publication date:

const posts = await sh
    .select('posts.*',{
        value: toJson('"users".*'),
        as:'author'
    })
    .from('posts')
    .where('title','ilike','%sql%')
    .and('published_at','>',new Date(2010,0,1)) // months are 0-indexed in JS
    .join('users')
    .on('posts.user_id','"users"."user_id"')
    .orderBy('published_at','desc')
    .run();

And ship-hold query builders are regular functions, so you can add all the syntactic sugar you want. Even better, you can compose builders together as you would do with SQL subqueries. For example, you could get the same result as the third example with the following code:

// A higher-order function
const fromTable = table => (...args) => sh.select(...args).from(table);

const fromComments = fromTable('comments');

const fromUsers = fromTable('users');

// A subquery (count, like toJson above, is assumed to come from ship-hold's
// helpers). Note it counts rows of the comments table, as in the third example.
const getTopFiveCommenters = {
    value: fromComments(count('*'), 'user_id')
        .groupBy('user_id')
        .orderBy('count', 'desc')
        .limit(5),
    as: 'talkative'
};

const users = await fromUsers()
    .where('user_id', 'IN', sh.select('user_id').from(getTopFiveCommenters))
    .run();

It basically has the power of SQL with the expressiveness of Javascript functions!

Eager loading: a not-so-easy problem.

SQL is very powerful, but at some point it becomes quite complicated for a human being to write a query to get the expected result. It is particularly true when you want to fetch related data from different tables and start to apply pagination to your result set or to a sub-result set. That's one of the reasons ORMs exist: they can help you generate such queries.

Let's say you want to get users with the posts they have written, in case they have written some:

SELECT * FROM "users" LEFT JOIN "posts" USING ("user_id");

Ok, now you want the users whose last name is Renard, and you want only the first five Renards ordered alphabetically by first name. Of course, you still want any articles they may have written too.

SELECT * FROM "users" LEFT JOIN "posts" USING ("user_id") WHERE "last_name" = 'Renard' ORDER BY "first_name" LIMIT 5;

Is it correct? Actually, no: the limit will be applied to the join, and as multiple rows may correspond to the same user (one for each written article), with this query you'll be missing data.

SELECT * FROM (SELECT * FROM "users" WHERE "last_name" = 'Renard' ORDER BY "first_name" LIMIT 5) AS "users" LEFT JOIN "posts" USING ("user_id") ORDER BY "first_name";

This query would do the job, although the data parsed by the driver would then need to be aggregated: again, multiple rows may correspond to the same user!
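
To illustrate, the aggregation on the Javascript side would be a grouping pass over the joined rows, something like this sketch:

// rows: the joined rows parsed by the driver
const byUser = new Map();
for (const row of rows) {
    if (!byUser.has(row.user_id)) {
        byUser.set(row.user_id, { ...row, posts: [] });
    }
    // With a LEFT JOIN, the post columns may be null for users without posts
    if (row.post_id) {
        byUser.get(row.user_id).posts.push(row);
    }
}
const users = [...byUser.values()];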

Now you also want to paginate the included posts: instead of getting all the articles written by a given user, you want the three last published about Javascript. Ah, and you also want to include within these articles the five last published comments with their authors' data…

Ok, you get the point: it becomes quite complicated to write a query which will get you the right result.

Thankfully, ORMs and the like will help you get the right result. How they do it may vary from one library to another. Some will make multiple queries, buffer the results, then aggregate them programmatically. Others will inline the join queries, etc. Different strategies can get you the result, but more or less efficiently.

Ship-hold is not an ORM, but it gives you the ability to define services related to a particular table. A service simplifies the syntax of queries against a particular table, and it also gives you the ability to define relations between services. This greatly helps when writing queries with inclusions: you basically only have to compose builders as you would construct a sentence in English.

const Users = sh.service({
    table: 'users',
    primaryKey: 'user_id'
});

const Posts = sh.service({
    table: 'posts',
    primaryKey: 'post_id'
});

const Comments = sh.service({
    table: 'comments',
    primaryKey: 'comment_id'
});
Users.hasMany(Posts);
Posts.belongsTo(Users, 'user_id', 'author');

Users.hasMany(Comments);
Posts.hasMany(Comments);
Comments.belongsTo(Users, 'user_id', 'author');
Comments.belongsTo(Posts, 'post_id', 'article');

Then we can get the result for the very complicated request mentioned above:

// Five last comments with their authors' data
const fiveLastPublishedComments = Comments
    .select()
    .orderBy('published_at', 'desc')
    .limit(5)
    .include(Users);
// Three last Javascript posts
const threeLastPostAboutJavascript = Posts
    .select()
    .where('title', 'ilike', '%javascript%')
    .orderBy('published_at', 'desc')
    .limit(3)
    .include(fiveLastPublishedComments);

// finally
const renards = await Users
    .select()
    .where('last_name', 'Renard')
    .orderBy('first_name')
    .limit(5)
    .include(threeLastPostAboutJavascript)
    .run();

And done!

I dare you to do it so easily with another tool :)

About performance

You might wonder how well the generated queries perform. It is very difficult to compare data-access libraries, as the results may change quite a lot depending on the database itself: the indexes created, the amount of data, etc.

However, ship-hold tries to leverage the database features as much as possible, where a lot of ORMs duplicate the effort in their Javascript code base. A good example is aggregation. Where many ORMs will perform several queries, buffer the results (often with redundant data), then aggregate them programmatically, ship-hold asks the database to do the aggregation with its built-in mechanisms. It is also one of the reasons ship-hold is small compared to other libraries: the weight of the installed bundle is 771 kB, including the pg driver and other dependencies.
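
For instance, the kind of aggregation ship-hold can delegate to PostgreSQL might look like this sketch, which uses the built-in json_agg function to return each user with an array of posts in a single row (grouping by the primary key lets PostgreSQL accept "users".* in the select list):

SELECT "users".*, json_agg("posts".*) AS "posts"
FROM "users" LEFT JOIN "posts" USING ("user_id")
GROUP BY "users"."user_id";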

Database makers strive to improve performance, and usually write faster code in C++, with better algorithms than I personally could write on the JS side.

Any performance improvement in the database engine would therefore improve ship-hold. Moreover, join queries and subqueries carry a share of redundant data. If you do the aggregation at the database level, I suspect less data needs to go through the wire, which might improve performance compared to a solution where data is buffered on the JS side and cleaned/aggregated. This could be especially true if your database is on a remote server.

Nevertheless, you can have a look at the numbers of a sample application implemented with various popular libraries.

If you think ship-hold is worth it, please share the project link and star the repository or clap for this article. If you have any recommendations to improve it, don't hesitate to contribute. Thanks for reading.

Annexes

Database script

DROP TABLE IF EXISTS posts, users, comments, tags, posts_tags CASCADE;
CREATE TABLE users (
    user_id serial PRIMARY KEY,
    email VARCHAR(100) UNIQUE NOT NULL,
    biography TEXT,
    first_name VARCHAR(100) NOT NULL, 
    last_name VARCHAR(100) NOT NULL 
);

CREATE TABLE posts(
    post_id serial PRIMARY KEY,
    title VARCHAR NOT NULL,
    content TEXT,
    published_at TIMESTAMP,
    user_id integer REFERENCES users (user_id)
);

CREATE TABLE comments(
    comment_id serial PRIMARY KEY,
    content TEXT,
    published_at TIMESTAMP,
    user_id integer REFERENCES users (user_id),
    post_id integer REFERENCES posts (post_id)
);

CREATE TABLE tags(
    tag VARCHAR PRIMARY KEY,
    description TEXT
);

CREATE TABLE posts_tags(
    post_id INTEGER REFERENCES posts,
    tag VARCHAR REFERENCES tags,
    PRIMARY KEY (post_id, tag)
);

By Laurent Renard


Secure Node.js, Express.js and PostgreSQL API using Passport.js


The comprehensive step by step tutorial on building a secure Node.js, Express.js, Passport.js, and PostgreSQL RESTful web service. Previously, we have shown you a combination of Node.js, Express.js, and PostgreSQL in a tutorial. Now we will just add security to those RESTful web service endpoints. Of course, we will start this tutorial from scratch. We will use JWT for this Node.js, Express.js, Passport.js, and PostgreSQL tutorial.

The following tools, frameworks, and modules are required for this tutorial: Node.js with npm or yarn, the Express generator, Sequelize.js and Sequelize-CLI, Passport.js with passport-jwt, jsonwebtoken, bcrypt-nodejs, and a running PostgreSQL server.

We assume that you have installed a PostgreSQL server on your machine or can use your own remote server (we are using PostgreSQL 9.5.13). Also, you have installed Node.js on your machine and can run node, npm, or yarn commands in your terminal or command line. Next, check their versions by typing these commands in your terminal or command line.

node -v
v8.12.0
npm -v
6.4.1
yarn -v
1.10.1

Those are the versions we are using. Let's continue with the main steps.

1. Create Express.js Project and Install Required Modules

Open your terminal or node command line, then go to your projects folder. First, install the Express generator using this command.

sudo npm install express-generator -g

Next, create an Express.js app using this command.

express secure-node --view=ejs

This will create an Express.js project with the EJS view template instead of the Jade view template because of the '--view=ejs' parameter. Next, go to the newly created project folder then install the node modules.

cd secure-node && npm install

You should see the folder structure like this.

There's no view yet when using the latest Express generator. We don't need one anyway because we will create a RESTful API.

2. Add and Configure Sequelize.js Module and Dependencies

Before installing the modules for this project, first install Sequelize-CLI by typing this command.

sudo npm install -g sequelize-cli

To install Sequelize.js module, type this command.

npm install --save sequelize

Then install the module for PostgreSQL.

npm install --save pg pg-hstore

Next, create a new file at the root of the project folder.

touch .sequelizerc

Open and edit that file, then add these lines of code.

const path = require('path');

module.exports = {
  "config": path.resolve('./config', 'config.json'),
  "models-path": path.resolve('./models'),
  "seeders-path": path.resolve('./seeders'),
  "migrations-path": path.resolve('./migrations')
};

That file tells the Sequelize initialization to generate the config, models, seeders, and migrations files in specific directories. Next, type this command to initialize Sequelize.

sequelize init

That command will create config/config.json, models/index.js, migrations and seeders directories and files. Next, open and edit config/config.json then make it like this.

{
  "development": {
    "username": "djamware",
    "password": "[email protected]@r3",
    "database": "secure_node",
    "host": "127.0.0.1",
    "dialect": "postgres"
  },
  "test": {
    "username": "root",
    "password": "[email protected]@r3",
    "database": "secure_node",
    "host": "127.0.0.1",
    "dialect": "postgres"
  },
  "production": {
    "username": "root",
    "password": "[email protected]@r3",
    "database": "secure_node",
    "host": "127.0.0.1",
    "dialect": "postgres"
  }
}

We use the same configuration for all the environments because we are using the same machine, server, and database for this tutorial.

Before running and testing the connection, make sure you have created the database as described in the above configuration. You can use the psql command to create a user and database.

psql postgres -U postgres

Next, type these commands to create a new user with a password, then give it access to create databases.

postgres=# CREATE ROLE djamware WITH LOGIN PASSWORD '[email protected]@r3';
postgres=# ALTER ROLE djamware CREATEDB;

Quit psql, then log in again using the new user that we previously created.

postgres=# \q
psql postgres -U djamware

Enter the password, then you will enter this psql console.

psql (9.5.13)
Type "help" for help.

postgres=>

Type this command to create a new database.

postgres=> CREATE DATABASE secure_node;

Then give that new user privileges on the new database, then quit psql.

postgres=> GRANT ALL PRIVILEGES ON DATABASE secure_node TO djamware;
postgres=> \q

3. Create or Generate Models and Migrations

We will use Sequelize-CLI to generate the new models. Type these commands to create a Product model and a User model for authentication.

sequelize model:create --name Product --attributes prod_name:string,prod_desc:string,prod_price:float
sequelize model:create --name User --attributes username:string,password:string

Those commands create a model file in the models folder and a migration file in the migrations folder for each model. Next, modify models/user.js and import this module.

var bcrypt = require('bcrypt-nodejs');
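
Note that bcrypt-nodejs is not installed by the generator; assuming you are following along from scratch, you will likely need to add it, together with the authentication modules used later in this tutorial:

npm install --save bcrypt-nodejs jsonwebtoken passport passport-jwt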

Add the new methods to the User model, so the user.js class will look like this.

module.exports = (sequelize, DataTypes) => {
  const User = sequelize.define('User', {
    username: DataTypes.STRING,
    password: DataTypes.STRING
  }, {});
  User.beforeSave((user, options) => {
    if (user.changed('password')) {
      user.password = bcrypt.hashSync(user.password, bcrypt.genSaltSync(10), null);
    }
  });
  User.prototype.comparePassword = function (passw, cb) {
    bcrypt.compare(passw, this.password, function (err, isMatch) {
        if (err) {
            return cb(err);
        }
        cb(null, isMatch);
    });
  };
  User.associate = function(models) {
    // associations can be defined here
  };
  return User;
};

For models/product.js there's no action needed; leave it as the default generated model class.
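
For reference, the generated models/product.js should look roughly like this (the exact output may vary slightly with your Sequelize-CLI version):

module.exports = (sequelize, DataTypes) => {
  const Product = sequelize.define('Product', {
    prod_name: DataTypes.STRING,
    prod_desc: DataTypes.STRING,
    prod_price: DataTypes.FLOAT
  }, {});
  Product.associate = function(models) {
    // associations can be defined here
  };
  return Product;
};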

4. Create Routers for RESTful Web Service and Authentication

To authenticate users and secure the resources or endpoints, create this file as a router.

touch routes/api.js

Open and edit routes/api.js, then declare all the required variables.

const express = require('express');
const jwt = require('jsonwebtoken');
const passport = require('passport');
const router = express.Router();
require('../config/passport')(passport);
const Product = require('../models').Product;
const User = require('../models').User;
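
Note that require('../config/passport') refers to a file that is not shown in this excerpt. A minimal sketch of config/passport.js, assuming the passport-jwt module and the same 'nodeauthsecret' secret used to sign tokens below, might look like this:

// config/passport.js (a hypothetical reconstruction)
const { Strategy: JwtStrategy, ExtractJwt } = require('passport-jwt');
const User = require('../models').User;

module.exports = function (passport) {
  const opts = {
    // Tokens are sent as 'JWT <token>' in the Authorization header below
    jwtFromRequest: ExtractJwt.fromAuthHeaderWithScheme('jwt'),
    secretOrKey: 'nodeauthsecret'
  };
  passport.use(new JwtStrategy(opts, (jwtPayload, done) => {
    // The token payload is the serialized user, so it carries the user id
    User.findOne({ where: { id: jwtPayload.id } })
      .then(user => done(null, user || false))
      .catch(err => done(err, false));
  }));
};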

Create a route for signing up or registering a new user.

router.post('/signup', function(req, res) {
  console.log(req.body);
  if (!req.body.username || !req.body.password) {
    res.status(400).send({msg: 'Please pass username and password.'})
  } else {
    User
      .create({
        username: req.body.username,
        password: req.body.password
      })
      .then((user) => res.status(201).send(user))
      .catch((error) => {
        console.log(error);
        res.status(400).send(error);
      });
  }
});

Create a route for signing in or logging in with a username and password.

router.post('/signin', function(req, res) {
  User
      .find({
        where: {
          username: req.body.username
        }
      })
      .then((user) => {
        if (!user) {
          return res.status(401).send({
            message: 'Authentication failed. User not found.',
          });
        }
        user.comparePassword(req.body.password, (err, isMatch) => {
          if(isMatch && !err) {
            var token = jwt.sign(JSON.parse(JSON.stringify(user)), 'nodeauthsecret', {expiresIn: 86400 * 30});
            jwt.verify(token, 'nodeauthsecret', function(err, data){
              console.log(err, data);
            })
            res.json({success: true, token: 'JWT ' + token});
          } else {
            res.status(401).send({success: false, msg: 'Authentication failed. Wrong password.'});
          }
        })
      })
      .catch((error) => res.status(400).send(error));
});

Create secure routes to get and post product data.

router.get('/product', passport.authenticate('jwt', { session: false}), function(req, res) {
  var token = getToken(req.headers);
  if (token) {
    Product
      .findAll()
      .then((products) => res.status(200).send(products))
      .catch((error) => { res.status(400).send(error); });
  } else {
    return res.status(403).send({success: false, msg: 'Unauthorized.'});
  }
});

router.post('/product', passport.authenticate('jwt', { session: false}), function(req, res) {
  var token = getToken(req.headers);
  if (token) {
    Product
      .create({
        prod_name: req.body.prod_name,
        prod_desc: req.body.prod_desc,
        prod_price: req.body.prod_price
      })
      .then((product) => res.status(201).send(product))
      .catch((error) => res.status(400).send(error));
  } else {
    return res.status(403).send({success: false, msg: 'Unauthorized.'});
  }
});

Create a function to extract the token.

const getToken = function (headers) {
  if (headers && headers.authorization) {
    var parted = headers.authorization.split(' ');
    if (parted.length === 2) {
      return parted[1];
    } else {
      return null;
    }
  } else {
    return null;
  }
};

Finally, export the router as a module.

module.exports = router;
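
Before the API can be reached, the router also has to be registered in app.js under the /api prefix used in the tests below, and the tables created from the migrations. Assuming the default generated app.js, that would look something like this:

// in app.js
var api = require('./routes/api');
app.use('/api', api);

Then run the migrations so the Users and Products tables exist:

sequelize db:migrate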

5. Run and Test Secure Node.js, Express.js, Passport.js, and PostgreSQL Web Service

To run and test this secure Node.js, Express.js, Passport.js, and PostgreSQL web service, run the PostgreSQL instance first, then run this command from the terminal.

nodemon

or

npm start

To test the secure Product endpoint, open Postman, then send a GET request to localhost:3000/api/product with all the required fields filled in.

You should get the response message Unauthorized and status code 401. Next, test the signup using Postman by changing the method to POST, adding the address localhost:3000/api/signup, adding the header Content-Type with the value application/json, and the request body as raw text like this.

{ "username":"[email protected]", "password":"qqqq1111" }

You should get this response when it executes successfully.

Next, test logging in with the above registered username and password by changing the URL to localhost:3000/api/signin. You should get this response when it executes successfully.

Now, you can go back to using the previous GET method with an additional Authorization header using the token obtained from the sign-in response. You should see the Product data like below.

That's it: the secure Node.js, Express.js, Passport.js, and PostgreSQL web service. You can get the working source code from our GitHub.


Postgres.js: Fastest full featured PostgreSQL client for Node.js


Getting started

Install

$ npm install postgres

Use

const postgres = require('postgres')

const sql = postgres({ ...options }) // will default to the same as psql

await sql`
  select name, age from users
`
// > [{ name: 'Murray', age: 68 }, { name: 'Walter', age: 78 }]

Connection options postgres([url], [options])

You can use either a postgres:// url connection string or the options to define your database connection properties. Options in the object will override any present in the url.

const sql = postgres('postgres://username:password@host:port/database', {
  host        : '',         // Postgres ip address or domain name
  port        : 5432,       // Postgres server port
  path        : '',         // unix socket path (usually '/tmp')
  database    : '',         // Name of database to connect to
  username    : '',         // Username of database user
  password    : '',         // Password of database user
  ssl         : false,      // True, or options for tls.connect
  max         : 10,         // Max number of connections
  timeout     : 0,          // Idle connection timeout in seconds
  types       : [],         // Array of custom types, see more below
  onnotice    : fn,         // Defaults to console.log
  onparameter : fn,         // (key, value) when server param change
  debug       : fn,         // Is called with (connection, query, parameters)
  transform   : {
    column            : fn, // Transforms incoming column names
    value             : fn, // Transforms incoming row values
    row               : fn  // Transforms entire rows
  },
  connection  : {
    application_name  : 'postgres.js', // Default application_name
    ...                                // Other connection parameters
  }
})

More info for the ssl option can be found in the Node.js docs for tls connect options

Query sql -> Promise

A query will always return a Promise which resolves to either an array [...] or null depending on the type of query. Destructuring is great to immediately access the first element.

const [new_user] = await sql`
  insert into users (
    name, age
  ) values (
    'Murray', 68
  )

  returning *
`

// new_user = { user_id: 1, name: 'Murray', age: 68 }

Query parameters

Parameters are automatically inferred and handled by Postgres so that SQL injection isn't possible. No special handling is necessary; simply use JS tagged template literals as usual.

let search = 'Mur'

const users = await sql`
  select 
    name, 
    age 
  from users
  where 
    name like ${ search + '%' }
`

// users = [{ name: 'Murray', age: 68 }]

Stream sql .stream(fn) -> Promise

If you want to handle rows returned by a query one by one you can use .stream which returns a promise that resolves once there are no more rows.

await sql`
  select created_at, name from events
`.stream(row => {
  // row = { created_at: '2019-11-22T14:22:00Z', name: 'connected' }
})

// No more rows

Listen and notify

When you call listen, a dedicated connection will automatically be made to ensure that you receive notifications in realtime. This connection will be used for any further calls to listen.

sql.listen('news', payload => {
  const json = JSON.parse(payload)
  console.log(json.this) // logs 'is'
})

Notify can be done as usual in sql, or by using the sql.notify method.

sql.notify('news', JSON.stringify({ no: 'this', is: 'news' }))

Dynamic query helpers sql() inside tagged template

Postgres.js has a safe, ergonomic way to aid you in writing queries. This makes it easier to write dynamic inserts, selects, updates and where queries.

Insert

const user = {
  name: 'Murray',
  age: 68
}

sql`
  insert into users ${
    sql(user)
  }
`

Is translated into a safe query like this:

insert into users (name, age) values ($1, $2)

Multiple inserts in one query

If you need to insert multiple rows at the same time it's also much faster to do it with a single insert. Simply pass an array of objects to sql().

const users = [{
  name: 'Murray',
  age: 68,
  garbage: 'ignore'
}, {
  name: 'Walter',
  age: 78
}]

sql`
  insert into users ${
    sql(users, 'name', 'age')
  }
`

Arrays sql.array(Array)

Postgres has a native array type which is similar to js arrays, but Postgres only allows the same type and shape for nested items. This method automatically infers the item type and translates js arrays into Postgres arrays.

const types = sql`
  insert into types (
    integers,
    strings,
    dates,
    buffers,
    multi
  ) values (
    ${ sql.array([1,2,3,4,5]) },
    ${ sql.array(['Hello', 'Postgres']) },
    ${ sql.array([new Date(), new Date(), new Date()]) },
    ${ sql.array([Buffer.from('Hello'), Buffer.from('Postgres')]) },
    ${ sql.array([[[1,2],[3,4]],[[5,6],[7,8]]]) }
  )
`

JSON sql.json(object)

const body = { hello: 'postgres' }

const [{ body: json }] = await sql`
  insert into json (
    body
  ) values (
    ${ sql.json(body) }
  )
  returning body
`

// json = { hello: 'postgres' }

File query sql.file(path, [args], [options]) -> Promise

Using an sql file for a query. The contents will be cached in memory so that the file is only read once.

sql.file(path.join(__dirname, 'query.sql'), [], {
  cache: true // Default true - disable for single shot queries or memory reasons
})

Transactions

BEGIN / COMMIT sql.begin(fn) -> Promise

Calling begin with a function will return a Promise which resolves with the returned value from the function. The function provides a single argument which is sql with a context of the newly created transaction. BEGIN is automatically called, and if the Promise fails ROLLBACK will be called. If it succeeds COMMIT will be called.

const [user, account] = await sql.begin(async sql => {
  const [user] = await sql`
    insert into users (
      name
    ) values (
      'Alice'
    )
  `

  const [account] = await sql`
    insert into accounts (
      user_id
    ) values (
      ${ user.user_id }
    )
  `

  return [user, account]
})

SAVEPOINT sql.savepoint([name], fn) -> Promise

sql.begin(async sql => {
  const [user] = await sql`
    insert into users (
      name
    ) values (
      'Alice'
    )
  `

  const [account] = (await sql.savepoint(sql => 
    sql`
      insert into accounts (
        user_id
      ) values (
        ${ user.user_id }
      )
    `
  ).catch(err => {
    // Account could not be created. ROLLBACK SAVEPOINT is called because we caught the rejection.
  })) || []

  return [user, account]
})
.then(([user, account]) => {
  // great success - COMMIT succeeded
})
.catch(() => {
  // not so good - ROLLBACK was called
})

Do note that you can often achieve the same result using WITH queries (Common Table Expressions) instead of using transactions.

Types

You can add ergonomic support for custom types, or simply pass an object with a { type, value } signature that contains the Postgres oid for the type and the correctly serialized value.

Adding Query helpers is the recommended approach which can be done like this:

const sql = postgres({
  types: {
    rect: {
      to        : 1337,
      from      : [1337],
      serialize : ({ x, y, width, height }) => [x, y, width, height],
      parse     : ([x, y, width, height]) => ({ x, y, width, height })
    }
  }
})

const [custom] = sql`
  insert into rectangles (
    name,
    rect
  ) values (
    'wat',
    ${ sql.types.rect({ x: 13, y: 37, width: 42, height: 80 }) }
  )
  returning *
`

// custom = { name: 'wat', rect: { x: 13, y: 37, width: 42, height: 80 } }

Teardown / Cleanup

To ensure proper teardown and cleanup on server restarts use sql.end({ timeout: null }) before process.exit()

Calling sql.end() will reject new queries and return a Promise which resolves when all queries are finished and the underlying connections are closed. If a timeout is provided any pending queries will be rejected once the timeout is reached and the connections will be destroyed.

Sample shutdown using Prexit

import prexit from 'prexit'

prexit(async () => {
  await sql.end({ timeout: 5 })
  await new Promise(r => server.close(r))
})

Unsafe queries sql.unsafe(query, [args], [options]) -> promise

If you know what you're doing, you can use unsafe to pass any string you'd like to postgres. Please note that this can lead to sql injection if you're not careful.

sql.unsafe('select ' + danger + ' from users where id = ' + dragons)

Errors

Errors are all thrown to related queries and never globally. Errors coming from Postgres itself are always in the native Postgres format, and the same goes for any Node.js errors, e.g. those coming from the underlying connection.

There are also the following errors specifically for this library.

MESSAGE_NOT_SUPPORTED

X (X) is not supported

Whenever a message is received from Postgres which is not supported by this library. Feel free to file an issue if you think something is missing.

MAX_PARAMETERS_EXCEEDED

Max number of parameters (65534) exceeded

The Postgres protocol doesn't allow more than 65534 (16-bit) parameters. If you run into this issue there are various workarounds, such as using sql([...]) to escape values instead of passing them as parameters.

SASL_SIGNATURE_MISMATCH

Message type X not supported

When using SASL authentication the server responds with a signature at the end of the authentication flow which needs to match the one on the client. This is to avoid man in the middle attacks. If you receive this error the connection was canceled because the server did not reply with the expected signature.

NOT_TAGGED_CALL

Query not called as a tagged template literal

Making queries has to be done using the sql function as a tagged template. This is to ensure parameters are serialized and passed to Postgres as query parameters with correct types and to avoid SQL injection.
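
For instance, a plain function call throws this error, while the tagged form sends the value as a query parameter (a hedged illustration):

// Wrong: a regular function call, parameters cannot be inferred
// sql('select name from users where id = ' + id)

// Right: a tagged template literal, id becomes a query parameter
await sql`select name from users where id = ${ id }`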

AUTH_TYPE_NOT_IMPLEMENTED

Auth type X not implemented

Postgres supports many different authentication types. This one is not supported.

CONNECTION_CLOSED

write CONNECTION_CLOSED host:port

This error is thrown if the connection was closed without an error. This should not happen during normal operation, so please create an issue if this was unexpected.

CONNECTION_ENDED

write CONNECTION_ENDED host:port

This error is thrown if the user has called sql.end() and performed a query afterwards.

CONNECTION_DESTROYED

write CONNECTION_DESTROYED host:port

This error is thrown for any queries that were pending when the timeout to sql.end({ timeout: X }) was reached.

How to Use Express.js, Node.js and MongoDB.js

In this post, I will show you how to use Express.js, Node.js and MongoDB.js. We will be creating a very simple Node application, that will allow users to input data that they want to store in a MongoDB database. It will also show all items that have been entered into the database.

Creating a Node Application

To get started I would recommend creating a new directory that will contain our application. For this demo I am creating a directory called node-demo. After creating the directory you will need to change into that directory.

mkdir node-demo
cd node-demo

Once we are in the directory we will need to create an application and we can do this by running the command

npm init

This will ask you a series of questions. Here are the answers I gave to the prompts.

The first step is to create a file that will contain our code for our Node.js server.

touch app.js

In our app.js we are going to add the following code to build a very simple Node.js Application.

var express = require("express");
var app = express();
var port = 3000;
 
app.get("/", (req, res) => {
  res.send("Hello World");
});
 
app.listen(port, () => {
  console.log("Server listening on port " + port);
});

What the code does is require the Express.js module. It then creates the app by calling express(). We define our port to be 3000.

The app.get line will listen to requests from the browser and will return the text “Hello World” back to the browser.

The last line actually starts the server and tells it to listen on port 3000.

Installing Express

Our app.js required the Express.js module. We need to install express in order for this to work properly. Go to your terminal and enter this command.

npm install express --save

This command will install the express module into our package.json. The module is installed as a dependency in our package.json as shown below.

To test our application you can go to the terminal and enter the command

node app.js

Open up a browser and navigate to the url http://localhost:3000

You will see the following in your browser

Creating Website to Save Data to MongoDB Database

Instead of showing the text “Hello World” when people view your application, what we want to do is to show a place for users to save data to the database.

We are going to allow users to enter a first name and a last name that we will be saving in the database.

To do this we will need to create a basic HTML file. In your terminal enter the following command to create an index.html file.

touch index.html

In our index.html file we will be creating input fields where users can enter data that they want to have stored in the database. We will also need a button for users to click on that will add the data to the database.

Here is what our index.html file looks like.

<!DOCTYPE html>
<html>
  <head>
    <title>Intro to Node and MongoDB</title>
  </head>

  <body>
    <h1>Intro to Node and MongoDB</h1>
    <form method="post" action="/addname">
      <label>Enter Your Name</label><br>
      <input type="text" name="firstName" placeholder="Enter first name..." required>
      <input type="text" name="lastName" placeholder="Enter last name..." required>
      <input type="submit" value="Add Name">
    </form>
  </body>
</html>

If you are familiar with HTML, you will not find anything unusual in our code for our index.html file. We are creating a form where users can input their first name and last name and then click an “Add Name” button.

The form will do a post call to the /addname endpoint. We will be talking about endpoints and post later in this tutorial.

Displaying our Website to Users

We were previously displaying the text “Hello World” to users when they visited our website. Now we want to display the HTML file that we created. To do this we will need to change the app.get route in our app.js file.

We will be using the sendFile command to show the index.html file. We will need to tell the server exactly where to find the index.html file. We can do that by using a Node global called __dirname. The __dirname variable provides the current directory where the command was run. We will then append the path to our index.html file.

The app.get route will need to be changed to

app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html");
});

Once you have saved your app.js file, we can test it by going to the terminal and running node app.js

Open your browser and navigate to “http://localhost:3000”. You will see the following

Connecting to the Database

Now we need to add our database to the application. We will be connecting to a MongoDB database. I am assuming that you already have MongoDB installed and running on your computer.

To connect to the MongoDB database we are going to use a module called Mongoose. We will need to install the mongoose module just like we did with express. Go to your terminal and enter the following command.

npm install mongoose --save

This will install the mongoose module and add it as a dependency in our package.json.

Connecting to the Database

Now that we have the mongoose module installed, we need to connect to the database in our app.js file. MongoDB, by default, runs on port 27017. You connect to the database by telling it the location of the database and the name of the database.

In our app.js file, after the line that defines the port and before the app.get route, enter the following lines to get access to mongoose and to connect to the database. For the database, I am going to use “node-demo”.

var mongoose = require("mongoose");
mongoose.Promise = global.Promise;
mongoose.connect("mongodb://localhost:27017/node-demo");

Creating a Database Schema

Once the user enters data in the input field and clicks the add button, we want the contents of the input field to be stored in the database. In order to know the format of the data in the database, we need to have a Schema.

For this tutorial, we will need a very simple Schema that has only two fields. I am going to call the fields firstName and lastName. The data stored in both fields will be a String.

After connecting to the database in our app.js we need to define our Schema. Here are the lines you need to add to the app.js.

var nameSchema = new mongoose.Schema({
  firstName: String,
  lastName: String
});

Once we have built our Schema, we need to create a model from it. I am going to call my model “User”. Here is the line you will add next to create our model.

var User = mongoose.model("User", nameSchema);

Creating RESTful API

Now that we have a connection to our database, we need to create the mechanism by which data will be added to the database. This is done through our REST API. We will need to create an endpoint that will be used to send data to our server. Once the server receives this data then it will store the data in the database.

An endpoint is a route that our server will be listening to to get data from the browser. We already have one route that we have created already in the application and that is the route that is listening at the endpoint “/” which is the homepage of our application.

HTTP Verbs in a REST API

The communication between the client(the browser) and the server is done through an HTTP verb. The most common HTTP verbs are
GET, PUT, POST, and DELETE.

The following table explains what each HTTP verb does.

HTTP Verb Operation
GET Read
POST Create
PUT Update
DELETE Delete

As you can see from these verbs, they form the basis of CRUD operations that I talked about previously.

Building a CRUD endpoint

If you remember, the form in our index.html file used a post method to call this endpoint. We will now create this endpoint.

In our previous endpoint we used a “GET” http verb to display the index.html file. We are going to do something very similar but instead of using “GET”, we are going to use “POST”. To get started this is what the framework of our endpoint will look like.

app.post("/addname", (req, res) => {
 
});

Express Middleware

To fill out the contents of our endpoint, we want to store the firstName and lastName entered by the user into the database. The values for firstName and lastName are in the body of the request that we send to the server. We want to capture that data, convert it to JSON and store it into the database.

Express.js version 4 removed all bundled middleware. To parse the data in the body of the request we will need to add middleware to our application to provide this functionality. We will be using the body-parser module. We need to install it, so in your terminal window enter the following command.

npm install body-parser --save

Once it is installed, we will need to require this module and configure it. The configuration will allow us to pass the data for firstName and lastName in the body to the server. It can also convert that data into JSON format. This will be handy because we can take this formatted data and save it directly into our database.

To add the body-parser middleware to our application and configure it, we can add the following lines directly after the line that sets our port.

var bodyParser = require('body-parser');
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

Saving data to database

Mongoose provides a save function that will take a JSON object and store it in the database. Our body-parser middleware will convert the user's input into the JSON format for us.

To save the data into the database, we need to create a new instance of the model that we created earlier. We will pass into this instance the user's input. Once we have it, we just need to call the save method on it.

Mongoose will return a promise on a save to the database. A promise is what is returned when the save to the database completes. This save will either finish successfully or it will fail. A promise provides two methods that will handle both of these scenarios.

If the save to the database was successful it will return to the .then segment of the promise. In this case we want to send text back to the user to let them know the data was saved to the database.

If it fails it will return to the .catch segment of the promise. In this case, we want to send text back to the user telling them the data was not saved to the database. It is best practice to also change the statusCode that is returned from the default 200 to a 400. A 400 statusCode signifies that the operation failed.

Now putting all of this together here is what our final endpoint will look like.

app.post("/addname", (req, res) => {
  var myData = new User(req.body);
  myData.save()
    .then(item => {
      res.send("item saved to database");
    })
    .catch(err => {
      res.status(400).send("unable to save to database");
    });
});

Testing our code

Save your code. Go to your terminal and enter the command node app.js to start our server. Open up your browser and navigate to the URL “http://localhost:3000”. You will see our index.html file displayed to you.

Make sure you have mongo running.

Enter your first name and last name in the input fields and then click the “Add Name” button. You should get back text that says the name has been saved to the database like below.

Access to Code

The final version of the code is available in my GitHub repo. To access the code click here. Thank you for reading!