React, NodeJS, Express & MongoDB - The MERN Stack Guide for Beginners

ReactJS and NodeJS, together with ExpressJS & MongoDB, form the very popular MERN stack! Building fullstack applications (i.e. frontend + backend) with the MERN stack is very popular - in this course, you will learn it from scratch through the example of a complete project!

MERN stands for MongoDB, Express.js, React.js and Node.js - and combined, these four technologies allow you to build amazing web applications.

NodeJS API Development with Express MongoDB and Mongoose

MongoDB is a NoSQL, document-oriented database. It’s popular in the Node.js community and a viable database solution for building real-world applications.

MongoDB is different from traditional SQL databases like MySQL and PostgreSQL in that data is stored in binary, JSON-like documents called BSON. This structure lends itself well to building JavaScript applications that communicate with JSON. Additionally, MongoDB has a flexible schema, which means there are no database migrations to worry about, and data models can grow and change.

In this tutorial, we’re going to set up a Node.js server application, connect it to MongoDB and demonstrate how relationships work between MongoDB Collections. In the table below (provided by MongoDB) you’ll see how traditional aspects of SQL databases stack up against their MongoDB equivalents. You can find the whole source code for this tutorial in this GitHub repo.

In SQL databases, we get database relationships using joins. For example, if we had a SQL database with two tables, books and authors, we could get all the books that belong to an author like so:

SELECT b.id AS 'Book ID',
       b.title AS 'Book Title',
       a.name AS 'Author Name',
       a.id AS 'Author ID'
FROM books b
JOIN authors a ON b.author_id = a.id
WHERE a.id = 1234;

This will grab information from both tables and display the results in a single dataset for us. Frameworks like Ruby on Rails and Laravel have abstracted this functionality for developers, making it possible to grab related information with a little Ruby or PHP.

In Ruby on Rails, using Active Record, finding an author and their related books could look like:

author_with_books = Author.find 1234, :include => [:books]

In Laravel, using Eloquent, we could do:

$authorWithBooks = Author::find(1234)->books();

These results would give us the author with id 1234 and all the books that they've written. On the books table, we'd store an author_id, setting up the relationship between authors and books in the SQL world. MongoDB doesn't use joins though, so how do we achieve this functionality?

There is a helper npm package for working with MongoDB called Mongoose that we’re going to use for illustrative purposes in this tutorial. Mongoose is an ODM (Object Document Mapper): a helper for MongoDB, much like ActiveRecord and Eloquent are helpers for working with relational data.

Create Database Models with Mongoose.js

The first thing to do is set up our models in Mongoose. These schemas are flexible but help us define what we want our data to look like.

For the author model, we define a model schema that can reference documents in another collection:

const mongoose = require('mongoose');
const authorModel = mongoose.Schema({
  name: { 
   type: String, 
   required: '{PATH} is required!'
  },
  bio: {
   type: String
  },
  website: {
   type: String
  },
  books: [
    { type: mongoose.Schema.Types.ObjectId, ref: 'Book' }
  ]
}, {
  timestamps: true
});
module.exports = mongoose.model('Author', authorModel);

In the above model, we define that documents in the authors MongoDB collection have a name, bio, website and an array of books. Each element in the books array references a book id in the books collection; we’ll define that model below. The second argument, `timestamps: true`, adds "created at" and "updated at" fields when we create author records.

The Books schema models what our book data will look like. The schema has a reference to find the id of an associated author. In this example, I’m saying that a book is written by only one author, though in the real world that’s not always the case! Here’s what a belongs-to relationship could look like using Mongoose.js:

const mongoose = require('mongoose');
const bookModel = mongoose.Schema({
  title: { 
    type: String, 
    required: '{PATH} is required!'
  },
  subtitle: {
    type: String
  },
  author: { 
    type: mongoose.Schema.Types.ObjectId, 
    ref: 'Author' 
  }
}, {
  timestamps: true
});
module.exports = mongoose.model('Book', bookModel);

Instead of an array of authors, the book references a single author id as the author of the book. We’re using timestamps again for the “created at” and “updated at” fields.

In the root models directory, I added an index to register the models:

module.exports = {
  'Author': require('./Author'),
  'Book': require('./Book'),
};

Register Routes to Return JSON From Express 4

Now that we have the author and book models defined, it’s time to return and show the data via a JSON API. For that, I set up a controller for Authors called AuthorsController and one for Books called BooksController. The controllers are responsible for handling the request after the router determines which route to use. Below, we'll define a method for rendering a JSON response of all authors and the JSON of one author based on an id.

The authors controller looks like this:

const { Author } = require('../models');
const AuthorsController = {
  async index(req, res){
    const authors = await Author
       .find()
       .populate('books');
    res.send(authors);
  },
  async show(req, res){
    const author = await Author
       .findById(req.params.id)
       .populate('books');
    res.send(author);
  }
};
module.exports = AuthorsController;

Here, I’m importing the author model, grabbing all authors, and populating each query result with the related books. To use async-await with Express 4, I pulled in a package called express-async-errors and registered it like so: require('express-async-errors');.

Following that Express 4 requires some server boilerplate setup:

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use(cors());
app.use(methodOverride());
app.use(cookieParser());
app.use(express.static(__dirname + '/public'));
require('./server/routes')(app);

In the **server/routes.js** file, I register API routes for showing all the authors with their books and individual author with their books:

const express = require('express'),
  path = require('path'),
  rootPath = path.normalize(__dirname + '/../'),
  router = express.Router(),
  { AuthorsController, 
    BooksController } = require('./controllers');
module.exports = function(app){
  router.get('/authors', AuthorsController.index);
  router.get('/authors/:id', AuthorsController.show);
  app.use('/api', router);
};

Now we have a working API that returns authors with the books that they’ve written. The only problem is that there are no authors or books stored in MongoDB yet! To fix that, we’ll need to set up code to seed the database with records. If you visit /api/authors now all you'll see is an empty array.

Seed Records Into MongoDB

We need to make sure that the Express 4 server connects properly to MongoDB. For that, we can connect via a URL and listen for connection events like so:

const mongoose = require('mongoose'),
  env = process.env.NODE_ENV = process.env.NODE_ENV || 'development',
  envConfig = require('../server/env')[env];
mongoose.Promise = require('bluebird');
mongoose.connect(envConfig.db, { useMongoClient: true, });
mongoose.connection.on('connected', function () {  
  console.log(`Database connection open to ${mongoose.connection.host} ${mongoose.connection.name}`);
});
mongoose.connection.on('error',function (err) {  
  console.log('Mongoose default connection error: ' + err);
});
mongoose.connection.on('disconnected', function () {  
  console.log('Mongoose default connection disconnected'); 
});

With the environment config file defined like so:

var path = require('path'),
  rootPath = path.normalize(__dirname + '/../../');
  
module.exports = {
  development: {
    rootPath: rootPath,
    db: 'mongodb://localhost/mongodb-relationships',
    port: process.env.PORT || 3000
  },
  production: {
    rootPath: rootPath,
    db: process.env.MONGOLAB_URI || 'you can add a mongolab uri here ($ heroku config | grep MONGOLAB_URI)',
    port: process.env.PORT || 80
  }
};

The seeder itself we’re going to run from the command line. It’s a bit verbose but goes through the process of creating and updating records in MongoDB with Mongoose.js.

require('./index');
const mongoose = require('mongoose');
const { Author, Book } = require('../server/models');

async function seedAuthors() {
  console.log('Seeding authors to ' + mongoose.connection.name + '...');
  const authors = [
    { name: 'JK Rowling', bio: 'J.K. Rowling is the author of the much-loved series of seven Harry Potter novels, originally published between 1997 and 2007.' },
    { name: 'Tony Robbins', bio: 'Tony Robbins is an entrepreneur, best-selling author, philanthropist and the nation\'s #1 Life and Business Strategist.' },
  ];
  for (const author of authors) {
    const newAuthor = new Author(author);
    await newAuthor.save();
  }
  const a = await Author.find();
  console.log('authors: ', a);
}

async function seedBooks() {
  console.log('Seeding books to ' + mongoose.connection.name + '...');
  const jkRowling = await Author.findOne({ name: 'JK Rowling' });
  const tonyRobbins = await Author.findOne({ name: 'Tony Robbins' });

  let harryPotter = new Book({ title: 'Harry Potter', author: jkRowling._id });
  let awakenGiant = new Book({ title: 'Awaken the Giant Within', author: tonyRobbins._id });
  await harryPotter.save();
  await awakenGiant.save();

  jkRowling.books.push(harryPotter);
  tonyRobbins.books.push(awakenGiant);
  await jkRowling.save();
  await tonyRobbins.save();
}

// seedBooks looks up the authors, so run the seeders in order
seedAuthors().then(seedBooks);

This will create a new connection to the MongoDB database and then convert a normal array of JavaScript objects into data we can persistently access. The author will have an array of books with one book for each author in the array. To add more books, we can push to the books array and save the changes. Each book will have one author. MongoDB stores these relationships via the id. Using the populate method in our controller above, we'll be able to view the entire object.
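Conceptually, what populate() does for us can be sketched in plain JavaScript: each stored id is swapped for the full document it points to. A toy illustration of that idea (the data and the populateBooks helper are made up for this sketch; they are not part of Mongoose):

```javascript
// Toy stand-ins for two collections; the ids are illustrative.
const booksById = {
  b1: { _id: "b1", title: "Harry Potter", author: "a1" },
  b2: { _id: "b2", title: "Awaken the Giant Within", author: "a2" },
};

const author = { _id: "a1", name: "JK Rowling", books: ["b1"] };

// Replace each stored book id with the referenced book document,
// which is roughly what .populate('books') does via a second query.
function populateBooks(author) {
  return { ...author, books: author.books.map(id => booksById[id]) };
}

console.log(populateBooks(author).books[0].title); // "Harry Potter"
```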

After running the seeder, you should be able to see your records in MongoDB Compass, as shown below. Compass is a GUI for viewing, creating, deleting, querying and editing MongoDB data.

Test The API

Now, to view this data from MongoDB via the API, start the Node server with npm run start and visit localhost:3000/api/authors in the web browser.

The final data will look something like:

[ 
   { 
      "_id":"5d51ea23acaf6f3380bcab56",
      "updatedAt":"2019-08-12T22:38:46.925Z",
      "createdAt":"2019-08-12T22:37:23.430Z",
      "name":"JK Rowling",
      "bio":"J.K. Rowling is the author of the much-loved series of seven Harry Potter novels, originally published between 1997 and 2007.",
      "__v":1,
      "books":[ 
         { 
            "_id":"5d51ea76f607f9339d5a76f6",
            "updatedAt":"2019-08-12T22:38:46.919Z",
            "createdAt":"2019-08-12T22:38:46.919Z",
            "title":"Harry Potter",
            "author":"5d51ea23acaf6f3380bcab56",
            "__v":0
         }
      ]
   },
   { 
      "_id":"5d51ea23acaf6f3380bcab57",
      "updatedAt":"2019-08-12T22:38:46.937Z",
      "createdAt":"2019-08-12T22:37:23.475Z",
      "name":"Tony Robbins",
      "bio":"Tony Robbins is an entrepreneur, best-selling author, philanthropist and the nation's #1 Life and Business Strategist.",
      "__v":1,
      "books":[ 
         { 
            "_id":"5d51ea76f607f9339d5a76f7",
            "updatedAt":"2019-08-12T22:38:46.921Z",
            "createdAt":"2019-08-12T22:38:46.921Z",
            "title":"Awaken the Giant Within",
            "author":"5d51ea23acaf6f3380bcab57",
            "__v":0
         }
      ]
   }
]

Congratulations, you’ve built an API with Node.js, Express 4 and MongoDB!

Lastly, a word from the Jscrambler team — before shipping your web apps, make sure you are protecting their JavaScript source code against reverse-engineering, abuse, and tampering. 2 minutes is all it takes to begin your free Jscrambler trial and start protecting JavaScript.

How to create a full stack React/Express/MongoDB app using Docker

In this tutorial, I will guide you through the process of containerizing a React FrontEnd, a Node / Express API, and a MongoDB database using Docker containers in a very simple way.

Why should you care about Docker?

Docker is simply one of the most important technologies at the moment. It lets you run apps inside containers that are mostly isolated from “everything”.

Each container is like an individual virtual machine stripped out of everything that is not needed to run your app. This makes containers very light, fast and secure.

Containers are also meant to be disposable. If one goes rogue, you can kill it and make another just like it with no effort thanks to the container images system.

Another thing that makes Docker great is that the app inside containers will run the same in every system (Windows, Mac, or Linux). This is awesome if you are developing in your machine and then you want to deploy it to some cloud provider like GCP or AWS.

Docker containers everywhere!

Containerizing your app with Docker is as simple as creating a Dockerfile for each of your apps to first build an image, and then running each image to get your containers live.

Containerize your Client

To build our Client image, you will need a Dockerfile. Let’s create one:

  1. Open the React / Express App in your favorite code editor (I’m using VS Code).
  2. Navigate to the Client folder.
  3. Create a new file named Dockerfile.
  4. Place this code inside it:
# Use a lighter version of Node as a parent image
FROM mhart/alpine-node:8.11.4

# Set the working directory to /client
WORKDIR /client

# Copy package.json into the container at /client
COPY package*.json /client/

# Install dependencies
RUN npm install

# Copy the current directory contents into the container at /client
COPY . /client/

# Make port 3000 available to the world outside this container
EXPOSE 3000

# Run the app when the container launches
CMD ["npm", "start"]

This will instruct Docker to build an image (using these configurations) for our Client. You can read all about Dockerfiles here.

Containerize your API

To build our API image, you will need another Dockerfile. Let’s create it:

  1. Navigate to the API folder.
  2. Create a new file named Dockerfile.
  3. Place this code inside it:
# Use a lighter version of Node as a parent image
FROM mhart/alpine-node:8.11.4

# Set the working directory to /api
WORKDIR /api

# Copy package.json into the container at /api
COPY package*.json /api/

# Install dependencies
RUN npm install

# Copy the current directory contents into the container at /api
COPY . /api/

# Make port 80 available to the world outside this container
EXPOSE 80

# Run the app when the container launches
CMD ["npm", "start"]

This will instruct Docker to build an image (using these configurations) for our API. You can read all about Dockerfiles here.

Docker Compose

You could run each individual container using the Dockerfiles. In our case, we have 3 containers to manage, so we will use docker-compose instead. Compose is a tool for defining and running multi-container Docker applications.

Let me show you how simple it is to use it:

  1. Open the React/Express App in your code editor.
  2. On your App main folder, create a new file and name it docker-compose.yml.
  3. Write this code in the docker-compose.yml file:
version: "2"

services:
  client:
    image: webapp-client
    restart: always
    ports:
      - "3000:3000"
    volumes:
      - ./client:/client
      - /client/node_modules
    links:
      - api
    networks:
      - webappnetwork

  api:
    image: webapp-api
    restart: always
    ports:
      - "9000:9000"
    volumes:
      - ./api:/api
      - /api/node_modules
    depends_on:
      - mongodb
    networks:
      - webappnetwork

What sorcery is that?

You should read all about docker-compose here.

Basically, I’m telling Docker that I want to build a container called client, using the image webapp-client (which is the image we defined in our Client Dockerfile), that will be listening on port 3000. Then, I’m telling it that I want to build a container called api, using the image webapp-api (which is the image we defined in our API Dockerfile), that will be listening on port 9000.

Keep in mind that there are many ways of writing a docker-compose.yml file. You should explore the documentation and use what best suits your needs.

Add a MongoDB database

Adding a MongoDB database is as simple as adding these lines of code to your docker-compose.yml file:

  mongodb:
    image: mongo
    restart: always
    container_name: mongodb
    volumes:
      - ./data-node:/data/db
    ports:
      - 27017:27017
    command: mongod --noauth --smallfiles
    networks:
      - webappnetwork

This will create a container using the official MongoDB image.

Create a shared network for your containers

To create a shared network for your container just add the following code to your docker-compose.yml file:

networks:
  webappnetwork:
    driver: bridge

Notice that you already defined each container of your app to use this network.

In the end, your docker-compose.yml file should be something like this:

docker-compose.yml

In the docker-compose.yml file, the indentation matters. Be aware of that.

Get your containers running

  1. Now that you have a docker-compose.yml file, let’s build your images. Go to the terminal and, in your App’s main directory, run:
docker-compose build

  2. Now, to make Docker spin up the containers, just run:
docker-compose up

And… just like magic, you now have your Client, your API, and your Database, all running in separated containers with only one command. How cool is that?

Connect your API to MongoDB
  1. First, let’s install Mongoose to help us with the connection to MongoDB. On your terminal type:
npm install mongoose

  2. Now create a file called testDB.js in your API routes folder and insert this code:
const express = require("express");
const router = express.Router();
const mongoose = require("mongoose");

// Variable to be sent to Frontend with Database status
let databaseConnection = "Waiting for Database response...";

router.get("/", function(req, res, next) {
  res.send(databaseConnection);
});

// Connecting to MongoDB
mongoose.connect("mongodb://mongodb:27017/test");

// If there is a connection error send an error message
mongoose.connection.on("error", error => {
  console.log("Database connection error:", error);
  databaseConnection = "Error connecting to Database";
});

// If connected to MongoDB send a success message
mongoose.connection.once("open", () => {
  console.log("Connected to Database!");
  databaseConnection = "Connected to Database";
});

module.exports = router;

Ok, let’s see what this code is doing. First, I import Express, the Express Router, and Mongoose to be used in our /testDB route. Then I create a variable that will be sent as a response, telling what happened with the request. Then I connect to the database using mongoose.connect(). Then I check whether the connection is working and change the variable (created earlier) accordingly. Finally, I use module.exports to export this route so that I’m able to use it in the app.js file.

  3. Now you have to “tell” Express to use the route you’ve just created. In your API folder, open the app.js file and insert these two lines of code:
var testDBRouter = require("./routes/testDB");
app.use("/testDB", testDBRouter);

This will “tell” Express that every time there is a request to the endpoint /testDB, it should use the instructions on the file testDB.js.

  4. Now let’s test if everything is working properly. Go to your terminal and press control + C to bring your containers down. Then run docker-compose up to bring them back up again. After everything is up and running, if you navigate to http://localhost:9000/testDB you should see the message Connected to Database.

In the end, your app.js file should look like this:

api/app.js

Yep… it means the API is now connected to the database. But your FrontEnd doesn’t know yet. Let’s work on that now.

Make a request from React to the Database

To check if the React app can reach the Database let’s make a simple request to the endpoint you defined on the previous step.

  1. Go to your Client folder and open the App.js file.
  2. Now insert this code below the callAPI() method:
callDB() {
  fetch("http://localhost:9000/testDB")
    .then(res => res.text())
    .then(res => this.setState({ dbResponse: res }))
    .catch(err => err);
}

This method will fetch the endpoint you defined earlier on the API and retrieve the response. Then it will store the response in the state of the component.

  3. Add a variable to the state of the component to store the response:
dbResponse: ""

  4. Inside the lifecycle method componentDidMount(), insert this code to execute the method you’ve just created when the component mounts:
this.callDB();

  5. Finally, add another <p> tag after the one you already have to display the response from the Database:
<p className="App-intro">{this.state.dbResponse}</p>

In the end, your App.js file should end up like this:

client/App.js

Finally, let’s see if everything is working

On your browser, go to http://localhost:3000/ and if everything is working properly, you should see these three messages :

  1. Welcome to React
  2. API is working properly
  3. Connected to Database

Something like this:

http://localhost:3000/

Congratulations!!!

You now have a full stack app with a React FrontEnd, a Node/Express API and a MongoDB database. All running inside individual Docker containers that are orchestrated with a simple docker-compose file.

This app can be used as a boilerplate to build your more robust app.

You can find all the code I wrote in the project repository.

Creating Real-time Chat App with Nodejs, Express, Socket and MongoDB

In this tutorial, we’ll be building a real-time chat application with NodeJS, Express, Socket.io, and MongoDB.

Here is a screenshot of what we’ll build:

Setup

I’ll assume that you already have NodeJS and NPM installed. If you don’t, you can install them from the Node.js website.

A basic knowledge of JavaScript is required.

Let’s get started.

Create a directory for the application and open it with your favourite code editor. You can use any editor; I’ll be using VS Code in this tutorial:

mkdir chatApplication && cd chatApplication && code .

Next, let’s initialize the directory as a Nodejs application.

npm init

You’ll be prompted to fill in some information — that’s okay. The information will be used to set up your package.json file.

Dependencies Installation

Let’s install our application’s dependencies.

We’ll be using the express web server to serve our static files and body-parser to extract the entire body portion of an incoming request stream and expose it to an API endpoint. So, let's install them. You'll see how they are used later in this tutorial.

npm install express body-parser --save

We added the --save flag so that they’ll be added as dependencies in our package.json file.

Note:

Please don’t use the express generator, as I won’t cover how to configure socket.io to work with the express generator setup.

Next, install the mongoose node module. It is an ODM (Object Document Mapper) for MongoDB and it’ll make our job a lot easier.

Let’s install it alongside socket.io and bluebird. Socket.IO is a JavaScript library for real-time web applications. Bluebird is a fully-featured Promise library for JavaScript.

npm install mongoose socket.io bluebird --save

That’s it for the Nodejs backend module installation.

Our package.json file should look like this now.
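For reference, a package.json along these lines would match the dependencies installed above (the name, version numbers and start script are illustrative, not the exact file):

```json
{
  "name": "chatapplication",
  "version": "1.0.0",
  "scripts": {
    "start": "node App.js"
  },
  "dependencies": {
    "bluebird": "^3.5.1",
    "body-parser": "^1.18.3",
    "express": "^4.16.3",
    "mongoose": "^5.2.0",
    "socket.io": "^2.1.1"
  }
}
```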

Another way to install the above packages is to copy the package.json file above and paste it into your package.json file and run:

npm install

It’ll install all the required packages.

Let’s set up the client side.

To connect the Socket.IO server to the client, we add the Socket.IO client-side JavaScript library.

<script  src="/js/socket.js"></script>

That will be our HTML file for the frontend. You can grab the entire code for the frontend here to follow along. The best way to learn is to follow along.

You can download the client-side socket.io library here as well.

And here /js/chat.js is where we’ll have our custom client-side javascript code.

Setting up our express server:

Create an App.js file. You can call it server.js if you like; it’s my personal preference to call it App.js.

Inside the App.js file let’s create and configure the express server to work with socket.io.

App.js

This is the basic configuration required to set up socket.io in the backend.
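A minimal sketch of that configuration, keeping the naming used in the snippets below (const socket = io(http)); the port number and public folder are assumptions:

```javascript
// App.js - minimal express + socket.io setup (port and static folder assumed)
const express = require("express");
const app = express();
const http = require("http").createServer(app);
const io = require("socket.io");
const socket = io(http); // attach socket.io to the http server

// serve the frontend files
app.use(express.static(__dirname + "/public"));

socket.on("connection", socket => {
  console.log("user connected");
});

http.listen(5000, () => {
  console.log("Running on port 5000");
});
```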

Socket.IO works by adding event listeners to an instance of http.Server, which is what we are doing here:

const socket = io(http);

Here is where we listen to new connection events:

socket.on("connection", (socket) => {
  console.log("user connected");
});

For example, if a new user visits localhost:5000, the message "user connected" will be printed on the console.

socket.on() takes an event name and a callback as parameters.

And there is also a special disconnect event that gets fired each time a user closes the tab.

socket.on("connection", (socket) => {
  console.log("user connected");
  socket.on("disconnect", () => {
    console.log("Disconnected");
  });
});

Setting up our frontend code

Open up your js/chat.js file and type the following code:

(function() {
  var socket = io();
  $("form").submit(function(e) {
    e.preventDefault(); // prevents page reloading
    socket.emit("chat message", $("#m").val());
    $("#m").val("");
    return true;
  });
})();

This is a self-executing function: it initializes socket.io on the client side and emits the message typed into the input box.

With this line of code, we create a global instance of the socket.io client on the frontend.

var socket = io();

And inside the submit event handler, socket.io gets our chat message from the text box and emits it to the server.

$("form").submit(function(e) {
  e.preventDefault(); // prevents page reloading
  socket.emit("chat message", $("#m").val());
  $("#m").val("");
  return true;
});

If you’ve gotten to this point, congratulations, you deserve some accolades.

Great, we have both our express and socket.io server set up to work well. In fact, we’ve been able to send messages to the server by emitting the message from our input box.

socket.emit("chat message", $("#m").val());

Now from the server side, let’s set up an event to listen to the “chat message” event and broadcast it to clients connected on port 5000.

App.js

socket.on("chat message", function(msg) {
  console.log("message: " + msg);
  // broadcast message to everyone on port:5000 except yourself
  socket.broadcast.emit("received", { message: msg });
});

This is the event handler that listens to the “chat message” event and the message received is in the parameter passed to the callback function.

socket.on("chat message", function(msg){
});

Inside this event, we can choose what we do with the message from the client: insert it into the database, send it back to the client, etc.

In our case, we’ll be saving it into the database and also sending it to the client.

We’ll broadcast it. That means the server will send it to every other person connected to the server apart from the sender.

So, if Mr A sends the message to the server and the server broadcasts it, Mr B, C, D, etc will receive it but Mr A won’t.

We don’t want to receive a message we sent, do we?

That doesn’t mean we can’t receive a message we sent as well: emitting without the broadcast flag sends the message to every connected client, including the sender.

Here is how to broadcast an event:

socket.broadcast.emit("received",{message:msg})

With that out of the way, we can take the message received and append it to our UI.

If you run your application, you should see something similar to this. Please, don’t laugh at my live chat.

Wawu! Congratulations once again. Let’s add some database stuff and display our chats on the frontend.

Database Setup

Install MongoDB

Visit the MongoDB website to download it if you have not done so already.

And make sure your MongoDB server is running. They have excellent documentation that details how to set it up and get it running. You can find the doc here.

Create Chat Schema

Create a file in the models directory called models/ChatSchema.js.
Nothing complex; we are just going to have 3 fields in our schema: a message field, a sender field and a timestamp.

The ChatSchema.js file should look like this:
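Based on the three fields described above, a ChatSchema.js along these lines would fit (the exact field options are an assumption; note the model is later required as ./models/Chat, so keep the file name and require path consistent):

```javascript
// models/ChatSchema.js - sketch of the schema described above
const mongoose = require("mongoose");
const Schema = mongoose.Schema;

const chatSchema = new Schema({
  message: {
    type: String,
    required: true
  },
  sender: {
    type: String,
    required: true
  }
}, {
  // adds createdAt / updatedAt, covering the timestamp field
  timestamps: true
});

module.exports = mongoose.model("Chat", chatSchema);
```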

Connection to the MongoDB database

Create a file and name it dbconnection.js. That's where our database connection will live.

const mongoose = require("mongoose");
mongoose.Promise = require("bluebird");

const url = "mongodb://localhost:27017/chat";
const connect = mongoose.connect(url, { useNewUrlParser: true });

module.exports = connect;

Insert messages into the database

Since we are going to insert the messages on the server side, we’ll insert the messages we receive from the frontend in the App.js file.

So, let’s update the App.js file.

...
// database connection
const Chat = require("./models/Chat");
const connect = require("./dbconnect");

We can now add the event listener that saves each incoming message:

// setup event listener
socket.on("connection", socket => {
  console.log("user connected");

  socket.on("disconnect", function() {
    console.log("user disconnected");
  });

  socket.on("chat message", function(msg) {
    console.log("message: " + msg);
    // broadcast message to everyone on port:5000 except yourself
    socket.broadcast.emit("received", { message: msg });

    // save chat to the database
    connect.then(db => {
      console.log("connected correctly to the server");
      let chatMessage = new Chat({ message: msg, sender: "Anonymous" });
      chatMessage.save();
    });
  });
});

We are creating a new document and saving it into the Chat collection in the database.

let chatMessage = new Chat({ message: msg, sender: "Anonymous" });
chatMessage.save();

Display messages on the frontend

We’ll, first of all, display our message history from the database and append all messages emitted by events.

To achieve this, we need to create an API that sends the data from the database to the client-side when we send a get request.

const  express  = require("express");
const  connectdb  = require("./../dbconnect");
const  Chats  = require("./../models/Chat");

const  router  =  express.Router();

router.route("/").get((req, res, next) =>  {
        res.setHeader("Content-Type", "application/json");
        res.statusCode  =  200;
        connectdb.then(db  =>  {
            Chats.find({}).then(chat  =>  {
            res.json(chat);
        });
    });
});

module.exports  =  router;

In the above code, we query the database and fetch all the messages in the Chat collection.

We’ll import this into the server code in the App.js file, and we’ll also import the body-parser middleware.


const bodyParser = require("body-parser");
const chatRouter = require("./route/chatroute");

// body-parser middleware
app.use(bodyParser.json());

// routes
app.use("/chats", chatRouter);

With this out of the way, we are set to access our API from the frontend and get all the messages in our Chat collection.

So, we got the messages using the fetch API and we appended the messages to the UI.

You’ll also notice that I used formatTimeAgo(data.createdAt); that is a 1.31kb library I created to manage dates for small projects, since moment.js is sometimes rather too big. formatTimeAgo() will display “few seconds ago”, etc. If you are interested, you can find more information here.
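The fetch code itself isn't shown above, so here is a sketch of what loading the history could look like; renderChat is an illustrative helper made up for this sketch, and formatTimeAgo is the small date library mentioned above:

```javascript
// Illustrative helper: build the display text for one chat message.
function renderChat(chat, timeAgo) {
  return chat.message + " by " + chat.sender + ": " + timeAgo;
}

// Sketch: fetch the history from the /chats endpoint defined earlier
// and append each message to the UI (browser-only code).
function loadChatHistory() {
  fetch("/chats")
    .then(res => res.json())
    .then(chats => {
      const messages = document.getElementById("messages");
      chats.forEach(chat => {
        const li = document.createElement("li");
        messages.appendChild(li).append(
          renderChat(chat, formatTimeAgo(chat.createdAt))
        );
      });
    });
}
```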

Everything seems good at this point, right?

However, since you are not receiving the messages sent to the server back to yourself, let’s grab our own message from our input box and display it on the UI.


(function() {
  $("form").submit(function(e) {
    e.preventDefault(); // prevents page reloading

    let li = document.createElement("li");
    socket.emit("chat message", $("#message").val());
    messages.appendChild(li).append($("#message").val());

    let span = document.createElement("span");
    messages.appendChild(span).append("by " + "Anonymous" + ": " + "just now");

    $("#message").val("");
    return false;
  });
})();

And if we receive messages from the event, let’s also output them to the UI.



(function() {
  socket.on("received", data => {
    let li = document.createElement("li");
    let span = document.createElement("span");
    var messages = document.getElementById("messages");

    messages.appendChild(li).append(data.message);
    messages.appendChild(span).append("by " + "anonymous" + ": " + "just now");
  });
})();

Our application is complete now. Go ahead and test it.

Note that if we had our users logged in, we wouldn’t have hardcoded the “anonymous” user as it is in our code right now. We’d get it from the server.

And if you want to tell everyone that someone is typing, you can also add this code to the frontend.


//isTyping event
messageInput.addEventListener(“keypress”, () => {
socket.emit(“typing”, { user: “Someone”, message: “is typing…” });
});

socket.on(“notifyTyping”, data => {
typing.innerText = data.user + “ “ + data.message;
console.log(data.user + data.message);
});

//stop typing
messageInput.addEventListener(“keyup”, () => {
socket.emit(“stopTyping”, “”);
});

socket.on(“notifyStopTyping”, () => {
typing.innerText = “”;});
`

What it does: when a user is typing, it emits an event to the server, and the server broadcasts it to the other clients. You listen to the event and update the UI with the message “Someone is typing…”. You can add the person’s name if you wish.

Here is the server-side event listener and emitter:
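A sketch of those server-side handlers, assuming the event names used in the client code above ("typing", "stopTyping", "notifyTyping", "notifyStopTyping"); registerTypingEvents is an illustrative helper name, not part of Socket.IO:

```javascript
// Illustrative helper: wire up the typing events on a connected socket.
function registerTypingEvents(socket) {
  // relay "typing" to everyone except the sender
  socket.on("typing", data => {
    socket.broadcast.emit("notifyTyping", data);
  });

  // relay "stopTyping" the same way
  socket.on("stopTyping", () => {
    socket.broadcast.emit("notifyStopTyping");
  });
}
```

Inside the "connection" handler you would simply call registerTypingEvents(socket) for each connected socket.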

Congratulations.

You can improve this code, add authentication, add groups or make it a one to one chat, re-model the schema to accommodate all of that, etc.

I’ll be super excited to see the real-time applications you’ll build with socket.IO.

I hope this was helpful. The entire code is on Github. You can get it here.

Thank you for reading!