How to set up a logging Infrastructure in Node.js

Using Elasticsearch, Fluentd, and Kibana (the EFK stack). Logging is one of those Node.js functions that's easy to take for granted, but setting up the right logging infrastructure lets us efficiently troubleshoot infrastructure and application issues.

Setting up the right logging infrastructure helps us find out what happened, debug issues, and monitor the application. At a very basic level, we should expect the following from our infrastructure:

  • Ability to free-text search our logs
  • Ability to search for the logs of a specific API
  • Ability to search by the statusCode of all the APIs
  • The system should scale as we add more data to our logs


Architecture diagram using Elasticsearch, Fluentd and Kibana

Local Setup

We will be using Docker to manage our services.

Let’s get Elasticsearch up and running with the following command:

docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" --name myES elasticsearch:7.4.1

We can check that our container is up and running with the following command:

curl -X GET "localhost:9200/_cat/nodes?v&pretty"


We can get Kibana up and running with another docker run command.

docker run --link myES:elasticsearch -p 5601:5601 kibana:7.4.1

Note that we are linking our Kibana and Elasticsearch containers using the --link flag.

If we go to [http://localhost:5601/app/kibana](http://localhost:5601/app/kibana), we will see our Kibana dashboard.

We can now run queries against our Elasticsearch cluster from Kibana. We can navigate to the Dev Tools console and run the query we ran before (just a little less verbose).

query for elastic cluster nodes using kibana
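In the Dev Tools console, the same node-listing request we issued with curl can be written without the host and query-string boilerplate:

```
GET /_cat/nodes?v
```

Kibana handles the host, port, and content type for us, so only the API path remains.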


Fluentd is where all the data formatting will happen.

Let’s first build our Dockerfile. It does two things:

  • Install the necessary packages
  • Copy our config file into the image


FROM fluent/fluentd:latest
MAINTAINER Abhinav Dhasmana <[email protected]>

USER root

RUN apk add --no-cache --update --virtual .build-deps \
  sudo build-base ruby-dev \
  && sudo gem install fluent-plugin-elasticsearch \
  && sudo gem install fluent-plugin-record-modifier \
  && sudo gem install fluent-plugin-concat \
  && sudo gem install fluent-plugin-multi-format-parser \
  && sudo gem sources --clear-all \
  && apk del .build-deps \
  && rm -rf /home/fluent/.gem/ruby/2.5.0/cache/*.gem

COPY fluent.conf /fluentd/etc/


<source>
  # Receive events over http from port 9880
  @type http
  port 9880
</source>

<source>
  # Receive events from 24224/tcp
  @type forward
  port 24224
</source>

# We need to massage the data before it goes into ES
<filter **>
  # We parse the input with key "log"
  @type parser
  key_name log
  # Keep the original key-value pairs in the result
  reserve_data true
  <parse>
    # Use the multi_format parser plugin to parse the data
    @type multi_format
    <pattern>
      # Use the apache2 parser plugin to parse the data
      format apache2
    </pattern>
    <pattern>
      format json
      time_key timestamp
    </pattern>
    <pattern>
      format none
    </pattern>
  </parse>
</filter>

# Fluentd will decide what to do here if the event is matched
# In our case, we want all the data to be matched, hence **
<match **>
  # We want all the data to be copied to Elasticsearch using the inbuilt
  # copy output plugin
  @type copy
  <store>
    # We want to store our data in Elasticsearch using the
    # out_elasticsearch plugin. See Dockerfile for installation.
    @type elasticsearch
    time_key timestamp_ms
    port 9200
    # Use the conventional index name format (logstash-%Y.%m.%d)
    logstash_format true
    # We will use this when Kibana reads logs from ES
    logstash_prefix fluentd
    logstash_dateformat %Y-%m-%d
    flush_interval 1s
    reload_connections false
    reconnect_on_error true
    reload_on_failure true
  </store>
</match>

Config file for Fluentd

Let’s spin this Docker image up:

docker build -t abhinavdhasmana/fluentd .
docker run -p 9880:9880  --network host  abhinavdhasmana/fluentd
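With the container running, we can sanity-check the http source by posting a test event to port 9880. The tag `myapp.access` in the URL path is arbitrary, chosen here only for illustration:

```shell
# Post a JSON event to Fluentd's http input; the URL path becomes the
# Fluentd tag, so this event arrives tagged "myapp.access".
curl -X POST -d 'json={"log":"hello from curl"}' http://localhost:9880/myapp.access
```

If everything is wired up, the event should show up in Elasticsearch under the fluentd-* index shortly afterwards.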

Node.js App

I have created a small Node.js app for demo purposes, which you can find here. It’s a small Express app created using express-generator. It uses morgan to generate logs in the Apache format. You can use your own app in your preferred language; as long as the output format remains the same, our infrastructure does not care. Let’s build our Docker image and run it.

docker build -t abhinavdhasmana/logging .
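For reference, the Apache-style line that morgan emits per request is just formatted text, which is why any app producing the same shape works. A stdlib-only sketch of that format (the helper names here are hypothetical; the demo app simply uses morgan with Express):

```javascript
// Format a date in Apache common-log style, e.g. 28/Oct/2019:12:00:00 +0000
function clfDate(d) {
  const months = ['Jan','Feb','Mar','Apr','May','Jun',
                  'Jul','Aug','Sep','Oct','Nov','Dec'];
  const pad = (n) => String(n).padStart(2, '0');
  return `${pad(d.getUTCDate())}/${months[d.getUTCMonth()]}/${d.getUTCFullYear()}` +
         `:${pad(d.getUTCHours())}:${pad(d.getUTCMinutes())}:${pad(d.getUTCSeconds())} +0000`;
}

// Build one Apache-style access-log line from a request record.
function apacheLogLine({ ip, user, date, method, path, httpVersion,
                         status, length, referrer, agent }) {
  return [
    ip,
    '-',
    user || '-',
    `[${clfDate(date)}]`,
    `"${method} ${path} HTTP/${httpVersion}"`,
    status,
    length,
    `"${referrer || '-'}"`,
    `"${agent || '-'}"`,
  ].join(' ');
}

const line = apacheLogLine({
  ip: '127.0.0.1',
  date: new Date(Date.UTC(2019, 9, 28, 12, 0, 0)),
  method: 'GET',
  path: '/',
  httpVersion: '1.1',
  status: 200,
  length: 1043,
  agent: 'curl/7.64.1',
});
console.log(line);
// 127.0.0.1 - - [28/Oct/2019:12:00:00 +0000] "GET / HTTP/1.1" 200 1043 "-" "curl/7.64.1"
```

Fluentd's apache2 parser splits a line like this into fields such as `host`, `path`, `code`, and `agent`, which is what we will filter on in Kibana later.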

Of course, we can bring all the containers up with a single docker-compose file, given below.


version: "3"
services:
  fluentd:
    build: "./fluentd"
    ports:
      - "9880:9880"
      - "24224:24224"
    network_mode: "host"
  web:
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - fluentd
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
  elasticsearch:
    image: elasticsearch:7.4.1
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      - discovery.type=single-node
  kibana:
    image: kibana:7.4.1
    links:
      - "elasticsearch"
    ports:
      - "5601:5601"

docker compose file for the EFK setup

That’s it. Our infrastructure is ready. Now we can generate some logs by going to http://localhost:3000

We now go to kibana dashboard again and define the index to use

setting up index for use in kibana

Note that in our fluent.conf we set logstash_prefix fluentd, hence we use the same string here. Next come some basic Kibana settings.

kibana configure settings

Elasticsearch uses dynamic mapping to guess the types of the fields it indexes. The snapshot below shows these mappings.

Mapping example of Elasticsearch

Let’s check how we are doing against the requirements we mentioned at the start:

  • Ability to free-text search our logs: With the help of ES and Kibana, we can search on any field and get results.
  • Ability to search for the logs of a specific API: In the “Available fields” section on the left in Kibana, we can see a field `path`. We can apply a filter on it to look for the APIs we are interested in.
  • Ability to search by the statusCode of all the APIs: Same as above. Use the `code` field and apply a filter.
  • The system should scale as we add more data to our logs: We started Elasticsearch in single-node mode with the env variable discovery.type=single-node. We can start it in cluster mode, add more nodes, or use a hosted solution from any cloud provider of our choice. I have tried AWS and it’s easy to set up. AWS also gives a managed Kibana instance for Elasticsearch at no extra cost.
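Beyond the filter UI, the same status-code lookup can be run straight from the Dev Tools console. A sketch of such a query, using the `code` field produced by the apache2 parser and the fluentd-* index prefix from our config:

```
GET fluentd-*/_search
{
  "query": {
    "term": { "code": 404 }
  }
}
```

Swap 404 for any status code of interest; the `term` query matches the field exactly.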

