Getting started with modules in Node.js: module.exports and exports

In this Node.js tutorial, you'll learn how to work with modules in Node.js. We'll explain the difference between module.exports and exports, and focus on how to export and consume modules.

In programming, modules are self-contained units of functionality that can be shared and reused across projects. They make our lives as developers easier, as we can use them to augment our applications with functionality that we haven’t had to write ourselves. They also allow us to organize and decouple our code, leading to applications that are easier to understand, debug and maintain.

In this article, I’ll examine how to work with modules in Node.js, focusing on how to export and consume them.

Different Module Formats

As JavaScript originally had no concept of modules, a variety of competing formats have emerged over time. Here’s a list of the main ones to be aware of:

  • The Asynchronous Module Definition (AMD) format is used in browsers and uses a define function to define modules.
  • The CommonJS (CJS) format is used in Node.js and uses require and module.exports to define dependencies and modules. The npm ecosystem is built upon this format.
  • The ES Module (ESM) format. As of ES6 (ES2015), JavaScript supports a native module format. It uses an export keyword to export a module’s public API and an import keyword to import it (a brief syntax comparison with CommonJS follows this list).
  • The System.register format was designed to support ES6 modules within ES5.
  • The Universal Module Definition (UMD) format can be used both in the browser and in Node.js. It’s useful when a module needs to be imported by a number of different module loaders.
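
To make the two formats you’re most likely to meet concrete, here’s a quick sketch of the same tiny module written in CommonJS and as an ES module (the file names are just for illustration):

// math.js (CommonJS)
const add = (a, b) => a + b;
module.exports = { add };

// main.js (CommonJS)
const { add } = require('./math');
console.log(add(1, 2)); // 3

// math.mjs (ES module)
export const add = (a, b) => a + b;

// main.mjs (ES module)
import { add } from './math.mjs';
console.log(add(1, 2)); // 3

The rest of this article focuses on the CommonJS format, as that’s what Node.js and the npm ecosystem have traditionally been built on.
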
Requiring a Module

Node.js comes with a set of built-in modules that we can use in our code without having to install them. To do this, we need to require the module using the require function and assign the result to a variable. This can then be used to invoke any methods the module exposes.

For example, to list out the contents of a directory, you can use the file system module and its readdir method:

const fs = require('fs');
const folderPath = '/home/jim/Desktop/';

fs.readdir(folderPath, (err, files) => {
  if (err) throw err;
  files.forEach(file => {
    console.log(file);
  });
});

Note that in CommonJS, modules are loaded synchronously and processed in the order they occur.

Creating and Exporting a Module

Now let’s look at how to create our own module and export it for use elsewhere in our program. Start off by creating a user.js file and adding the following:

const getName = () => {
  return 'Jim';
};

exports.getName = getName;

Now create an index.js file in the same folder and add this:

const user = require('./user');
console.log(`User: ${user.getName()}`);

Run the program using node index.js and you should see the following output to the terminal:

User: Jim

So what has gone on here? Well, if you look at the user.js file, you’ll notice that we’re defining a getName function, then using the exports object to make it available for import elsewhere. Then in the index.js file, we’re importing this function and executing it. Notice that in the require statement, the module name is prefixed with ./, as it’s a local file. Also note that there’s no need to add the file extension.

Exporting Multiple Methods and Values

We can export multiple methods and values in the same way:

const getName = () => {
  return 'Jim';
};

const getLocation = () => {
  return 'Munich';
};

const dateOfBirth = '12.01.1982';

exports.getName = getName;
exports.getLocation = getLocation;
exports.dob = dateOfBirth;

And in index.js:

const user = require('./user');

console.log(
  `${user.getName()} lives in ${user.getLocation()} and was born on ${user.dob}.`
);

The code above produces this:

Jim lives in Munich and was born on 12.01.1982.

Notice how the name we give the exported dateOfBirth variable can be anything we fancy (dob in this case). It doesn’t have to be the same as the original variable name.

Variations in Syntax

I should also mention that it’s possible to export methods and values as you go, not just at the end of the file.

For example:

exports.getName = () => {
  return 'Jim';
};

exports.getLocation = () => {
  return 'Munich';
};

exports.dob = '12.01.1982';

And thanks to destructuring assignment, we can cherry-pick what we want to import:

const { getName, dob } = require('./user');

console.log(`${getName()} was born on ${dob}.`);

As you might expect, this logs:

Jim was born on 12.01.1982.

Exporting a Default Value

In the above example, we’re exporting functions and values individually. This is handy for helper functions that could be needed all over an app, but when you have a module that exports just the one thing, it’s more common to use module.exports:

class User {
  constructor(name, age, email) { = name;
    this.age = age; = email;
  }

  getUserStats() {
    return `
      Name: ${}
      Age: ${this.age}
      Email: ${}
    `;
  }
}

module.exports = User;

And in index.js:

const User = require('./user');
const jim = new User('Jim', 37, '[email protected]');

console.log(jim.getUserStats());

The code above logs this:

Name: Jim
Age: 37
Email: [email protected]

What’s the Difference Between module.exports and exports?

In your travels across the Web, you might come across the following syntax:

module.exports = {
  getName: () => {
    return 'Jim';
  },

  getLocation: () => {
    return 'Munich';
  },

  dob: '12.01.1982',
};

Here we’re assigning the functions and values we want to export to an exports property on module — and of course, this works just fine:

const { getName, dob } = require('./user');

console.log(`${getName()} was born on ${dob}.`);

This logs the following:

Jim was born on 12.01.1982.

So what is the difference between module.exports and exports? Is one just a handy alias for the other?

Well, kinda, but not quite …

To illustrate what I mean, let’s change the code in index.js to log the value of module:

console.log(module);

This produces:

Module {
  id: '.',
  exports: {},
  parent: null,
  filename: '/home/jim/Desktop/index.js',
  loaded: false,
  children: [],
  paths:
   [ '/home/jim/Desktop/node_modules',
     '/node_modules' ] }

As you can see, module has an exports property. Let’s add something to it:

// index.js = 'foo';
console.log(module);

This outputs:

Module {
  id: '.',
  exports: { foo: 'foo' },
  ...

Assigning properties to exports also adds them to module.exports. This is because (initially, at least) exports is a reference to module.exports.
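
To picture this, here’s a simplified sketch (not Node’s actual implementation) of what happens: Node wraps each file in a function and passes in a fresh module object, with exports starting out as just another name for module.exports. The fake* names below are only there so the snippet can be run as a standalone script:

// Simplified sketch of what Node does for every CommonJS file
const fakeModule = { exports: {} };
let fakeExports = fakeModule.exports; // initially, exports === module.exports = 'foo';               // shows up on fakeModule.exports too
console.log(fakeModule.exports);     // { foo: 'foo' }

fakeModule.exports = { bar: 'bar' }; // reassigning breaks the link
console.log(;          // 'foo' – the old object, now detached

Whatever module.exports points to when your file finishes running is what require() hands back to the caller.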

So Which One Should I Use?

As module.exports and exports both point to the same object, it doesn’t normally matter which you use. For example: = 'foo'; = 'bar';

This code would result in the module’s exported object being { foo: 'foo', bar: 'bar' }.

However, there is a caveat. Whatever you assign module.exports to is what’s exported from your module.

So, take the following: = 'foo';
module.exports = () => { console.log('bar'); };

This would result in only the arrow function being exported; the foo property would be lost, because reassigning module.exports breaks the link between exports and module.exports.
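
If you do want to replace the exported value and still attach extra properties afterwards, a couple of common patterns are sketched below (these are conventions, not something the examples above depend on):

// Option 1: stick to module.exports throughout
module.exports = () => { console.log('bar'); };
module.exports.foo = 'foo'; // foo is attached to the exported function

// Option 2: re-point both names at the new value, then keep using exports
module.exports = exports = () => { console.log('bar'); }; = 'foo'; // works, because exports now points at the new value too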


Modules have become an integral part of the JavaScript ecosystem, allowing us to compose large programs out of smaller parts. I hope this article has given you a good introduction to working with them in Node.js, as well as helping to demystify their syntax.

What's Node.js Streams and How to Work with them?

In this Node.js tutorial, we'll learn what Node.js streams are and how to work with them.

Streams in Node.js have a reputation for being hard to work with, and even harder to understand.

In the words of Dominic Tarr: “Streams are Node’s best and most misunderstood idea.” Even Dan Abramov, creator of Redux and core team member of React.js, is afraid of Node streams.

This article will help you understand streams and how to work with them. So, don’t be afraid. We can figure this out!

What are streams?

Streams are one of the fundamental concepts that power Node.js applications. They are a data-handling method, used to read input or write output sequentially.

Streams are a way to handle reading/writing files, network communications, or any kind of end-to-end information exchange in an efficient way.

What makes streams unique is that, instead of a program reading a file into memory all at once in the traditional way, streams read chunks of data piece by piece, processing their content without keeping it all in memory.

This makes streams really powerful when working with large amounts of data. For example, a file can be larger than your free memory space, making it impossible to read the whole file into memory in order to process it. That’s where streams come to the rescue!

Using streams to process smaller chunks of data makes it possible to read larger files.
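
As a rough sketch of that idea, the snippet below reads a potentially huge file in chunks and only ever keeps one chunk in memory at a time (big-file.txt is just a placeholder path):

const fs = require('fs');

let bytes = 0;

// Only one chunk at a time is held in memory while the file is read
fs.createReadStream('big-file.txt')
  .on('data', (chunk) => { bytes += chunk.length; })
  .on('end', () => console.log(`Read ${bytes} bytes without loading the whole file at once`))
  .on('error', (err) => console.error(err));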

Let’s take “streaming” services such as YouTube or Netflix as an example: these services don’t make you download the video and audio feed all at once. Instead, your browser receives the video as a continuous flow of chunks, allowing you to start watching and/or listening almost immediately.

However, streams are not only about working with media or big data. They also give us the power of ‘composability’ in our code. Designing with composability in mind means several components can be combined in a certain way to produce the same type of result. In Node.js it’s possible to compose powerful pieces of code by piping data to and from other smaller pieces of code, using streams.

Why streams

Streams basically provide two major advantages compared to other data handling methods:

  1. Memory efficiency: you don’t need to load large amounts of data in memory before you are able to process it
  2. Time efficiency: it takes significantly less time to start processing data as soon as you have it, rather than waiting until the entire payload has been transmitted

There are 4 types of streams in Node.js:
  1. Writable: streams to which we can write data. For example, fs.createWriteStream() lets us write data to a file using streams.
  2. Readable: streams from which data can be read. For example: fs.createReadStream() lets us read the contents of a file.
  3. Duplex: streams that are both Readable and Writable. For example, net.Socket
  4. Transform: streams that can modify or transform the data as it is written and read. For example, in the instance of file-compression, you can write compressed data and read decompressed data to and from a file.

If you have already worked with Node.js, you may have come across streams. For example, in a Node.js based HTTP server, request is a readable stream and response is a writable stream. You might have used the fs module, which lets you work with both readable and writable file streams. Whenever you use Express, you are using streams to interact with the client, and streams are used in every database connection driver you can work with, because TCP sockets, the TLS stack and other connections are all based on Node.js streams.
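
To see this for yourself, here's a minimal sketch of an HTTP server in which the request is consumed as a readable stream and the response is written to as a writable stream, simply piping the request body straight back to the client:

const http = require('http');

// req is a readable stream, res is a writable stream
const server = http.createServer((req, res) => {
  req.pipe(res); // echo whatever the client sends back to it
});

server.listen(3000, () => console.log('Echo server listening on port 3000'));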

A practical example

How to create a readable stream

We first require the Readable stream, and we initialize it.

const Stream = require('stream')
const readableStream = new Stream.Readable({ read() {} }) // no-op read() so we can push data manually

Now that the stream is initialized, we can send data to it:

readableStream.push('ping!')
readableStream.push('pong!')

async iterator

It’s highly recommended to use async iterators when working with streams. According to Dr. Axel Rauschmayer, asynchronous iteration is a protocol for retrieving the contents of a data container asynchronously (meaning the current “task” may be paused before retrieving an item). Also, it’s important to mention that the stream async iterator implementation uses the ‘readable’ event internally.

You can use async iterator when reading from readable streams:

import * as fs from 'fs';

async function logChunks(readable) {
  for await (const chunk of readable) {
    console.log(chunk);
  }
}

const readable = fs.createReadStream(
  'tmp/test.txt', {encoding: 'utf8'});
logChunks(readable);

// Output:
// 'This is a test!\n'

It’s also possible to collect the contents of a readable stream in a string:

import {Readable} from 'stream';
import {strict as assert} from 'assert';

async function readableToString2(readable) {
  let result = '';
  for await (const chunk of readable) {
    result += chunk;
  }
  return result;
}

const readable = Readable.from('Good morning!', {encoding: 'utf8'});
assert.equal(await readableToString2(readable), 'Good morning!');

Note that, in this case, we had to use an async function because we wanted to return a Promise.

It’s important to keep in mind not to mix async functions with EventEmitter, because there is currently no way to catch a rejection when it is emitted within an event handler, which causes hard-to-track bugs and memory leaks. The best current practice is to always wrap the content of an async function in a try/catch block and handle errors, but this is error prone. This pull request aims to solve this issue once it lands on Node core.

Readable.from(): Creating readable streams from iterables

stream.Readable.from(iterable, [options]) is a utility method for creating readable streams out of iterators, holding the data contained in iterable. The iterable can be a synchronous or an asynchronous iterable. The parameter options is optional and can, among other things, be used to specify a text encoding.

const { Readable } = require('stream');

async function * generate() {
  yield 'hello';
  yield 'streams';
}

const readable = Readable.from(generate());

readable.on('data', (chunk) => {
  console.log(chunk);
});

Two Reading Modes

According to Streams API, readable streams effectively operate in one of two modes: flowing and paused. A Readable stream can be in object mode or not, regardless of whether it is in flowing mode or paused mode.

  • In flowing mode, data is read from the underlying system automatically and provided to an application as quickly as possible using events via the EventEmitter interface.

  • In paused mode, the method must be called explicitly to read chunks of data from the stream.

In flowing mode, to read data from a stream, it’s possible to listen to the data event and attach a callback. When a chunk of data is available, the readable stream emits a data event and your callback executes. Take a look at the following snippet:

var fs = require("fs");
var data = '';

var readerStream = fs.createReadStream('file.txt'); //Create a readable stream

readerStream.setEncoding('UTF8'); // Set the encoding to be utf8

// Handle stream events --> data, end, and error
readerStream.on('data', function(chunk) {
   data += chunk;
});

readerStream.on('end', function() {
   console.log(data);
});

readerStream.on('error', function(err) {
   console.log(err.stack);
});

console.log("Program Ended");

The function call fs.createReadStream() gives you a readable stream. Initially, the stream is in a paused state. As soon as you listen to the data event and attach a callback, it starts flowing. After that, chunks of data are read and passed to your callback. The stream implementor decides how often a data event is emitted. For example, an HTTP request may emit a data event once every few KBs of data are read. When reading data from a file, you might decide to emit a data event once a line is read.

When there is no more data to read (end is reached), the stream emits an end event. In the above snippet, we listen to this event to get notified when the end is reached.

Also, if there is an error, the stream will emit an error event to notify you.

In paused mode, you just need to call read() on the stream instance repeatedly until every chunk of data has been read, like in the following example:

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';
var chunk;

readableStream.on('readable', function() {
    while ((chunk = != null) {
        data += chunk;
    }
});

readableStream.on('end', function() {
    console.log(data);
});

The read() function reads some data from the internal buffer and returns it. When there is nothing to read, it returns null. So, in the while loop, we check for null and terminate the loop. Note that the readable event is emitted when a chunk of data can be read from the stream.

All Readable streams begin in paused mode but can be switched to flowing mode in one of the following ways:

  • Adding a 'data' event handler.
  • Calling the stream.resume() method.
  • Calling the stream.pipe() method to send the data to a Writable.

The Readable can switch back to paused mode using one of the following:

  • If there are no pipe destinations, by calling the stream.pause() method.
  • If there are pipe destinations, by removing all pipe destinations. Multiple pipe destinations may be removed by calling the stream.unpipe() method.

The important concept to remember is that a Readable will not generate data until a mechanism for either consuming or ignoring that data is provided. If the consuming mechanism is disabled or taken away, the Readable will attempt to stop generating the data.

Adding a 'readable' event handler automatically makes the stream stop flowing, and the data has to be consumed via If the 'readable' event handler is removed, the stream will start flowing again if there is a 'data' event handler.
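
As a small sketch of switching between the two modes (file.txt is a placeholder):

const fs = require('fs');

const stream = fs.createReadStream('file.txt', { encoding: 'utf8' });

// Attaching a 'data' handler switches the stream into flowing mode
stream.on('data', (chunk) => {
  console.log(`Got ${chunk.length} characters`);
  stream.pause(); // back to paused mode – no 'data' events for now

  setTimeout(() => stream.resume(), 1000); // flowing again after one second
});

stream.on('end', () => console.log('Done'));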

How to create a writable stream

To write data to a writable stream you need to call write() on the stream instance. Like in the following example:

var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
    writableStream.write(chunk);
});

The above code is straightforward. It simply reads chunks of data from an input stream and writes them to the destination using write(). This function returns a boolean value indicating whether you should keep writing. If true, the internal buffer still has room and you can keep writing more data. If false is returned, the internal buffer is full (this is backpressure) and you should stop writing for the moment. The writable stream will let you know when you can start writing more data again by emitting a drain event.
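
To make that concrete, here's a sketch of writing a large number of chunks while respecting the return value of write() and the drain event (the file name and the amount of data are arbitrary):

const fs = require('fs');

const file = fs.createWriteStream('output.txt');
let remaining = 1000000;

function writeChunks() {
  let ok = true;
  while (remaining > 0 && ok) {
    remaining--;
    ok = file.write(`line ${remaining}\n`); // false means: stop and wait for 'drain'
  }
  if (remaining > 0) {
    file.once('drain', writeChunks); // resume once the internal buffer has emptied
  } else {
    file.end();
  }
}

writeChunks();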

Calling the writable.end() method signals that no more data will be written to the Writable. If provided, the optional callback function is attached as a listener for the 'finish' event.

// Write 'hello, ' and then end with 'world!'.
const fs = require('fs');
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
// Writing more now is not allowed!

Using a writable stream, you can consume data from a readable stream:

const Stream = require('stream')

const readableStream = new Stream.Readable({ read() {} }) // no-op read() so we can push data manually
const writableStream = new Stream.Writable()

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

readableStream.pipe(writableStream)

readableStream.push('ping!')
readableStream.push('pong!')

You can also use async iterators to write to a writable stream, which is recommended:

import * as util from 'util';
import * as stream from 'stream';
import * as fs from 'fs';
import {once} from 'events';
import {strict as assert} from 'assert';

const finished = util.promisify(stream.finished); // (A)

async function writeIterableToFile(iterable, filePath) {
  const writable = fs.createWriteStream(filePath, {encoding: 'utf8'});
  for await (const chunk of iterable) {
    if (!writable.write(chunk)) { // (B)
      // Handle backpressure
      await once(writable, 'drain');
    }
  }
  writable.end(); // (C)
  // Wait until done. Throws if there are errors.
  await finished(writable);
}

await writeIterableToFile(
  ['One', ' line of text.\n'], 'tmp/log.txt');
assert.equal(
  fs.readFileSync('tmp/log.txt', {encoding: 'utf8'}),
  'One line of text.\n');

The default version of stream.finished() is callback-based but can be turned into a Promise-based version via util.promisify() (line A).

This example uses the following two patterns:

Writing to a writable stream while handling backpressure (line B):

if (!writable.write(chunk)) {
  await once(writable, 'drain');
}

Closing a writable stream and waiting until writing is done (line C):

await finished(writable);

Piping

Piping is a mechanism where we provide the output of one stream as the input to another stream. It is normally used to get data from one stream and pass the output of that stream to another stream. There is no limit on piping operations. In other words, piping is used to process streamed data in multiple steps.
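
Here's what a plain pipe() chain looks like, as a sketch: a readable stream is piped through a gzip transform stream into a writable stream (input.txt is a placeholder):

const fs = require('fs');
const zlib = require('zlib');

fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())                      // transform: compress each chunk
  .pipe(fs.createWriteStream('input.txt.gz'));  // write the compressed data out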

stream.pipeline() was introduced in Node 10.x. This is a module method to pipe between streams, forwarding errors, properly cleaning up, and providing a callback when the pipeline is complete.

Here is an example of using pipeline:

const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');

// Use the pipeline API to easily pipe a series of streams
// together and get notified when the pipeline is fully done.
// A pipeline to gzip a potentially huge video file efficiently:

pipeline(
  fs.createReadStream('video.mkv'),
  zlib.createGzip(),
  fs.createWriteStream('video.mkv.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed', err);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);

In general, pipeline should be used instead of pipe, as pipe is unsafe: it doesn't forward errors or clean up the streams properly when one of them fails.

The Stream Module

The Node.js stream module provides the foundation upon which all streaming APIs are built.

The stream module is a native module shipped by default in Node.js. Streams are instances of the EventEmitter class, which handles events asynchronously in Node. Because of this, streams are inherently event-based.

To access the stream module:

const stream = require('stream');

The stream module is useful for creating new types of stream instances. It is usually not necessary to use the stream module to consume streams.
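
For example, here's a sketch of a tiny custom Transform stream built with the stream module; it upper-cases whatever passes through it:

const { Transform } = require('stream');

// A Transform stream is writable on one side and readable on the other
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

process.stdin.pipe(upperCase).pipe(process.stdout);

Run it, type something, and it will be echoed back in upper case.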

Streams-powered Node APIs

Due to their advantages, many Node.js core modules provide native stream handling capabilities, most notably:

  • net.Socket is the main node API that streams are based on; it underlies most of the following APIs
  • process.stdin returns a stream connected to stdin
  • process.stdout returns a stream connected to stdout
  • process.stderr returns a stream connected to stderr
  • fs.createReadStream() creates a readable stream to a file
  • fs.createWriteStream() creates a writable stream to a file
  • net.connect() initiates a stream-based connection
  • http.request() returns an instance of the http.ClientRequest class, which is a writable stream
  • zlib.createGzip() compresses data using gzip (a compression algorithm) into a stream
  • zlib.createGunzip() decompresses a gzip stream
  • zlib.createDeflate() compresses data using deflate (a compression algorithm) into a stream
  • zlib.createInflate() decompresses a deflate stream

Streams Cheat Sheet:

Here are some important events related to writable streams (a short sketch using them follows this list):

  • error – Emitted to indicate that an error has occurred while writing/piping.
  • pipe – When a readable stream is piped into a writable stream, this event is emitted by the writable stream.
  • unpipe – Emitted when you call unpipe on the readable stream and stop it from piping into the destination stream.
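
A short sketch wiring up some of these events (again with placeholder file names):

const fs = require('fs');

const source = fs.createReadStream('input.txt');
const destination = fs.createWriteStream('output.txt');

destination.on('pipe', () => console.log('A readable stream started piping into me'));
destination.on('unpipe', () => console.log('The readable stream stopped piping'));
destination.on('error', (err) => console.error('Write failed:', err));
destination.on('finish', () => console.log('All data has been written'));

source.pipe(destination);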

This was all about the basics of streams. Streams, pipes, and chaining are among the core and most powerful features in Node.js. Streams can indeed help you write neat and performant code to perform I/O.

Also, there is a Node.js strategic initiative worth looking into, called BOB, which aims to improve Node.js streaming data interfaces, both within Node.js core internally and, hopefully, also as future public APIs.