Understanding Streams in Node.js

In this post, we take a look at how to use the Node.js runtime environment to help us work with streams of data coming to our application.

Node.js is known for its asynchronous nature and has many core modules that we use every day in our code but never get a chance to dive into more deeply. One of these core modules is streams.

Streams allow us to handle data flow asynchronously. There are two data-handling approaches in Node.js:

  1. Buffered approach: the receiver can read the data only once the whole data set has been written to the buffer.
  2. Streams approach: data arrives chunk by chunk and can be read chunk by chunk, where a chunk is a single part of the whole data.

Types of streams available

Node.js provides four fundamental stream types: Readable, Writable, Duplex (both readable and writable), and Transform (a Duplex stream that modifies data as it passes through).

  1. Let’s experiment by creating a big file:
const fs = require("fs");
const file = fs.createWriteStream("./big.file");

for (let i = 0; i <= 1e6; i++) {
  file.write(
    "Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.\n"
  );
}

file.end();

We have created a file using a Writable stream. The fs module in Node.js can read from and write to files through the stream interface. Running the above code generates a file of roughly 400 MB.

  2. Now let’s serve the same big file over HTTP, first using the buffered approach (fs.readFile):
const fs = require("fs");
const server = require("http").createServer();

server.on("request", (req, res) => {
  fs.readFile("./big.file", (err, data) => {
    if (err) throw err;

    res.end(data);
  });
});

server.listen(8000);

With this approach, fs.readFile buffers the entire file in memory before writing a single byte to the response, so every request holds roughly 400 MB in memory at once.

Optimized Solution for Data Transformation

Time Efficiency

For better efficiency, we can use a great behavior that comes with streams in Node: piping. Basically, you can pipe two streams so that the output of one stream becomes the input of the other.

What happens is that a data chunk arrives at stream 1, which is piped to stream 2, which can in turn be piped to further streams.

With Pipes:

const fs = require("fs");
const server = require("http").createServer();

server.on("request", (req, res) => {
  const src = fs.createReadStream("./big.file");
  src.pipe(res);
});

server.listen(8000);

This is how the multiple stages a data chunk goes through can be overlapped; this strategy is called pipelining. Node.js allows us to pipeline our tasks with the help of streams.

Node.js runs on a single thread, but this doesn’t mean we can’t do two tasks or processes at a time. This can be done via child processes in Node.js.
