Using a Java-Based Kafka Client in a Node.js Application

A step-by-step guide for developing a Java-based Kafka client in a Node.js application using GraalVM.

The first time I heard about GraalVM, it totally blew my mind. Being able to combine multiple languages in a single application or business logic is an incredibly useful and powerful tool.

A real-life need for a polyglot application emerged once we decided to switch from RabbitMQ to Kafka as our messaging system. Most of our RMQ consumers were written in Node.js, and moving to a different messaging system would force us either to use a Node.js-based library or to rewrite our entire business logic.

While there are several Node.js-based Kafka clients, using them imposes limitations, such as the supported Kafka API version or the exposed interfaces and customization options. Using a native Kafka client while maintaining the Node.js business logic would be a real win for us.

This tutorial builds on this awesome Medium post on developing with Java and JavaScript together using GraalVM.

We will be using Docker Compose to build and create our images.

A working example can be found here.

Setting up Docker

The minimal needs of our environment are having GraalVM, Zookeeper and Kafka installed. The quickest way to achieve this is by using Docker and Docker Compose to create a complete running environment:

version: '3.3'
services:
  zookeeper:
    image: 'confluentinc/cp-zookeeper:5.0.0'
    hostname: zookeeper
    ports:
      - '2181:2181'
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    volumes:
      - zk-data:/var/lib/zookeeper/data
      - zk-log:/var/lib/zookeeper/log
  kafka-broker:
    image: 'confluentinc/cp-kafka:5.0.0'
    ports:
      - '9092:9092'
      - '9093:9093'
    depends_on:
      - 'zookeeper'
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092,PLAINTEXT2://kafka-broker:9093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT2:PLAINTEXT
      KAFKA_TOPICS: "test_topic"
  graalvm:
    image: 'oracle/graalvm-ce:1.0.0-rc12'
    depends_on:
      - 'kafka-broker'
    volumes:
      - ./:/code
    environment:
      VM: 'graalvm'
      

volumes:
  zk-data:
  zk-log:

A Docker Compose file containing definitions for zookeeper, Kafka and GraalVM.

Running docker-compose up -d from the containing folder will perform the following:

  1. Download a Zookeeper image and run it on port 2181 along with persistent data and log volumes.
  2. Download a Kafka image containing a Kafka broker, and run it. The broker will connect to a Zookeeper on port 2181, and will allow client connections on ports 9092 and 9093.
  3. Download a GraalVM image. This image has GraalVM and Node.js installed, and shares a volume with the host machine, mounted at /code inside the container.

All defined ports are exposed on the local machine (localhost:port). Within the Compose network, services recognize each other by service name: the broker reaches Zookeeper using zookeeper:2181 as the host name, and other containers reach the Kafka broker at kafka-broker:9093.
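This host/container split matters when configuring clients. A minimal sketch of how our Node code will later pick the right bootstrap address (the VM environment variable comes from the Compose file above; the helper is illustrative):

```javascript
// Sketch: choose the Kafka bootstrap address based on where the code runs.
// Inside the Compose network we use the service name and the internal listener
// (kafka-broker:9093); from the host we use the published localhost port.
function bootstrapServers(env) {
  return env.VM === 'graalvm' ? 'kafka-broker:9093' : 'localhost:9092';
}

console.log(bootstrapServers({ VM: 'graalvm' })); // kafka-broker:9093
console.log(bootstrapServers({}));                // localhost:9092
```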

Setting up the Java client

We are going to be using Java 1.8 and Maven to compile and run our Java client.

Even though the entire Kafka client will reside in a container, it will be helpful to run and debug our code directly from the host machine, using our favorite IDE. To do that, Maven and Java need to be installed on the host machine. Connection to other containers will be done using localhost as the host name.

Setting up Maven

You can use this tutorial to start a new Maven based Java project, or just use the following pom file:

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>your.group.id</groupId>
  <artifactId>kafka-client</artifactId>
  <version>1.0</version>

  <name>kafka-client</name>
  <!-- FIXME change it to the project's website -->
  <url>http://www.example.com</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-clients</artifactId>
      <version>2.1.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-simple -->
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-simple</artifactId>
      <version>1.7.25</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.json/json -->
    <dependency>
      <groupId>org.json</groupId>
      <artifactId>json</artifactId>
      <version>20180813</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <finalName>uber-${project.artifactId}-${project.version}</finalName>
        </configuration>
      </plugin>
    </plugins>
            
  </build>
</project>

The above pom file will create the required Java application file structure, along with all the required dependencies.

Notice the ‘maven-shade-plugin’ we are using to compile a single ‘uber-jar’ for the client and all of its dependencies. This will make it easier for us to add the client to the Node.js application later.

Make sure to change your.group.id to your desired package name.

Creating a Kafka client

Next step is creating our Kafka client (consumer and producer).

We will implement a basic Kafka producer and then a consumer.

Add a Producer.java file under src/main/java/my/package/id:

package my.package.id;

import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.StringSerializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Iterator;
import org.json.*;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class Producer {

    public static void main(String[] args) {
        Producer p = new Producer("{\"bootstrap.servers\": \"localhost:9092\"}");
        try {
            p.put("test_topic", "msgKey", "msgData");
        }
        catch (Exception e) {
            System.out.println("Error Putting" + e);
        }
    }

    private Properties produceProperties;
    private final KafkaProducer<String, String> mProducer;
    private final Logger mLogger = LoggerFactory.getLogger(Producer.class);

    public Producer(String config) {
        extractPropertiesFromJson(config);
        mProducer = new KafkaProducer<>(produceProperties);

        mLogger.info("Producer initialized");
    }

    public void put(String topic, String key, String value) throws ExecutionException, InterruptedException {
        mLogger.info("Put value: " + value + ", for key: " + key);

        ProducerRecord<String, String> record = new ProducerRecord<>(topic, key, value);
        mProducer.send(record, (recordMetadata, e) -> {
        if (e != null) {
            mLogger.error("Error while producing", e);
            return;
        }

        mLogger.info("Received new meta. Topic: " + recordMetadata.topic()
            + "; Partition: " + recordMetadata.partition()
            + "; Offset: " + recordMetadata.offset()
            + "; Timestamp: " + recordMetadata.timestamp());
        }).get();
    }

    void close() {
        mLogger.info("Closing producer's connection");
        mProducer.close();
    }

    private void extractPropertiesFromJson(String jsonString) {
        produceProperties = new Properties();
        JSONObject jsonObject = new JSONObject(jsonString.trim());
        Iterator<String> keys = jsonObject.keys();
        while(keys.hasNext()) {
            String key = keys.next();
            produceProperties.setProperty(key, (String)jsonObject.get(key));
        }
        String serializer = StringSerializer.class.getName();
        produceProperties.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, serializer);
        produceProperties.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, serializer);
    }
}

The producer in the example above receives its configuration in a JSON format, and sends string-typed messages.

The main function in the Producer is an easy way of running the code and sending a test message.

Add a Consumer.java file in the same folder:

package my.package.id;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.errors.WakeupException;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Properties;
import java.util.Collections;
import java.util.Iterator;
import java.time.Duration;
import java.util.concurrent.CountDownLatch;
import org.json.*;
import java.util.Queue; 

public class Consumer {

    // a concurrent queue shared with Node
    private final Queue<Object> mQueue;     
    private Properties consumProperties;
    private final Logger mLogger = LoggerFactory.getLogger(Consumer.class.getName());
    
    public Consumer(Queue<Object> queue, String config){
      mQueue = queue;
      extractPropertiesFromJson(config);
    }

    public void start() {
        CountDownLatch latch = new CountDownLatch(1);

        ConsumerRunnable consumerRunnable = new ConsumerRunnable(consumProperties, latch, mQueue);
        Thread thread = new Thread(consumerRunnable);
        thread.start();
    
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            mLogger.info("Caught shutdown hook");
            consumerRunnable.shutdown();
            await(latch);

            mLogger.info("Application has exited");
        }));
    }

    private void await(CountDownLatch latch) {
        try {
          latch.await();
        } catch (InterruptedException e) {
          mLogger.error("Application got interrupted", e);
        } finally {
          mLogger.info("Application is closing");
        }
      }
    
    private void extractPropertiesFromJson(String jsonString) {
        consumProperties = new Properties();
        JSONObject jsonObject = new JSONObject(jsonString.trim());
        Iterator<String> keys = jsonObject.keys();
        while(keys.hasNext()) {
            String key = keys.next();
            consumProperties.setProperty(key, (String)jsonObject.get(key));
        }
        String deserializer = StringDeserializer.class.getName();
        consumProperties.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, deserializer);
        consumProperties.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, deserializer);
    }

    private class ConsumerRunnable implements Runnable {

        private KafkaConsumer<String, String> mConsumer;
        private CountDownLatch mLatch;
        private Queue mQueue;

        ConsumerRunnable(Properties config, CountDownLatch latch, Queue queue) {
            mLatch = latch;
            mQueue = queue;
            String topic = (String)config.get("topic");
            config.remove("topic");
            mConsumer = new KafkaConsumer<>(config);
            mConsumer.subscribe(Collections.singletonList(topic));
        }

        @Override
        public void run() {
          try {
            while (true) {
              ConsumerRecords<String, String> records = mConsumer.poll(Duration.ofMillis(100));
    
              for (ConsumerRecord<String, String> record : records) {
                mLogger.info("Key: " + record.key() + ", Value: " + record.value());
                mLogger.info("Partition: " + record.partition() + ", Offset: " + record.offset());
                mQueue.offer(record);
              }
            }
          } catch (WakeupException e) {
            mLogger.info("Received shutdown signal!");
          } finally {
            mConsumer.close();
            mLatch.countDown();
          }
        }

        public void shutdown() {
            mConsumer.wakeup();
        }
    }
}

Same as the producer, this consumer receives its configuration in a JSON format.

After configuring our consumer, we start a new thread that connects to our Kafka broker and polls for messages. Each new message is pushed into a queue which will later be used in our Node.js application.
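On the Java side, the config parsing simply flattens the JSON object into Properties entries, with the extra topic key consumed by the wrapper before the Kafka client sees it. A plain-JavaScript sketch of the same flattening (toProperties is a hypothetical mirror of extractPropertiesFromJson, for illustration only):

```javascript
// Sketch: the JSON config shape our wrappers expect, and the flattening they
// perform. toProperties is a hypothetical JS mirror of extractPropertiesFromJson.
const config = {
  'bootstrap.servers': 'localhost:9092',
  'group.id': 'Test_Group',
  topic: 'test_topic' // consumed by the Consumer wrapper, not by Kafka itself
};

function toProperties(jsonString) {
  const parsed = JSON.parse(jsonString.trim());
  const props = {};
  for (const key of Object.keys(parsed)) {
    props[key] = String(parsed[key]);
  }
  return props;
}

console.log(toProperties(JSON.stringify(config))['group.id']); // Test_Group
```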

Compiling the code

Running mvn package from within the root folder will compile the code into a single jar file named ‘uber-kafka-client-1.0.jar’. This file contains all the required Java code and dependencies, and will be used as a Java library.

Setting up a Node.js Application

Last but not least is our Node.js application.

Add an index.js file under node/services/kafka-user:

const {Worker} = require('worker_threads');

function JavaToJSNotifier() {
    this.queue = new java.util.concurrent.LinkedBlockingDeque();
    this.worker = new Worker(`
        const { workerData, parentPort } = require('worker_threads');
        while (true) {
          // block the worker waiting for the next notification from Java
          var data = workerData.queue.take();
          // notify the main event loop that we got new data 
          parentPort.postMessage(data);
        }`,
        { eval: true, workerData: { queue: this.queue }, stdout: true, stderr: true });
}

const config = {
    "bootstrap.servers": (process.env.VM === 'graalvm') ? 'kafka-broker:9093' : 'localhost:9092'
}

const Consumer = Java.type('my.package.id.Consumer');
config.topic = "test_topic";
config['group.id'] = 'Test_Group'

const asyncJavaEvents = new JavaToJSNotifier();
asyncJavaEvents.worker.on('message', (n) => {
    console.log(`Got new data from Java! ${n}`);
});

const mConsumer = new Consumer(asyncJavaEvents.queue, JSON.stringify(config));
mConsumer.start();

The code above creates and configures a new Kafka consumer, and then uses node’s experimental workers to create a new thread that listens to messages from that consumer. The consumer thread notifies the main thread when a new message arrives.

Notice the this.queue = new java.util.concurrent.LinkedBlockingDeque() on line 4. This is possible because we are running under GraalVM. This queue will be a shared instance with the Java consumer we previously defined.

Also, notice the const Consumer = Java.type('my.package.id.Consumer') in line 20. Again, this is possible thanks to GraalVM, and it holds a reference to our Java-based Kafka consumer.

Running the code

The previously installed GraalVM image already contains Node.js and the GraalVM setup. If you wish to run the Node application on the host machine instead, GraalVM must be installed and configured there (instructions).

To run our code inside the container, open a terminal from the root folder and type docker-compose run graalvm sh.

This will open a shell within the GraalVM image.

Due to our volume configuration, all of our compiled code and scripts are located under the /code folder inside the container.

Run the following command:

node --polyglot --jvm --jvm.cp=code/target/uber-kafka-client-1.0.jar --experimental-worker code/node/services/kafka-user/index.js

This command runs our Node application as a polyglot application on a JVM. Notice the --jvm.cp parameter, which tells the JVM where to find our Java-based Kafka client.
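Since Java.type only exists when the script runs under GraalVM with --polyglot --jvm, a small guard can make the script fail fast under plain Node. A sketch (hasJavaInterop is a hypothetical helper, not part of GraalVM):

```javascript
// Sketch: detect whether Java interop is available before calling Java.type.
// Under plain Node this reports false; under `node --polyglot --jvm` it is true.
function hasJavaInterop(globals) {
  return typeof globals.Java !== 'undefined' && typeof globals.Java.type === 'function';
}

console.log(hasJavaInterop(globalThis)); // false under plain Node
```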

Trying it out

Keep the terminal open, go back to the Java IDE, and run the Producer.main procedure.

You should now see the following printed in your terminal:

Success!!

Summary

GraalVM makes writing polyglot applications easy. Adding a Docker infrastructure makes it even easier to develop and run cross-language applications just about anywhere.

The possibilities are virtually endless.

I hope this helps some of you, and maybe inspires you to create cross-language solutions to a real-life problem you are facing.

How to Use Express.js, Node.js and MongoDB.js

In this post, I will show you how to use Express.js, Node.js and MongoDB.js. We will be creating a very simple Node application, that will allow users to input data that they want to store in a MongoDB database. It will also show all items that have been entered into the database.

Creating a Node Application

To get started I would recommend creating a new directory that will contain our application. For this demo I am creating a directory called node-demo. After creating the directory you will need to change into that directory.

mkdir node-demo
cd node-demo

Once we are in the directory we will need to create an application and we can do this by running the command
npm init

This will ask you a series of questions. Here are the answers I gave to the prompts.

The first step is to create a file that will contain our code for our Node.js server.

touch app.js

In our app.js we are going to add the following code to build a very simple Node.js Application.

var express = require("express");
var app = express();
var port = 3000;
 
app.get("/", (req, res) => {
  res.send("Hello World");
});
 
app.listen(port, () => {
  console.log("Server listening on port " + port);
});

The code requires the Express.js module, then creates the app by calling express(). We define our port to be 3000.

The app.get line listens for requests from the browser and returns the text “Hello World” back to the browser.

The last line actually starts the server and tells it to listen on port 3000.

Installing Express

Our app.js required the Express.js module. We need to install express in order for this to work properly. Go to your terminal and enter this command.

npm install express --save

This command will install the express module and record it as a dependency in our package.json, as shown below.

To test our application you can go to the terminal and enter the command

node app.js

Open up a browser and navigate to the url http://localhost:3000

You will see the following in your browser

Creating Website to Save Data to MongoDB Database

Instead of showing the text “Hello World” when people view your application, what we want to do is to show a place for user to save data to the database.

We are going to allow users to enter a first name and a last name that we will be saving in the database.

To do this we will need to create a basic HTML file. In your terminal enter the following command to create an index.html file.

touch index.html

In our index.html file we will be creating an input filed where users can input data that they want to have stored in the database. We will also need a button for users to click on that will add the data to the database.

Here is what our index.html file looks like.

<!DOCTYPE html>
<html>
  <head>
    <title>Intro to Node and MongoDB</title>
  </head>

  <body>
    <h1>Intro to Node and MongoDB</h1>
    <form method="post" action="/addname">
      <label>Enter Your Name</label><br>
      <input type="text" name="firstName" placeholder="Enter first name..." required>
      <input type="text" name="lastName" placeholder="Enter last name..." required>
      <input type="submit" value="Add Name">
    </form>
  </body>
</html>

If you are familiar with HTML, you will not find anything unusual in our code for our index.html file. We are creating a form where users can input their first name and last name and then click an “Add Name” button.

The form will do a post call to the /addname endpoint. We will be talking about endpoints and post later in this tutorial.
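When submitted, the form sends a urlencoded body like firstName=...&lastName=... to that endpoint. A sketch of how such a body maps to the form's field names (parseFormBody is an illustrative helper using Node's built-in URLSearchParams):

```javascript
// Sketch: the urlencoded body the form posts to /addname, parsed into an object.
function parseFormBody(body) {
  return Object.fromEntries(new URLSearchParams(body));
}

console.log(parseFormBody('firstName=Ada&lastName=Lovelace'));
// { firstName: 'Ada', lastName: 'Lovelace' }
```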

Displaying our Website to Users

We were previously displaying the text “Hello World” to users when they visited our website. Now we want to display the html file that we created. To do this we will need to change the app.get line in our app.js file.

We will be using the sendFile method to show the index.html file. We will need to tell the server exactly where to find the index.html file. We can do that by using a node global called __dirname. The __dirname will provide the current directory where the command was run. We will then append the path to our index.html file.

The app.get line will need to be changed to:

app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html");
});

Once you have saved your app.js file, we can test it by going to terminal and running node app.js

Open your browser and navigate to “http://localhost:3000”. You will see the following

Connecting to the Database

Now we need to add our database to the application. We will be connecting to a MongoDB database. I am assuming that you already have MongoDB installed and running on your computer.

To connect to the MongoDB database we are going to use a module called Mongoose. We will need to install the mongoose module just like we did with express. Go to your terminal and enter the following command.

npm install mongoose --save

This will install the mongoose module and add it as a dependency in our package.json.

Connecting to the Database

Now that we have the mongoose module installed, we need to connect to the database in our app.js file. MongoDB, by default, runs on port 27017. You connect to the database by telling it the location of the database and the name of the database.

In our app.js file after the line for the port and before the app.use line, enter the following two lines to get access to mongoose and to connect to the database. For the database, I am going to use “node-demo”.

var mongoose = require("mongoose");
mongoose.Promise = global.Promise;
mongoose.connect("mongodb://localhost:27017/node-demo");

Creating a Database Schema

Once the user enters data in the input field and clicks the add button, we want the contents of the input field to be stored in the database. In order to know the format of the data in the database, we need to have a Schema.

For this tutorial, we will need a very simple Schema that has only two fields. I am going to call the fields firstName and lastName. The data stored in both fields will be a String.

After connecting to the database in our app.js we need to define our Schema. Here are the lines you need to add to the app.js.

var nameSchema = new mongoose.Schema({
  firstName: String,
  lastName: String
});

Once we have built our Schema, we need to create a model from it. I am going to call my model “User”. Here is the line you will add next to create our model.

var User = mongoose.model("User", nameSchema);

Creating RESTful API

Now that we have a connection to our database, we need to create the mechanism by which data will be added to the database. This is done through our REST API. We will need to create an endpoint that will be used to send data to our server. Once the server receives this data then it will store the data in the database.

An endpoint is a route that our server listens to in order to receive data from the browser. We already have one route in the application: the route listening at the endpoint “/”, which is the homepage of our application.

HTTP Verbs in a REST API

The communication between the client (the browser) and the server is done through an HTTP verb. The most common HTTP verbs are GET, POST, PUT, and DELETE.

The following table explains what each HTTP verb does.

HTTP Verb   Operation
GET         Read
POST        Create
PUT         Update
DELETE      Delete

As you can see from these verbs, they form the basis of CRUD operations that I talked about previously.
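The table above can be expressed as a small lookup (crudOperation is just an illustrative helper, not part of Express):

```javascript
// Sketch: the verb-to-operation mapping from the table above.
function crudOperation(verb) {
  const map = { GET: 'Read', POST: 'Create', PUT: 'Update', DELETE: 'Delete' };
  return map[verb] || 'Unknown';
}

console.log(crudOperation('POST')); // Create
```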

Building a CRUD endpoint

If you remember, the form in our index.html file used a post method to call this endpoint. We will now create this endpoint.

In our previous endpoint we used a “GET” http verb to display the index.html file. We are going to do something very similar but instead of using “GET”, we are going to use “POST”. To get started this is what the framework of our endpoint will look like.

app.post("/addname", (req, res) => {
 
});

Express Middleware

To fill out the contents of our endpoint, we want to store the firstName and lastName entered by the user into the database. The values for firstName and lastName are in the body of the request that we send to the server. We want to capture that data, convert it to JSON and store it into the database.

Express.js version 4 removed the bundled middleware from its core. To parse the data in the body we will need to add middleware to our application to provide this functionality. We will be using the body-parser module. We need to install it, so in your terminal window enter the following command.

npm install body-parser --save

Once it is installed, we will need to require this module and configure it. The configuration will allow us to pass the data for firstName and lastName in the body to the server. It can also convert that data into JSON format. This will be handy because we can take this formatted data and save it directly into our database.

To add the body-parser middleware to our application and configure it, we can add the following lines directly after the line that sets our port.

var bodyParser = require('body-parser');
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

Saving data to database

Mongoose provides a save function that will take a JSON object and store it in the database. Our body-parser middleware will convert the user’s input into the JSON format for us.

To save the data into the database, we need to create a new instance of the model that we created earlier. We will pass the user’s input into this instance. Once we have it, we just need to call save().

Mongoose returns a promise when saving to the database. The promise settles when the save completes: it either finishes successfully or it fails. A promise provides two methods that handle both of these scenarios.

If the save to the database was successful, the promise will run the .then segment. In this case we want to send text back to the user to let them know the data was saved to the database.

If it fails, the promise will run the .catch segment. In this case, we want to send text back to the user telling them the data was not saved to the database. It is best practice to also change the statusCode returned from the default 200 to a 400. A 400 statusCode signifies that the operation failed.
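The two branches can be sketched without a database by standing in for save() with a promise (fakeSave is a hypothetical helper, not part of Mongoose):

```javascript
// Sketch: the .then / .catch branches our endpoint will use, with a fake save().
function fakeSave(shouldSucceed) {
  return shouldSucceed
    ? Promise.resolve('item saved to database')
    : Promise.reject(new Error('unable to save to database'));
}

fakeSave(true).then((msg) => console.log(msg));           // item saved to database
fakeSave(false).catch((err) => console.log(err.message)); // unable to save to database
```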

Now putting all of this together here is what our final endpoint will look like.

app.post("/addname", (req, res) => {
  var myData = new User(req.body);
  myData.save()
    .then(item => {
      res.send("item saved to database");
    })
    .catch(err => {
      res.status(400).send("unable to save to database");
    });
});

Testing our code

Save your code. Go to your terminal and enter the command node app.js to start our server. Open up your browser and navigate to the URL “http://localhost:3000”. You will see our index.html file displayed to you.

Make sure you have mongo running.

Enter your first name and last name in the input fields and then click the “Add Name” button. You should get back text that says the name has been saved to the database like below.

Access to Code

The final version of the code is available in my Github repo. To access the code click here. Thank you for reading!

Hetu Rajgor's answer to What are the major differences between Java, AngularJS, and JavaScript and Node JS? - Quora

Java is a programming language which is owned by Oracle. More than 3 billion devices run Java. JS is a client-side programming language used for creating dynamic websites and apps to run in the client's browser.

Node.js for Beginners - Learn Node.js from Scratch (Step by Step)

Node.js for Beginners - Learn Node.js from Scratch (Step by Step) - Learn the basics of Node.js. This Node.js tutorial will guide you step by step so that you will learn basics and theory of every part. Learn to use Node.js like a professional. You’ll learn: Basic Of Node, Modules, NPM In Node, Event, Email, Uploading File, Advance Of Node.

Node.js for Beginners

Learn Node.js from Scratch (Step by Step)

Welcome to my course "Node.js for Beginners - Learn Node.js from Scratch". This course will guide you step by step so that you learn the basics and theory of every part. It contains hands-on examples so that you can understand coding in Node.js better. If you have no previous knowledge or experience in Node.js, you will like that the course begins with Node.js basics; if you already have some experience programming in Node.js, the course can still teach you some new information. The course covers practical examples without neglecting theory and basics. Learn to use Node.js like a professional. This comprehensive course will allow you to work in the real world as an expert!
What you’ll learn:

  • Basic Of Node
  • Modules
  • NPM In Node
  • Event
  • Email
  • Uploading File
  • Advance Of Node