HAPI Server Front-End

General-use HAPI server front-end implemented in node.js. A generic HAPI front-end server.

1. About

The intended use case for this server-side software is a data provider that wants to serve data through a HAPI API. With this software, the data provider only needs

  1. HAPI metadata, in one of a variety of forms, for a collection of datasets and
  2. a command-line program that returns at least headerless HAPI CSV for all parameters in a dataset over the full time range of available data. Optionally, the command-line program can take as inputs a start and stop time, a list of one or more parameters to output, and an output format

to be able to serve data from a HAPI API on their server. This software handles

  1. HAPI metadata validation,
  2. request validation and error responses,
  3. logging and alerts,
  4. time and parameter subsetting (as needed), and
  5. generation of HAPI JSON or HAPI binary (as needed).

A list of catalogs that are served using this software is given at http://hapi-server.org/servers.

2. Installation

Binary packages are available for OS-X x64, Linux x64, and Linux ARMv7l (e.g., Raspberry Pi).

A Docker image is also available.

Installation and startup commands for the binary packages and the Docker image are given below. See the Development section for instructions on installing from source.

OS-X x64:

 curl -L https://github.com/hapi-server/server-nodejs/releases/download/v0.9.5/hapi-server-v0.9.5-darwin-x64.tgz | tar zxf -
 cd hapi-server-v0.9.5
 ./hapi-server --open

Linux x64:

 curl -L https://github.com/hapi-server/server-nodejs/releases/download/v0.9.5/hapi-server-v0.9.5-linux-x64.tgz | tar zxf -
 cd hapi-server-v0.9.5
 ./hapi-server --open

Linux ARMv7l:

 curl -L https://github.com/hapi-server/server-nodejs/releases/download/v0.9.5/hapi-server-v0.9.5-linux-armv7l.tgz | tar zxf -
 cd hapi-server-v0.9.5
 ./hapi-server --open

Docker:

docker pull rweigel/hapi-server:v0.9.5
docker run -dit --name hapi-server-v0.9.5 --expose 8999 -p 8999:8999 rweigel/hapi-server:v0.9.5
docker exec -it hapi-server-v0.9.5 ./hapi-server
# Open http://localhost:8999/TestData/hapi in a web browser

2. Examples

List of Included Examples

The following examples are included in the metadata directory. The examples can be run using

./hapi-server -f metadata/FILENAME.json

where FILENAME.json is one of the file names listed below (e.g., Example0.json).

  • Example0.json - A Python program dumps a full dataset in the headerless HAPI CSV format; the server handles time and parameter subsetting and creation of HAPI Binary and JSON. See section 2.1.
  • Example1.json - Same as Example0 except the Python program handles time subsetting.
  • Example2.json - Same as Example0 except the Python program handles time and parameter subsetting and creation of HAPI CSV and Binary. See section 2.2.
  • Example3.json - Same as Example2 except the HAPI info metadata for each dataset is stored in an external file.
  • Example4.json - Same as Example2 except the HAPI info metadata for each dataset is generated by a command-line command.
  • Example5.json - Same as Example2 except catalog metadata is stored in an external file.
  • Example6.json - Same as Example2 except catalog metadata is generated by a command-line command.
  • Example7.json - Same as Example2 except that catalog metadata is returned from a URL.
  • Example8.json - A dataset in headerless HAPI CSV format is stored in a single file; the server handles parameter and time subsetting and creation of HAPI JSON and Binary.
  • Example9.json - A dataset in headerless HAPI CSV format is returned by a URL; the server handles parameter and time subsetting and creation of HAPI JSON and Binary.
  • AutoplotExample1.json - A dataset is stored in a single CDF file and AutoplotDataServer is used to generate HAPI CSV. See section 2.6.
  • AutoplotExample2.json - A dataset is stored in multiple ASCII files and AutoplotDataServer is used to subset in time. See section 2.6.
  • TestData.json - A test dataset used to test HAPI clients.
  • SSCWeb.json - Data from a non-HAPI web service is made available from a HAPI server. See section 2.3.
  • INTERMAGNET.json - Data in ASCII files on an FTP site is made available from a HAPI server. See section 2.5.
  • QinDenton.json - Data in a single ASCII file is converted to headerless HAPI CSV by a Python program. See section 2.4.

2.1 Serve data from a minimal Python program

In this example, we assume that the command-line program that returns a dataset has only the minimal required capabilities: when executed, it generates headerless HAPI CSV with all parameters in the dataset over the full time range of available data. The server handles time and parameter subsetting and the generation of HAPI Binary and JSON.

The Python script Example.py returns HAPI-formatted CSV data (with no header) with two parameters. To serve this data, only a configuration file, Example0.json, is needed. The configuration file has information that is used to call the command line program and it also has HAPI metadata that describes the output of Example.py. Details about the configuration file format are described in the Metadata section.

The Python calling syntax of Example.py is

python Example.py
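As a sketch of what such a minimal program might look like (hypothetical code, not necessarily the repository's Example.py), the following writes headerless HAPI CSV with a time column and two parameters to stdout over the full time range:

```python
# Hypothetical sketch of a minimal data program; the repository's Example.py
# may differ. Writes headerless HAPI CSV (ISO 8601 time + two parameters).
import datetime

def hapi_csv_lines(start, n_minutes):
    """Return one CSV record per minute: time, integer parameter, float parameter."""
    lines = []
    for minute in range(n_minutes):
        t = start + datetime.timedelta(minutes=minute)
        lines.append("%sZ,%d,%.1f" % (t.strftime("%Y-%m-%dT%H:%M:%S"),
                                      minute % 10, minute / 2.0))
    return lines

if __name__ == "__main__":
    # One day of 1-minute data over the full available time range.
    for line in hapi_csv_lines(datetime.datetime(1970, 1, 1), 1440):
        print(line)
```

Because the program ignores any arguments and always dumps the full dataset, all subsetting is left to the server.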

To run this example locally after installation, execute

./hapi-server --file metadata/Example0.json

and then open http://localhost:8999/Example0/hapi. You should see the same landing page as that at http://hapi-server.org/servers/Example0/hapi. Note that the --open command-line switch can be used to automatically open the landing page, e.g.,

./hapi-server --file metadata/Example0.json --open

2.2 Serve data from an enhanced Python program

The Python script Example.py can actually subset parameters and time and provide binary output. To make the server use these capabilities, we need to modify the server configuration metadata in Example0.json. The changes are replacing

"command": "python bin/Example.py"

with

"command": "python bin/Example.py --params ${parameters} --start ${start} --stop ${stop} --fmt ${format}"

and adding

"formats": ["csv","binary"]

The modified file is Example2.json. To run this example locally after installation, execute

./hapi-server --file metadata/Example2.json

and then open http://localhost:8999/Example2/hapi. The command-line program now produces binary output and performs parameter subsetting as needed, and the response time for data requests should decrease.

The server responses will be identical to those in the previous example. You should see the same landing page as that at http://hapi-server.org/servers/Example2/hapi.
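To illustrate the kind of subsetting the enhanced program performs, here is a hypothetical sketch (argument handling for --params, --start, and --stop is omitted; the actual Example.py may differ):

```python
# Hypothetical sketch of the time and parameter subsetting an enhanced
# program performs on headerless HAPI CSV records.
def subset(lines, start, stop, param_columns):
    """Keep records with start <= time < stop and only the requested
    1-based parameter columns (column 0, the time, is always kept).
    Assumes start/stop use the same ISO 8601 layout as the data, so
    plain string comparison orders times chronologically."""
    out = []
    for line in lines:
        cols = line.split(",")
        if start <= cols[0] < stop:
            out.append(",".join([cols[0]] + [cols[i] for i in param_columns]))
    return out
```

For binary output, the program would pack the same subsetted values with, e.g., Python's struct module instead of joining them as CSV.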

2.3 Serve data from a non-HAPI web service

A non-HAPI server can be quickly made HAPI compliant by using this server as a pass-through. Data from SSCWeb, which is available from a REST API, has been made available through a HAPI API at http://hapi-server.org/servers/SSCWeb/hapi. The configuration file is SSCWeb.json and the command line program is SSCWeb.js. Note that the metadata file SSCWeb.json was created using code in metadata/SSCWeb.

To run this example locally after installation, execute

./hapi-server --file metadata/SSCWeb.json --open

You should see the same landing page as that at http://hapi-server.org/servers/SSCWeb/hapi.

2.4 Serve data stored in a single file

The Qin-Denton dataset contains multiple parameters stored in a single large file.

The command-line program that produces HAPI CSV from this file is QinDenton.py and the metadata is in QinDenton.json.

To run this example, use

./hapi-server --file metadata/QinDenton.json
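The conversion that such a program performs can be sketched as follows (a hypothetical illustration; the repository's QinDenton.py handles the dataset's actual column layout):

```python
# Hypothetical sketch: convert whitespace-delimited ASCII records to
# headerless HAPI CSV by re-joining the fields with commas.
def ascii_to_hapi_csv(lines):
    return [",".join(line.split()) for line in lines if line.strip()]
```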

2.5 Serve data stored in multiple files

INTERMAGNET provides ground magnetometer data from over 150 magnetometer stations at 1-minute and 1-second cadence, stored in daily files on an FTP site.

The command-line program that produces HAPI CSV is INTERMAGNET.py and the metadata is in INTERMAGNET.json. The code that produces the metadata is in metadata/INTERMAGNET. To run this example, execute

./hapi-server --file metadata/INTERMAGNET.json --open

2.6 Serve data read by Autoplot

Nearly any data file that can be read by Autoplot can be served using this server.

Serving data requires at most two steps:

  1. Generating an Autoplot URI for each parameter; and (in some cases)
  2. Writing (by hand) metadata for each parameter.

Example 1

The first example serves data stored in a single CDF file. The configuration file is AutoplotExample1.json.

In this example, step 2 above (writing metadata by hand) is not required because the data file contains metadata in a format that Autoplot can translate to HAPI metadata.

To run this example locally, execute

./hapi-server --file metadata/AutoplotExample1.json

Example 2

The second example serves data stored in multiple ASCII files. The configuration file is AutoplotExample2.json.

To run this example locally, execute

./hapi-server --file metadata/AutoplotExample2.json

3. Usage

List command-line options:

./hapi-server -h

  --help, -h    Show help 
  --file, -f    Catalog configuration file
  --port, -p    Server port [default:8999]             
  --conf, -c    Server configuration file
  --ignore, -i  Start server even if metadata errors
  --open, -o    Open web page on start
  --test, -t    Run URL tests and exit
  --verify, -v  Run verification tests and exit

Basic usage:

./hapi-server --file metadata/TestData.json

Starts HAPI server at http://localhost:8999/TestData/hapi and serves datasets specified in the catalog ./metadata/TestData.json.

Multiple catalogs can be served by providing multiple catalog files on the command line:

./hapi-server --file CATALOG1.json --file CATALOG2.json

For example

./hapi-server --file metadata/TestData.json --file metadata/Example1.json

will serve the two datasets at

http://localhost:8999/TestData/hapi
http://localhost:8999/Example1/hapi

The page at http://localhost:8999/ will link to these two URLs.

4. Server Configuration

4.1 conf/config.json

The variables HAPISERVERPATH, HAPISERVERHOME, NODEEXE, and PYTHONEXE can be set in conf/config.json or as environment variables. These variables can be used in commands, files, and URLs in the server metadata (the file passed using the command-line --file switch).

The default configuration file is conf/config.json and this location can be set using a command-line argument, e.g.,

./hapi-server -c /tmp/config.json

To set variables using environment variables, use, e.g.,

PYTHONEXE=/opt/python/bin/python ./hapi-server

Variables set as environment variables take precedence over those set in conf/config.json.
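The precedence rule can be sketched with a hypothetical helper (not the server's actual code):

```python
# Hypothetical sketch of the precedence rule: an environment variable,
# if set, wins over the value from conf/config.json.
import os

def resolve(name, config):
    """Return the environment variable `name` if set, else the config value."""
    return os.environ.get(name, config.get(name))
```

For example, with PYTHONEXE set in the environment, resolve("PYTHONEXE", {"PYTHONEXE": "python"}) returns the environment value.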

HAPISERVERPATH and HAPISERVERHOME

These two variables can be used in metadata to reference a directory. For example,

"catalog": "$HAPISERVERHOME/mymetadata/Data.json"

By default, $HAPISERVERPATH is the installation directory (the directory containing the shell launch script hapi-server) and should not be changed as it is referenced in the demonstration metadata files. Modify HAPISERVERHOME in conf/config.json to use a custom path.

All relative paths in commands in metadata files are relative to the directory where hapi-server was executed.

For example, if

/tmp/hapi-server

is executed from /home/username, the file

/home/username/metadata/TestData.json

is read and relative paths in TestData.json have /home/username/ prepended.

PYTHONEXE

This is the command used to call Python. By default, it is python. If python is not in the path, this can be set using a relative or absolute path. Python is used by several of the demonstration catalogs.

Example:

"command": "$PYTHONEXE $HAPISERVERHOME/mybin/Data.py"

NODEEXE

This is the command used to call NodeJS. By default, it is the command used to start the server. The start-up script looks for a NodeJS executable in $HAPISERVERPATH/bin and then tries node and then nodejs.

4.2 Apache

To expose a URL through Apache, (1) enable mod_proxy and mod_proxy_http, (2) add the following <VirtualHost> node to an Apache Virtual Hosts file,

<VirtualHost *:80>
    ProxyPass /TestData http://localhost:8999/TestData retry=1
    ProxyPassReverse /TestData http://localhost:8999/TestData
</VirtualHost>

and (3) include this file in the Apache start-up configuration file.

If serving multiple catalogs, use

<VirtualHost *:80>
    ProxyPass /servers http://localhost:8999/servers retry=1
    ProxyPassReverse /servers http://localhost:8999/servers
</VirtualHost>

4.3 Nginx

For Nginx, add the following to nginx.conf

location /TestData {
    proxy_pass http://localhost:8999/TestData;
}

If serving multiple catalogs, use

location /servers {
    proxy_pass http://localhost:8999/servers;
}

5. Metadata

The metadata required for this server is similar to the /catalog and /info response of a HAPI server.

The server requires a single JSON catalog configuration file in which the /catalog response is combined with the /info responses for all datasets in the catalog. Additional information about how to generate the data must also be included in this JSON file.

The top-level structure of the configuration file is

{
    "server": { // See section 5.1
        "id": "",
        "prefix": "",
        "landing": "",
        "contact": "", 
        "landingFile": "",
        "landingPath": "",
        "catalog-update": null
    },
    "catalog": array or string // See section 5.2 
    "data": { // See section 5.3
        "command": "Command line template",
         or
        "file": "HAPI CSV file"
        "fileformat": "one of 'csv', 'binary', 'json'"
         or
        "url": "URL that returns HAPI data"
        "urlformat": "one of 'csv', 'binary', 'json'"
        "contact": "Email address if error in command line program",
        "testcommands": [
                {
                    "command": string,  
                    "Nlines": integer,
                    "Nbytes": integer,
                    "Ncommas": integer
                },
                ...
            ],
        "testurls": [
                {
                    "url": string,  
                    "Nlines": integer, 
                    "Nbytes": integer,  
                    "Ncommas": integer
                },
                ...
            ]
    }
}

A variety of examples are given in ./metadata and described below along with options for the catalog property.

The string command in the data node is a command that produces a headerless HAPI data response and can have placeholders for time range of data to return (using start (${start}) and stop (${stop})), a dataset id (${id}), a comma-separated list of parameters (${parameters}) and an output format (${format}). For example,

python ./bin/Example.py --dataset ${id} --parameters \
    ${parameters} --start ${start} --stop ${stop} --format ${format}
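The substitution the server performs on such a template can be sketched as follows (illustrative, not the server's actual implementation):

```python
# Hypothetical sketch of placeholder expansion in a command template.
def expand(template, subs):
    """Replace each ${name} placeholder with its value from subs."""
    for name, value in subs.items():
        template = template.replace("${" + name + "}", value)
    return template
```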

5.1 server

The server node has the form

"server": {
    "id": "",         // Default is file name without extension.
    "prefix": "",     // Default is id.
    "contact": "",     // Required. Server will not start without this set.
    "landingFile": "",
    "landingPath": "",
    "catalog-update": null // How often in seconds to re-read content
                           // in the catalog node (5.2).
}
5.1.1 id and prefix

The id is by default the name of the server configuration file without the extension. For example, with

./hapi-server --file metadata/TestData.json

id=TestData and prefix=TestData.

By default, this catalog would be served from

http://localhost:8999/TestData/hapi

TestData in the URL can be changed to TestData2 by using prefix=TestData2.

5.1.2 contact

This element must not be empty or the server will not start. It should be at minimum the email address of a system administrator.

5.1.3 landingFile and landingPath

landingFile is the file to serve in response to requests for

http://localhost:8999/TestData/hapi

By default, the landing page served is single.htm from the HAPI server UI codebase. The double-underscore variables in this file are replaced using the information in the metadata file (e.g., __CONTACT__ is replaced with the server.contact value). A different landing page can be served by setting the landingFile configuration variable, e.g., "landingFile": "$HAPISERVERPATH/public/index.htm", where $HAPISERVERPATH is described in Server Configuration.
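The double-underscore substitution can be sketched with a hypothetical helper:

```python
# Hypothetical sketch: replace __NAME__ placeholders in the landing page
# with values from the server node of the configuration file.
def render_landing(html, server):
    for key, value in server.items():
        html = html.replace("__" + key.upper() + "__", value)
    return html
```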

If landingFile has local CSS and JS dependencies, set landingPath to be the local directory of the referenced files. Several possible settings are

    "landingFile": "$HAPISERVERPATH/index.htm", 
    // $HAPISERVERPATH will be replaced with location of hapi-server binary
    "landingPath": "/var/www/public/" // Location of CSS and JS files
    // If index.htm has <script src="index.js">, index.js should be in /var/www/public/

To serve a directory listing, use

    "landingFile": "",
    "landingPath": "/var/www/public/"
    // Server will look for index.htm and index.html in /var/www/public/. If not
    // found, directory listing of /var/www/public/ will be served.
5.1.4 catalog-update

This is an integer number of seconds corresponding to how often the catalog node should be updated. Use this if the catalog node is not static.

5.2 catalog

The catalog node can be either a string or an array.

In the case that it is an array, it should contain either the combined HAPI /catalog and /info response (5.2.1) or a /catalog response with references to the /info response (5.2.2).

In the case that it is a string (5.2.3), the string is either a file containing a catalog array or a command-line template that returns a catalog array.

5.2.1 Combined HAPI /catalog and /info object

If catalog is an array, it should have the same format as a HAPI /catalog response (each object in the array has an id property and optional title property) with the addition of an info property that contains the HAPI /info response for that id (i.e., the response to /info?id=dataset1).

"catalog":
 [
    {
        "id": "dataset1",
        "title": "a dataset",
        "info": {
                "startDate": "2000-01-01Z",
                "stopDate": "2000-01-02Z",
                "parameters": [...]
        }
    },
    {
        "id": "dataset2",
        "title": "another dataset",
        "info": {
            "startDate": "2000-01-01Z",
            "stopDate": "2000-01-02Z",
            "parameters": [...]
        }
    }
 ]

In the following subsections, this type of JSON structure is referred to as a fully resolved catalog.

Examples of this type of catalog include Example0.json, Example1.json, and Example2.json.

5.2.2 /catalog response with file or command template for info object

The info value can be a path to an info JSON file

"catalog": 
 [
    {
        "id": "dataset1",
        "title": "a dataset",
        "info": "relativepath/to/dataset2/info_file.json"
    },
    {
        "id": "dataset2",
        "title": "another dataset",
        "info": "/absolutepath/to/dataset2/info_file.json"
    }
 ]

See also Example3.json.

Alternatively, the metadata for each dataset may be produced by the execution of a command-line program. For example, in the following, the command given for dataset1 should write to stdout the HAPI JSON response of /info?id=dataset1. Before execution, the string ${id}, if found, is replaced with the requested dataset ID. Similarly, execution of program2 should produce the HAPI JSON corresponding to the query /info?id=dataset2.

"catalog":
 [
    {
        "id": "dataset1",
        "title": "a dataset",
        "info": "bin/program --id ${id}" 
    },
    {
        "id": "dataset2",
        "title": "another dataset",
        "info": "program2"
    }
 ]

See also Example4.json.
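A minimal info-producing program might look like this sketch (hypothetical dataset ids and values; the program referenced by Example4.json may differ):

```python
# Hypothetical sketch of a program that prints the HAPI /info response
# for the dataset id given with --id.
import argparse
import json
import sys

INFOS = {
    "dataset1": {
        "startDate": "2000-01-01Z",
        "stopDate": "2000-01-02Z",
        "parameters": [],
    },
}

def info_json(dataset_id):
    return json.dumps(INFOS[dataset_id])

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--id", required=True)
    sys.stdout.write(info_json(parser.parse_args().id))
```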

5.2.3 References to a command-line template or file

The catalog value can be a command-line program that generates a fully resolved catalog, e.g.,

"catalog": "program --arg1 val1 ..."

The command-line program should write a fully resolved catalog to stdout (it is not passed an id argument).

The path to a fully resolved catalog can also be given. See also Example5.json.
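A program generating a fully resolved catalog can be sketched as follows (hypothetical dataset names and values):

```python
# Hypothetical sketch of a program that writes a fully resolved catalog
# (id, title, and inline info for each dataset) to stdout.
import json

def resolved_catalog(datasets):
    """datasets: iterable of (id, title, info-dict) tuples."""
    return json.dumps([{"id": i, "title": t, "info": info}
                       for i, t, info in datasets])

if __name__ == "__main__":
    print(resolved_catalog([("dataset1", "a dataset",
                             {"startDate": "2000-01-01Z",
                              "stopDate": "2000-01-02Z",
                              "parameters": []})]))
```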

5.3 data

6. Development

6.1 Installation

Install nodejs (tested with v8) using either the standard installer or NVM.

NVM installation notes:

See also https://github.com/nvm-sh/nvm#install--update-script

# Install Node Version Manager
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.35.3/install.sh | bash

# Open a new shell (see displayed instructions from above command)

# Install and use node.js version 8
nvm install 8
# Clone the server repository
git clone https://github.com/hapi-server/server-nodejs

# Install dependencies
cd server-nodejs; npm install

# Start server
node server.js

# Run tests; Python 2.7+ required for certain tests.
npm test

7. Contact

Please submit questions, bug reports, and feature requests to the issue tracker.

Download Details:

Author: hapi-server

Source Code: https://github.com/hapi-server/server-nodejs

Node.js for Beginners - Learn Node.js from Scratch (Step by Step) - Learn the basics of Node.js. This Node.js tutorial will guide you step by step so that you will learn basics and theory of every part. Learn to use Node.js like a professional. You’ll learn: Basic Of Node, Modules, NPM In Node, Event, Email, Uploading File, Advance Of Node.