FastAPI is a modern, fast web framework for building APIs with Python 3.6+

FastAPI is a modern, fast (high-performance), web framework for building APIs with Python 3.6+ based on standard Python type hints.

The key features are:

  • Fast: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). One of the fastest Python frameworks available.

  • Fast to code: Increase the speed to develop features by about 200% to 300%. *

  • Fewer bugs: Reduce about 40% of human (developer) induced errors. *

  • Intuitive: Great editor support. Completion everywhere. Less time debugging.

  • Easy: Designed to be easy to use and learn. Less time reading docs.

  • Short: Minimize code duplication. Multiple features from each parameter declaration. Fewer bugs.

  • Robust: Get production-ready code. With automatic interactive documentation.

  • Standards-based: Based on (and fully compatible with) the open standards for APIs: OpenAPI (previously known as Swagger) and JSON Schema.

* estimation based on tests on an internal development team, building production applications.

Opinions

[…] I’m using FastAPI a ton these days. […] I’m actually planning to use it for all of my team’s ML services at Microsoft. Some of them are getting integrated into the core Windows product and some Office products.

Kabir Khan - Microsoft (ref)

We adopted the FastAPI library to spawn a REST server that can be queried to obtain predictions. [for Ludwig]

Piero Molino, Yaroslav Dudin, and Sai Sumanth Miryala - Uber (ref)

Netflix is pleased to announce the open-source release of our crisis management orchestration framework: Dispatch! [built with FastAPI]

Kevin Glisson, Marc Vilanova, Forest Monsen - Netflix (ref)

I’m over the moon excited about FastAPI. It’s so fun!

Brian Okken - Python Bytes podcast host (ref)

Honestly, what you’ve built looks super solid and polished. In many ways, it’s what I wanted Hug to be - it’s really inspiring to see someone build that.

Timothy Crosley - Hug creator (ref)

If you’re looking to learn one modern framework for building REST APIs, check out FastAPI […] It’s fast, easy to use and easy to learn […]

We’ve switched over to FastAPI for our APIs […] I think you’ll like it […]

Ines Montani - Matthew Honnibal - Explosion AI founders - spaCy creators (ref) - (ref)

Typer, the FastAPI of CLIs

If you are building a CLI app to be used in the terminal instead of a web API, check out Typer.

Typer is FastAPI’s little sibling. And it’s intended to be the FastAPI of CLIs. ⌨️ 🚀
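
As a taste of what that means, here is a minimal Typer sketch (it assumes typer is installed with pip install typer; the function and its options are purely illustrative). Like FastAPI, it is driven by standard type hints on a plain function:

import typer

def main(name: str, count: int = 1):
    """Greet NAME, COUNT times."""
    for _ in range(count):
        typer.echo(f"Hello {name}")

if __name__ == "__main__":
    # Typer builds the CLI (arguments, options, --help) from the type hints above.
    typer.run(main)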

Requirements

Python 3.6+

FastAPI stands on the shoulders of giants:

  • Starlette for the web parts.
  • Pydantic for the data parts.

Installation

$ pip install fastapi

---> 100%

You will also need an ASGI server for production, such as Uvicorn or Hypercorn.

$ pip install uvicorn

---> 100%

Example

Create it

  • Create a file main.py with:
from typing import Optional

from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}

@app.get("/items/{item_id}")
def read_item(item_id: int, q: Optional[str] = None):
    return {"item_id": item_id, "q": q}

Or use async def
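
If your code uses async / await, declare your path operation functions with async def instead. The same example then becomes:

from typing import Optional

from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def read_root():
    return {"Hello": "World"}

@app.get("/items/{item_id}")
async def read_item(item_id: int, q: Optional[str] = None):
    return {"item_id": item_id, "q": q}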

Run it

Run the server with:

$ uvicorn main:app --reload

INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [28720]
INFO:     Started server process [28722]
INFO:     Waiting for application startup.
INFO:     Application startup complete.

About the command uvicorn main:app --reload:

  • main: the file main.py (the Python "module").
  • app: the object created inside main.py with the line app = FastAPI().
  • --reload: make the server restart after code changes. Only use this for development.

Check it

Open your browser at http://127.0.0.1:8000/items/5?q=somequery.

You will see the JSON response as:

{"item_id": 5, "q": "somequery"}

You already created an API that:

  • Receives HTTP requests in the paths / and /items/{item_id}.
  • Both paths take GET operations (also known as HTTP methods).
  • The path /items/{item_id} has a path parameter item_id that should be an int.
  • The path /items/{item_id} has an optional str query parameter q.

Interactive API docs

Now go to http://127.0.0.1:8000/docs.

You will see the automatic interactive API documentation (provided by Swagger UI):

Swagger UI

Alternative API docs

And now, go to http://127.0.0.1:8000/redoc.

You will see the alternative automatic documentation (provided by ReDoc):

ReDoc

Example upgrade

Now modify the file main.py to receive a body from a PUT request.

Declare the body using standard Python types, thanks to Pydantic.

from typing import Optional

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float
    is_offer: Optional[bool] = None

@app.get("/")
def read_root():
    return {"Hello": "World"}

@app.get("/items/{item_id}")
def read_item(item_id: int, q: Optional[str] = None):
    return {"item_id": item_id, "q": q}

@app.put("/items/{item_id}")
def update_item(item_id: int, item: Item):
    return {"item_name": item.name, "item_id": item_id}

The server should reload automatically (because you added --reload to the uvicorn command above).
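
If you would rather exercise the new endpoint from plain Python than from the docs UI, here is a small sketch using the third-party requests library (the item values are made up; it assumes the dev server from the previous step is still running):

import requests

response = requests.put(
    "http://127.0.0.1:8000/items/5",
    json={"name": "Widget", "price": 9.99, "is_offer": True},  # illustrative values
)
print(response.json())  # {"item_name": "Widget", "item_id": 5}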

Interactive API docs upgrade

Now go to http://127.0.0.1:8000/docs.

  • The interactive API documentation will be automatically updated, including the new body:

Swagger UI

  • Click on the “Try it out” button; it allows you to fill in the parameters and directly interact with the API:

Swagger UI interaction

  • Then click on the “Execute” button; the user interface will communicate with your API, send the parameters, get the results, and show them on the screen:

Swagger UI interaction

Alternative API docs upgrade

And now, go to http://127.0.0.1:8000/redoc.

  • The alternative documentation will also reflect the new query parameter and body:

ReDoc

Recap

In summary, you declare the types of parameters, body, etc. once, as function parameters.

You do that with standard modern Python types.

You don’t have to learn a new syntax, the methods or classes of a specific library, etc.

Just standard Python 3.6+.

For example, for an int:

item_id: int

or for a more complex Item model:

item: Item

…and with that single declaration you get:

  • Editor support, including:
      • Completion.
      • Type checks.
  • Validation of data:
      • Automatic and clear errors when the data is invalid.
      • Validation even for deeply nested JSON objects.
  • Conversion of input data: coming from the network to Python data and types. Reading from:
      • JSON.
      • Path parameters.
      • Query parameters.
      • Cookies.
      • Headers.
      • Forms.
      • Files.
  • Conversion of output data: converting from Python data and types to network data (as JSON):
      • Convert Python types (str, int, float, bool, list, etc).
      • datetime objects.
      • UUID objects.
      • Database models.
      • …and many more.
  • Automatic interactive API documentation, including 2 alternative user interfaces:
      • Swagger UI.
      • ReDoc.

Coming back to the previous code example, FastAPI will:

  • Validate that there is an item_id in the path for GET and PUT requests.
  • Validate that the item_id is of type int for GET and PUT requests.
      • If it is not, the client will see a useful, clear error.
  • Check if there is an optional query parameter named q (as in http://127.0.0.1:8000/items/foo?q=somequery) for GET requests.
      • As the q parameter is declared with = None, it is optional.
      • Without the None it would be required (as is the body in the case with PUT).
  • For PUT requests to /items/{item_id}, read the body as JSON:
      • Check that it has a required attribute name that should be a str.
      • Check that it has a required attribute price that has to be a float.
      • Check that it has an optional attribute is_offer that should be a bool, if present.
      • All this would also work for deeply nested JSON objects.
  • Convert from and to JSON automatically.
  • Document everything with OpenAPI, which can be used by:
      • Interactive documentation systems.
      • Automatic client code generation systems, for many languages.
  • Provide 2 interactive documentation web interfaces directly.
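
You can verify this behavior from code as well. A minimal sketch using FastAPI's TestClient (it needs the optional requests dependency and assumes the example above is saved as main.py):

from fastapi.testclient import TestClient

from main import app  # the example app from above

client = TestClient(app)

# Valid request: item_id is converted to an int and q is read as a query parameter.
response = client.get("/items/5", params={"q": "somequery"})
assert response.json() == {"item_id": 5, "q": "somequery"}

# Invalid request: "foo" is not an int, so FastAPI responds with 422 and a clear error body.
response = client.get("/items/foo")
assert response.status_code == 422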

We just scratched the surface, but you already get the idea of how it all works.

Try changing the line with:

    return {"item_name": item.name, "item_id": item_id}

…from:

        ... "item_name": item.name ...

…to:

        ... "item_price": item.price ...

…and see how your editor will auto-complete the attributes and know their types:

editor support

For a more complete example including more features, see the Tutorial - User Guide.

Spoiler alert: the tutorial - user guide includes:

  • Declaration of parameters from other places such as headers, cookies, form fields and files.
  • How to set validation constraints such as maximum_length or regex.
  • A very powerful and easy to use Dependency Injection system.
  • Security and authentication, including support for OAuth2 with JWT tokens and HTTP Basic auth.
  • More advanced (but equally easy) techniques for declaring deeply nested JSON models (thanks to Pydantic).
  • Many extra features (thanks to Starlette) such as:
      • WebSockets
      • GraphQL
      • extremely easy tests based on requests and pytest
      • CORS
      • Cookie Sessions
      • …and more.

Performance

Independent TechEmpower benchmarks show FastAPI applications running under Uvicorn as one of the fastest Python frameworks available, only below Starlette and Uvicorn themselves (used internally by FastAPI). (*)

To understand more about it, see the section Benchmarks.

Optional Dependencies

Used by Pydantic:

  • ujson (https://github.com/esnme/ultrajson) - for faster JSON “parsing”.
  • email_validator (https://github.com/JoshData/python-email-validator) - for email validation.

Used by Starlette:

  • requests (http://docs.python-requests.org/) - Required if you want to use the TestClient.
  • aiofiles (https://github.com/Tinche/aiofiles) - Required if you want to use FileResponse or StaticFiles.
  • jinja2 (http://jinja.pocoo.org/) - Required if you want to use the default template configuration.
  • python-multipart (https://andrew-d.github.io/python-multipart/) - Required if you want to support form “parsing”, with request.form().
  • itsdangerous (https://pythonhosted.org/itsdangerous/) - Required for SessionMiddleware support.
  • pyyaml (https://pyyaml.org/wiki/PyYAMLDocumentation) - Required for Starlette’s SchemaGenerator support (you probably don’t need it with FastAPI).
  • graphene (https://graphene-python.org/) - Required for GraphQLApp support.
  • ujson (https://github.com/esnme/ultrajson) - Required if you want to use UJSONResponse.

Used by FastAPI / Starlette:

  • uvicorn (http://www.uvicorn.org/) - for the server that loads and serves your application.
  • orjson (https://github.com/ijl/orjson) - Required if you want to use ORJSONResponse.

You can install all of these with pip install fastapi[all].

Download Details:

Author: tiangolo

Live Demo: https://fastapi.tiangolo.com/

GitHub: https://github.com/tiangolo/fastapi

#python


A Pure PHP Implementation Of The MessagePack Serialization Format

msgpack.php

A pure PHP implementation of the MessagePack serialization format.

Features

Installation

The recommended way to install the library is through Composer:

composer require rybakit/msgpack

Usage

Packing

To pack values you can either use an instance of a Packer:

$packer = new Packer();
$packed = $packer->pack($value);

or call a static method on the MessagePack class:

$packed = MessagePack::pack($value);

In the examples above, the method pack automatically packs a value depending on its type. However, not all PHP types can be uniquely translated to MessagePack types. For example, the MessagePack format defines map and array types, which are represented by a single array type in PHP. By default, the packer will pack a PHP array as a MessagePack array if it has sequential numeric keys, starting from 0 and as a MessagePack map otherwise:

$mpArr1 = $packer->pack([1, 2]);               // MP array [1, 2]
$mpArr2 = $packer->pack([0 => 1, 1 => 2]);     // MP array [1, 2]
$mpMap1 = $packer->pack([0 => 1, 2 => 3]);     // MP map {0: 1, 2: 3}
$mpMap2 = $packer->pack([1 => 2, 2 => 3]);     // MP map {1: 2, 2: 3}
$mpMap3 = $packer->pack(['a' => 1, 'b' => 2]); // MP map {a: 1, b: 2}

However, sometimes you need to pack a sequential array as a MessagePack map. To do this, use the packMap method:

$mpMap = $packer->packMap([1, 2]); // {0: 1, 1: 2}

Here is a list of type-specific packing methods:

$packer->packNil();           // MP nil
$packer->packBool(true);      // MP bool
$packer->packInt(42);         // MP int
$packer->packFloat(M_PI);     // MP float (32 or 64)
$packer->packFloat32(M_PI);   // MP float 32
$packer->packFloat64(M_PI);   // MP float 64
$packer->packStr('foo');      // MP str
$packer->packBin("\x80");     // MP bin
$packer->packArray([1, 2]);   // MP array
$packer->packMap(['a' => 1]); // MP map
$packer->packExt(1, "\xaa");  // MP ext

Check the "Custom types" section below on how to pack custom types.

Packing options

The Packer object supports a number of bitmask-based options for fine-tuning the packing process (defaults are marked below):

Name             Description
FORCE_STR        Forces PHP strings to be packed as MessagePack UTF-8 strings
FORCE_BIN        Forces PHP strings to be packed as MessagePack binary data
DETECT_STR_BIN   Detects the MessagePack str/bin type automatically (default)

FORCE_ARR        Forces PHP arrays to be packed as MessagePack arrays
FORCE_MAP        Forces PHP arrays to be packed as MessagePack maps
DETECT_ARR_MAP   Detects the MessagePack array/map type automatically (default)

FORCE_FLOAT32    Forces PHP floats to be packed as 32-bit MessagePack floats
FORCE_FLOAT64    Forces PHP floats to be packed as 64-bit MessagePack floats (default)

The type detection mode (DETECT_STR_BIN/DETECT_ARR_MAP) adds some overhead which can be noticed when you pack large (16- and 32-bit) arrays or strings. However, if you know the value type in advance (for example, you only work with UTF-8 strings or/and associative arrays), you can eliminate this overhead by forcing the packer to use the appropriate type, which will save it from running the auto-detection routine. Another option is to explicitly specify the value type. The library provides 2 auxiliary classes for this, Map and Bin. Check the "Custom types" section below for details.

Examples:

// detect str/bin type and pack PHP 64-bit floats (doubles) to MP 32-bit floats
$packer = new Packer(PackOptions::DETECT_STR_BIN | PackOptions::FORCE_FLOAT32);

// these will throw MessagePack\Exception\InvalidOptionException
$packer = new Packer(PackOptions::FORCE_STR | PackOptions::FORCE_BIN);
$packer = new Packer(PackOptions::FORCE_FLOAT32 | PackOptions::FORCE_FLOAT64);

Unpacking

To unpack data you can either use an instance of a BufferUnpacker:

$unpacker = new BufferUnpacker();

$unpacker->reset($packed);
$value = $unpacker->unpack();

or call a static method on the MessagePack class:

$value = MessagePack::unpack($packed);

If the packed data is received in chunks (e.g. when reading from a stream), use the tryUnpack method, which attempts to unpack data and returns an array of unpacked messages (if any) instead of throwing an InsufficientDataException:

while ($chunk = ...) {
    $unpacker->append($chunk);
    if ($messages = $unpacker->tryUnpack()) {
        return $messages;
    }
}

If you want to unpack from a specific position in a buffer, use seek:

$unpacker->seek(42); // set position equal to 42 bytes
$unpacker->seek(-8); // set position to 8 bytes before the end of the buffer

To skip bytes from the current position, use skip:

$unpacker->skip(10); // set position to 10 bytes ahead of the current position

To get the number of remaining (unread) bytes in the buffer:

$unreadBytesCount = $unpacker->getRemainingCount();

To check whether the buffer has unread data:

$hasUnreadBytes = $unpacker->hasRemaining();

If needed, you can remove already read data from the buffer by calling:

$releasedBytesCount = $unpacker->release();

With the read method you can read raw (packed) data:

$packedData = $unpacker->read(2); // read 2 bytes

Besides the above methods BufferUnpacker provides type-specific unpacking methods, namely:

$unpacker->unpackNil();   // PHP null
$unpacker->unpackBool();  // PHP bool
$unpacker->unpackInt();   // PHP int
$unpacker->unpackFloat(); // PHP float
$unpacker->unpackStr();   // PHP UTF-8 string
$unpacker->unpackBin();   // PHP binary string
$unpacker->unpackArray(); // PHP sequential array
$unpacker->unpackMap();   // PHP associative array
$unpacker->unpackExt();   // PHP MessagePack\Type\Ext object

Unpacking options

The BufferUnpacker object supports a number of bitmask-based options for fine-tuning the unpacking process (the default is marked below):

Name            Description
BIGINT_AS_STR   Converts overflowed integers to strings [1] (default)
BIGINT_AS_GMP   Converts overflowed integers to GMP objects [2]
BIGINT_AS_DEC   Converts overflowed integers to Decimal\Decimal objects [3]

1. The binary MessagePack format has unsigned 64-bit as its largest integer data type, but PHP does not support such integers, which means that an overflow can occur during unpacking.

2. Make sure the GMP extension is enabled.

3. Make sure the Decimal extension is enabled.

Examples:

$packedUint64 = "\xcf"."\xff\xff\xff\xff"."\xff\xff\xff\xff";

$unpacker = new BufferUnpacker($packedUint64);
var_dump($unpacker->unpack()); // string(20) "18446744073709551615"

$unpacker = new BufferUnpacker($packedUint64, UnpackOptions::BIGINT_AS_GMP);
var_dump($unpacker->unpack()); // object(GMP) {...}

$unpacker = new BufferUnpacker($packedUint64, UnpackOptions::BIGINT_AS_DEC);
var_dump($unpacker->unpack()); // object(Decimal\Decimal) {...}

Custom types

In addition to the basic types, the library provides functionality to serialize and deserialize arbitrary types. This can be done in several ways, depending on your use case. Let's take a look at them.

Type objects

If you need to serialize an instance of one of your classes into one of the basic MessagePack types, the best way to do this is to implement the CanBePacked interface in the class. A good example of such a class is the Map type class that comes with the library. This type is useful when you want to explicitly specify that a given PHP array should be packed as a MessagePack map without triggering an automatic type detection routine:

$packer = new Packer();

$packedMap = $packer->pack(new Map([1, 2, 3]));
$packedArray = $packer->pack([1, 2, 3]);

More type examples can be found in the src/Type directory.

Type transformers

As with type objects, type transformers are only responsible for serializing values. They should be used when you need to serialize a value that does not implement the CanBePacked interface. Examples of such values could be instances of built-in or third-party classes that you don't own, or non-objects such as resources.

A transformer class must implement the CanPack interface. To use a transformer, it must first be registered in the packer. Here is an example of how to serialize PHP streams into the MessagePack bin format type using one of the supplied transformers, StreamTransformer:

$packer = new Packer(null, [new StreamTransformer()]);

$packedBin = $packer->pack(fopen('/path/to/file', 'r+'));

More type transformer examples can be found in the src/TypeTransformer directory.

Extensions

In contrast to the cases described above, extensions are intended to handle extension types and are responsible for both serialization and deserialization of values (types).

An extension class must implement the Extension interface. To use an extension, it must first be registered in the packer and the unpacker.

The MessagePack specification divides extension types into two groups: predefined and application-specific. Currently, there is only one predefined type in the specification, Timestamp.

Timestamp

The Timestamp extension type is a predefined type. Support for this type in the library is done through the TimestampExtension class. This class is responsible for handling Timestamp objects, which represent the number of seconds and optional adjustment in nanoseconds:

$timestampExtension = new TimestampExtension();

$packer = new Packer();
$packer = $packer->extendWith($timestampExtension);

$unpacker = new BufferUnpacker();
$unpacker = $unpacker->extendWith($timestampExtension);

$packedTimestamp = $packer->pack(Timestamp::now());
$timestamp = $unpacker->reset($packedTimestamp)->unpack();

$seconds = $timestamp->getSeconds();
$nanoseconds = $timestamp->getNanoseconds();

When using the MessagePack class, the Timestamp extension is already registered:

$packedTimestamp = MessagePack::pack(Timestamp::now());
$timestamp = MessagePack::unpack($packedTimestamp);

Application-specific extensions

In addition, the format can be extended with your own types. For example, to make the built-in PHP DateTime objects first-class citizens in your code, you can create a corresponding extension, as shown in the example. Please note that custom extensions have to be registered with a unique extension ID (an integer from 0 to 127).

More extension examples can be found in the examples/MessagePack directory.

To learn more about how extension types can be useful, check out this article.

Exceptions

If an error occurs during packing/unpacking, a PackingFailedException or an UnpackingFailedException will be thrown, respectively. In addition, an InsufficientDataException can be thrown during unpacking.

An InvalidOptionException will be thrown in case an invalid option (or a combination of mutually exclusive options) is used.

Tests

Run tests as follows:

vendor/bin/phpunit

Also, if you already have Docker installed, you can run the tests in a docker container. First, create a container:

./dockerfile.sh | docker build -t msgpack -

The command above will build an image named msgpack with a PHP 8.1 runtime. You may change the default runtime by defining the PHP_IMAGE environment variable:

PHP_IMAGE='php:8.0-cli' ./dockerfile.sh | docker build -t msgpack -

See a list of various images here.

Then run the unit tests:

docker run --rm -v $PWD:/msgpack -w /msgpack msgpack

Fuzzing

To ensure that the unpacking works correctly with malformed/semi-malformed data, you can use a testing technique called Fuzzing. The library ships with a help file (a fuzz target) for PHP-Fuzzer, which can be used as follows:

php-fuzzer fuzz tests/fuzz_buffer_unpacker.php

Performance

To check performance, run:

php -n -dzend_extension=opcache.so \
-dpcre.jit=1 -dopcache.enable=1 -dopcache.enable_cli=1 \
tests/bench.php

Example output

Filter: MessagePack\Tests\Perf\Filter\ListFilter
Rounds: 3
Iterations: 100000

=============================================
Test/Target            Packer  BufferUnpacker
---------------------------------------------
nil .................. 0.0030 ........ 0.0139
false ................ 0.0037 ........ 0.0144
true ................. 0.0040 ........ 0.0137
7-bit uint #1 ........ 0.0052 ........ 0.0120
7-bit uint #2 ........ 0.0059 ........ 0.0114
7-bit uint #3 ........ 0.0061 ........ 0.0119
5-bit sint #1 ........ 0.0067 ........ 0.0126
5-bit sint #2 ........ 0.0064 ........ 0.0132
5-bit sint #3 ........ 0.0066 ........ 0.0135
8-bit uint #1 ........ 0.0078 ........ 0.0200
8-bit uint #2 ........ 0.0077 ........ 0.0212
8-bit uint #3 ........ 0.0086 ........ 0.0203
16-bit uint #1 ....... 0.0111 ........ 0.0271
16-bit uint #2 ....... 0.0115 ........ 0.0260
16-bit uint #3 ....... 0.0103 ........ 0.0273
32-bit uint #1 ....... 0.0116 ........ 0.0326
32-bit uint #2 ....... 0.0118 ........ 0.0332
32-bit uint #3 ....... 0.0127 ........ 0.0325
64-bit uint #1 ....... 0.0140 ........ 0.0277
64-bit uint #2 ....... 0.0134 ........ 0.0294
64-bit uint #3 ....... 0.0134 ........ 0.0281
8-bit int #1 ......... 0.0086 ........ 0.0241
8-bit int #2 ......... 0.0089 ........ 0.0225
8-bit int #3 ......... 0.0085 ........ 0.0229
16-bit int #1 ........ 0.0118 ........ 0.0280
16-bit int #2 ........ 0.0121 ........ 0.0270
16-bit int #3 ........ 0.0109 ........ 0.0274
32-bit int #1 ........ 0.0128 ........ 0.0346
32-bit int #2 ........ 0.0118 ........ 0.0339
32-bit int #3 ........ 0.0135 ........ 0.0368
64-bit int #1 ........ 0.0138 ........ 0.0276
64-bit int #2 ........ 0.0132 ........ 0.0286
64-bit int #3 ........ 0.0137 ........ 0.0274
64-bit int #4 ........ 0.0180 ........ 0.0285
64-bit float #1 ...... 0.0134 ........ 0.0284
64-bit float #2 ...... 0.0125 ........ 0.0275
64-bit float #3 ...... 0.0126 ........ 0.0283
fix string #1 ........ 0.0035 ........ 0.0133
fix string #2 ........ 0.0094 ........ 0.0216
fix string #3 ........ 0.0094 ........ 0.0222
fix string #4 ........ 0.0091 ........ 0.0241
8-bit string #1 ...... 0.0122 ........ 0.0301
8-bit string #2 ...... 0.0118 ........ 0.0304
8-bit string #3 ...... 0.0119 ........ 0.0315
16-bit string #1 ..... 0.0150 ........ 0.0388
16-bit string #2 ..... 0.1545 ........ 0.1665
32-bit string ........ 0.1570 ........ 0.1756
wide char string #1 .. 0.0091 ........ 0.0236
wide char string #2 .. 0.0122 ........ 0.0313
8-bit binary #1 ...... 0.0100 ........ 0.0302
8-bit binary #2 ...... 0.0123 ........ 0.0324
8-bit binary #3 ...... 0.0126 ........ 0.0327
16-bit binary ........ 0.0168 ........ 0.0372
32-bit binary ........ 0.1588 ........ 0.1754
fix array #1 ......... 0.0042 ........ 0.0131
fix array #2 ......... 0.0294 ........ 0.0367
fix array #3 ......... 0.0412 ........ 0.0472
16-bit array #1 ...... 0.1378 ........ 0.1596
16-bit array #2 ........... S ............. S
32-bit array .............. S ............. S
complex array ........ 0.1865 ........ 0.2283
fix map #1 ........... 0.0725 ........ 0.1048
fix map #2 ........... 0.0319 ........ 0.0405
fix map #3 ........... 0.0356 ........ 0.0665
fix map #4 ........... 0.0465 ........ 0.0497
16-bit map #1 ........ 0.2540 ........ 0.3028
16-bit map #2 ............. S ............. S
32-bit map ................ S ............. S
complex map .......... 0.2372 ........ 0.2710
fixext 1 ............. 0.0283 ........ 0.0358
fixext 2 ............. 0.0291 ........ 0.0371
fixext 4 ............. 0.0302 ........ 0.0355
fixext 8 ............. 0.0288 ........ 0.0384
fixext 16 ............ 0.0293 ........ 0.0359
8-bit ext ............ 0.0302 ........ 0.0439
16-bit ext ........... 0.0334 ........ 0.0499
32-bit ext ........... 0.1845 ........ 0.1888
32-bit timestamp #1 .. 0.0337 ........ 0.0547
32-bit timestamp #2 .. 0.0335 ........ 0.0560
64-bit timestamp #1 .. 0.0371 ........ 0.0575
64-bit timestamp #2 .. 0.0374 ........ 0.0542
64-bit timestamp #3 .. 0.0356 ........ 0.0533
96-bit timestamp #1 .. 0.0362 ........ 0.0699
96-bit timestamp #2 .. 0.0381 ........ 0.0701
96-bit timestamp #3 .. 0.0367 ........ 0.0687
=============================================
Total                  2.7618          4.0820
Skipped                     4               4
Failed                      0               0
Ignored                     0               0

With JIT:

php -n -dzend_extension=opcache.so \
-dpcre.jit=1 -dopcache.jit_buffer_size=64M -dopcache.jit=tracing -dopcache.enable=1 -dopcache.enable_cli=1 \
tests/bench.php

Example output

Filter: MessagePack\Tests\Perf\Filter\ListFilter
Rounds: 3
Iterations: 100000

=============================================
Test/Target            Packer  BufferUnpacker
---------------------------------------------
nil .................. 0.0005 ........ 0.0054
false ................ 0.0004 ........ 0.0059
true ................. 0.0004 ........ 0.0059
7-bit uint #1 ........ 0.0010 ........ 0.0047
7-bit uint #2 ........ 0.0010 ........ 0.0046
7-bit uint #3 ........ 0.0010 ........ 0.0046
5-bit sint #1 ........ 0.0025 ........ 0.0046
5-bit sint #2 ........ 0.0023 ........ 0.0046
5-bit sint #3 ........ 0.0024 ........ 0.0045
8-bit uint #1 ........ 0.0043 ........ 0.0081
8-bit uint #2 ........ 0.0043 ........ 0.0079
8-bit uint #3 ........ 0.0041 ........ 0.0080
16-bit uint #1 ....... 0.0064 ........ 0.0095
16-bit uint #2 ....... 0.0064 ........ 0.0091
16-bit uint #3 ....... 0.0064 ........ 0.0094
32-bit uint #1 ....... 0.0085 ........ 0.0114
32-bit uint #2 ....... 0.0077 ........ 0.0122
32-bit uint #3 ....... 0.0077 ........ 0.0120
64-bit uint #1 ....... 0.0085 ........ 0.0159
64-bit uint #2 ....... 0.0086 ........ 0.0157
64-bit uint #3 ....... 0.0086 ........ 0.0158
8-bit int #1 ......... 0.0042 ........ 0.0080
8-bit int #2 ......... 0.0042 ........ 0.0080
8-bit int #3 ......... 0.0042 ........ 0.0081
16-bit int #1 ........ 0.0065 ........ 0.0095
16-bit int #2 ........ 0.0065 ........ 0.0090
16-bit int #3 ........ 0.0056 ........ 0.0085
32-bit int #1 ........ 0.0067 ........ 0.0107
32-bit int #2 ........ 0.0066 ........ 0.0106
32-bit int #3 ........ 0.0063 ........ 0.0104
64-bit int #1 ........ 0.0072 ........ 0.0162
64-bit int #2 ........ 0.0073 ........ 0.0174
64-bit int #3 ........ 0.0072 ........ 0.0164
64-bit int #4 ........ 0.0077 ........ 0.0161
64-bit float #1 ...... 0.0053 ........ 0.0135
64-bit float #2 ...... 0.0053 ........ 0.0135
64-bit float #3 ...... 0.0052 ........ 0.0135
fix string #1 ....... -0.0002 ........ 0.0044
fix string #2 ........ 0.0035 ........ 0.0067
fix string #3 ........ 0.0035 ........ 0.0077
fix string #4 ........ 0.0033 ........ 0.0078
8-bit string #1 ...... 0.0059 ........ 0.0110
8-bit string #2 ...... 0.0063 ........ 0.0121
8-bit string #3 ...... 0.0064 ........ 0.0124
16-bit string #1 ..... 0.0099 ........ 0.0146
16-bit string #2 ..... 0.1522 ........ 0.1474
32-bit string ........ 0.1511 ........ 0.1483
wide char string #1 .. 0.0039 ........ 0.0084
wide char string #2 .. 0.0073 ........ 0.0123
8-bit binary #1 ...... 0.0040 ........ 0.0112
8-bit binary #2 ...... 0.0075 ........ 0.0123
8-bit binary #3 ...... 0.0077 ........ 0.0129
16-bit binary ........ 0.0096 ........ 0.0145
32-bit binary ........ 0.1535 ........ 0.1479
fix array #1 ......... 0.0008 ........ 0.0061
fix array #2 ......... 0.0121 ........ 0.0165
fix array #3 ......... 0.0193 ........ 0.0222
16-bit array #1 ...... 0.0607 ........ 0.0479
16-bit array #2 ........... S ............. S
32-bit array .............. S ............. S
complex array ........ 0.0749 ........ 0.0824
fix map #1 ........... 0.0329 ........ 0.0431
fix map #2 ........... 0.0161 ........ 0.0189
fix map #3 ........... 0.0205 ........ 0.0262
fix map #4 ........... 0.0252 ........ 0.0205
16-bit map #1 ........ 0.1016 ........ 0.0927
16-bit map #2 ............. S ............. S
32-bit map ................ S ............. S
complex map .......... 0.1096 ........ 0.1030
fixext 1 ............. 0.0157 ........ 0.0161
fixext 2 ............. 0.0175 ........ 0.0183
fixext 4 ............. 0.0156 ........ 0.0185
fixext 8 ............. 0.0163 ........ 0.0184
fixext 16 ............ 0.0164 ........ 0.0182
8-bit ext ............ 0.0158 ........ 0.0207
16-bit ext ........... 0.0203 ........ 0.0219
32-bit ext ........... 0.1614 ........ 0.1539
32-bit timestamp #1 .. 0.0195 ........ 0.0249
32-bit timestamp #2 .. 0.0188 ........ 0.0260
64-bit timestamp #1 .. 0.0207 ........ 0.0281
64-bit timestamp #2 .. 0.0212 ........ 0.0291
64-bit timestamp #3 .. 0.0207 ........ 0.0295
96-bit timestamp #1 .. 0.0222 ........ 0.0358
96-bit timestamp #2 .. 0.0228 ........ 0.0353
96-bit timestamp #3 .. 0.0210 ........ 0.0319
=============================================
Total                  1.6432          1.9674
Skipped                     4               4
Failed                      0               0
Ignored                     0               0

You may change default benchmark settings by defining the following environment variables:

Name                  Default
MP_BENCH_TARGETS      pure_p,pure_u (see a list of available targets)
MP_BENCH_ITERATIONS   100_000
MP_BENCH_DURATION     not set
MP_BENCH_ROUNDS       3
MP_BENCH_TESTS        -@slow (see a list of available tests)

For example:

export MP_BENCH_TARGETS=pure_p
export MP_BENCH_ITERATIONS=1000000
export MP_BENCH_ROUNDS=5
# a comma separated list of test names
export MP_BENCH_TESTS='complex array, complex map'
# or a group name
# export MP_BENCH_TESTS='-@slow' // @pecl_comp
# or a regexp
# export MP_BENCH_TESTS='/complex (array|map)/'

Another example, benchmarking both the library and the PECL extension:

MP_BENCH_TARGETS=pure_p,pure_u,pecl_p,pecl_u \
php -n -dextension=msgpack.so -dzend_extension=opcache.so \
-dpcre.jit=1 -dopcache.enable=1 -dopcache.enable_cli=1 \
tests/bench.php

Example output

Filter: MessagePack\Tests\Perf\Filter\ListFilter
Rounds: 3
Iterations: 100000

===========================================================================
Test/Target            Packer  BufferUnpacker  msgpack_pack  msgpack_unpack
---------------------------------------------------------------------------
nil .................. 0.0031 ........ 0.0141 ...... 0.0055 ........ 0.0064
false ................ 0.0039 ........ 0.0154 ...... 0.0056 ........ 0.0053
true ................. 0.0038 ........ 0.0139 ...... 0.0056 ........ 0.0044
7-bit uint #1 ........ 0.0061 ........ 0.0110 ...... 0.0059 ........ 0.0046
7-bit uint #2 ........ 0.0065 ........ 0.0119 ...... 0.0042 ........ 0.0029
7-bit uint #3 ........ 0.0054 ........ 0.0117 ...... 0.0045 ........ 0.0025
5-bit sint #1 ........ 0.0047 ........ 0.0103 ...... 0.0038 ........ 0.0022
5-bit sint #2 ........ 0.0048 ........ 0.0117 ...... 0.0038 ........ 0.0022
5-bit sint #3 ........ 0.0046 ........ 0.0102 ...... 0.0038 ........ 0.0023
8-bit uint #1 ........ 0.0063 ........ 0.0174 ...... 0.0039 ........ 0.0031
8-bit uint #2 ........ 0.0063 ........ 0.0167 ...... 0.0040 ........ 0.0029
8-bit uint #3 ........ 0.0063 ........ 0.0168 ...... 0.0039 ........ 0.0030
16-bit uint #1 ....... 0.0092 ........ 0.0222 ...... 0.0049 ........ 0.0030
16-bit uint #2 ....... 0.0096 ........ 0.0227 ...... 0.0042 ........ 0.0046
16-bit uint #3 ....... 0.0123 ........ 0.0274 ...... 0.0059 ........ 0.0051
32-bit uint #1 ....... 0.0136 ........ 0.0331 ...... 0.0060 ........ 0.0048
32-bit uint #2 ....... 0.0130 ........ 0.0336 ...... 0.0070 ........ 0.0048
32-bit uint #3 ....... 0.0127 ........ 0.0329 ...... 0.0051 ........ 0.0048
64-bit uint #1 ....... 0.0126 ........ 0.0268 ...... 0.0055 ........ 0.0049
64-bit uint #2 ....... 0.0135 ........ 0.0281 ...... 0.0052 ........ 0.0046
64-bit uint #3 ....... 0.0131 ........ 0.0274 ...... 0.0069 ........ 0.0044
8-bit int #1 ......... 0.0077 ........ 0.0236 ...... 0.0058 ........ 0.0044
8-bit int #2 ......... 0.0087 ........ 0.0244 ...... 0.0058 ........ 0.0048
8-bit int #3 ......... 0.0084 ........ 0.0241 ...... 0.0055 ........ 0.0049
16-bit int #1 ........ 0.0112 ........ 0.0271 ...... 0.0048 ........ 0.0045
16-bit int #2 ........ 0.0124 ........ 0.0292 ...... 0.0057 ........ 0.0049
16-bit int #3 ........ 0.0118 ........ 0.0270 ...... 0.0058 ........ 0.0050
32-bit int #1 ........ 0.0137 ........ 0.0366 ...... 0.0058 ........ 0.0051
32-bit int #2 ........ 0.0133 ........ 0.0366 ...... 0.0056 ........ 0.0049
32-bit int #3 ........ 0.0129 ........ 0.0350 ...... 0.0052 ........ 0.0048
64-bit int #1 ........ 0.0145 ........ 0.0254 ...... 0.0034 ........ 0.0025
64-bit int #2 ........ 0.0097 ........ 0.0214 ...... 0.0034 ........ 0.0025
64-bit int #3 ........ 0.0096 ........ 0.0287 ...... 0.0059 ........ 0.0050
64-bit int #4 ........ 0.0143 ........ 0.0277 ...... 0.0059 ........ 0.0046
64-bit float #1 ...... 0.0134 ........ 0.0281 ...... 0.0057 ........ 0.0052
64-bit float #2 ...... 0.0141 ........ 0.0281 ...... 0.0057 ........ 0.0050
64-bit float #3 ...... 0.0144 ........ 0.0282 ...... 0.0057 ........ 0.0050
fix string #1 ........ 0.0036 ........ 0.0143 ...... 0.0066 ........ 0.0053
fix string #2 ........ 0.0107 ........ 0.0222 ...... 0.0065 ........ 0.0068
fix string #3 ........ 0.0116 ........ 0.0245 ...... 0.0063 ........ 0.0069
fix string #4 ........ 0.0105 ........ 0.0253 ...... 0.0083 ........ 0.0077
8-bit string #1 ...... 0.0126 ........ 0.0318 ...... 0.0075 ........ 0.0088
8-bit string #2 ...... 0.0121 ........ 0.0295 ...... 0.0076 ........ 0.0086
8-bit string #3 ...... 0.0125 ........ 0.0293 ...... 0.0130 ........ 0.0093
16-bit string #1 ..... 0.0159 ........ 0.0368 ...... 0.0117 ........ 0.0086
16-bit string #2 ..... 0.1547 ........ 0.1686 ...... 0.1516 ........ 0.1373
32-bit string ........ 0.1558 ........ 0.1729 ...... 0.1511 ........ 0.1396
wide char string #1 .. 0.0098 ........ 0.0237 ...... 0.0066 ........ 0.0065
wide char string #2 .. 0.0128 ........ 0.0291 ...... 0.0061 ........ 0.0082
8-bit binary #1 ........... I ............. I ........... F ............. I
8-bit binary #2 ........... I ............. I ........... F ............. I
8-bit binary #3 ........... I ............. I ........... F ............. I
16-bit binary ............. I ............. I ........... F ............. I
32-bit binary ............. I ............. I ........... F ............. I
fix array #1 ......... 0.0040 ........ 0.0129 ...... 0.0120 ........ 0.0058
fix array #2 ......... 0.0279 ........ 0.0390 ...... 0.0143 ........ 0.0165
fix array #3 ......... 0.0415 ........ 0.0463 ...... 0.0162 ........ 0.0187
16-bit array #1 ...... 0.1349 ........ 0.1628 ...... 0.0334 ........ 0.0341
16-bit array #2 ........... S ............. S ........... S ............. S
32-bit array .............. S ............. S ........... S ............. S
complex array ............. I ............. I ........... F ............. F
fix map #1 ................ I ............. I ........... F ............. I
fix map #2 ........... 0.0345 ........ 0.0391 ...... 0.0143 ........ 0.0168
fix map #3 ................ I ............. I ........... F ............. I
fix map #4 ........... 0.0459 ........ 0.0473 ...... 0.0151 ........ 0.0163
16-bit map #1 ........ 0.2518 ........ 0.2962 ...... 0.0400 ........ 0.0490
16-bit map #2 ............. S ............. S ........... S ............. S
32-bit map ................ S ............. S ........... S ............. S
complex map .......... 0.2380 ........ 0.2682 ...... 0.0545 ........ 0.0579
fixext 1 .................. I ............. I ........... F ............. F
fixext 2 .................. I ............. I ........... F ............. F
fixext 4 .................. I ............. I ........... F ............. F
fixext 8 .................. I ............. I ........... F ............. F
fixext 16 ................. I ............. I ........... F ............. F
8-bit ext ................. I ............. I ........... F ............. F
16-bit ext ................ I ............. I ........... F ............. F
32-bit ext ................ I ............. I ........... F ............. F
32-bit timestamp #1 ....... I ............. I ........... F ............. F
32-bit timestamp #2 ....... I ............. I ........... F ............. F
64-bit timestamp #1 ....... I ............. I ........... F ............. F
64-bit timestamp #2 ....... I ............. I ........... F ............. F
64-bit timestamp #3 ....... I ............. I ........... F ............. F
96-bit timestamp #1 ....... I ............. I ........... F ............. F
96-bit timestamp #2 ....... I ............. I ........... F ............. F
96-bit timestamp #3 ....... I ............. I ........... F ............. F
===========================================================================
Total                  1.5625          2.3866        0.7735          0.7243
Skipped                     4               4             4               4
Failed                      0               0            24              17
Ignored                    24              24             0               7

With JIT:

MP_BENCH_TARGETS=pure_p,pure_u,pecl_p,pecl_u \
php -n -dextension=msgpack.so -dzend_extension=opcache.so \
-dpcre.jit=1 -dopcache.jit_buffer_size=64M -dopcache.jit=tracing -dopcache.enable=1 -dopcache.enable_cli=1 \
tests/bench.php

Example output

Filter: MessagePack\Tests\Perf\Filter\ListFilter
Rounds: 3
Iterations: 100000

===========================================================================
Test/Target            Packer  BufferUnpacker  msgpack_pack  msgpack_unpack
---------------------------------------------------------------------------
nil .................. 0.0001 ........ 0.0052 ...... 0.0053 ........ 0.0042
false ................ 0.0007 ........ 0.0060 ...... 0.0057 ........ 0.0043
true ................. 0.0008 ........ 0.0060 ...... 0.0056 ........ 0.0041
7-bit uint #1 ........ 0.0031 ........ 0.0046 ...... 0.0062 ........ 0.0041
7-bit uint #2 ........ 0.0021 ........ 0.0043 ...... 0.0062 ........ 0.0041
7-bit uint #3 ........ 0.0022 ........ 0.0044 ...... 0.0061 ........ 0.0040
5-bit sint #1 ........ 0.0030 ........ 0.0048 ...... 0.0062 ........ 0.0040
5-bit sint #2 ........ 0.0032 ........ 0.0046 ...... 0.0062 ........ 0.0040
5-bit sint #3 ........ 0.0031 ........ 0.0046 ...... 0.0062 ........ 0.0040
8-bit uint #1 ........ 0.0054 ........ 0.0079 ...... 0.0062 ........ 0.0050
8-bit uint #2 ........ 0.0051 ........ 0.0079 ...... 0.0064 ........ 0.0044
8-bit uint #3 ........ 0.0051 ........ 0.0082 ...... 0.0062 ........ 0.0044
16-bit uint #1 ....... 0.0077 ........ 0.0094 ...... 0.0065 ........ 0.0045
16-bit uint #2 ....... 0.0077 ........ 0.0094 ...... 0.0063 ........ 0.0045
16-bit uint #3 ....... 0.0077 ........ 0.0095 ...... 0.0064 ........ 0.0047
32-bit uint #1 ....... 0.0088 ........ 0.0119 ...... 0.0063 ........ 0.0043
32-bit uint #2 ....... 0.0089 ........ 0.0117 ...... 0.0062 ........ 0.0039
32-bit uint #3 ....... 0.0089 ........ 0.0118 ...... 0.0063 ........ 0.0044
64-bit uint #1 ....... 0.0097 ........ 0.0155 ...... 0.0063 ........ 0.0045
64-bit uint #2 ....... 0.0095 ........ 0.0153 ...... 0.0061 ........ 0.0045
64-bit uint #3 ....... 0.0096 ........ 0.0156 ...... 0.0063 ........ 0.0047
8-bit int #1 ......... 0.0053 ........ 0.0083 ...... 0.0062 ........ 0.0044
8-bit int #2 ......... 0.0052 ........ 0.0080 ...... 0.0062 ........ 0.0044
8-bit int #3 ......... 0.0052 ........ 0.0080 ...... 0.0062 ........ 0.0043
16-bit int #1 ........ 0.0089 ........ 0.0097 ...... 0.0069 ........ 0.0046
16-bit int #2 ........ 0.0075 ........ 0.0093 ...... 0.0063 ........ 0.0043
16-bit int #3 ........ 0.0075 ........ 0.0094 ...... 0.0062 ........ 0.0046
32-bit int #1 ........ 0.0086 ........ 0.0122 ...... 0.0063 ........ 0.0044
32-bit int #2 ........ 0.0087 ........ 0.0120 ...... 0.0066 ........ 0.0046
32-bit int #3 ........ 0.0086 ........ 0.0121 ...... 0.0060 ........ 0.0044
64-bit int #1 ........ 0.0096 ........ 0.0149 ...... 0.0060 ........ 0.0045
64-bit int #2 ........ 0.0096 ........ 0.0157 ...... 0.0062 ........ 0.0044
64-bit int #3 ........ 0.0096 ........ 0.0160 ...... 0.0063 ........ 0.0046
64-bit int #4 ........ 0.0097 ........ 0.0157 ...... 0.0061 ........ 0.0044
64-bit float #1 ...... 0.0079 ........ 0.0153 ...... 0.0056 ........ 0.0044
64-bit float #2 ...... 0.0079 ........ 0.0152 ...... 0.0057 ........ 0.0045
64-bit float #3 ...... 0.0079 ........ 0.0155 ...... 0.0057 ........ 0.0044
fix string #1 ........ 0.0010 ........ 0.0045 ...... 0.0071 ........ 0.0044
fix string #2 ........ 0.0048 ........ 0.0075 ...... 0.0070 ........ 0.0060
fix string #3 ........ 0.0048 ........ 0.0086 ...... 0.0068 ........ 0.0060
fix string #4 ........ 0.0050 ........ 0.0088 ...... 0.0070 ........ 0.0059
8-bit string #1 ...... 0.0081 ........ 0.0129 ...... 0.0069 ........ 0.0062
8-bit string #2 ...... 0.0086 ........ 0.0128 ...... 0.0069 ........ 0.0065
8-bit string #3 ...... 0.0086 ........ 0.0126 ...... 0.0115 ........ 0.0065
16-bit string #1 ..... 0.0105 ........ 0.0137 ...... 0.0128 ........ 0.0068
16-bit string #2 ..... 0.1510 ........ 0.1486 ...... 0.1526 ........ 0.1391
32-bit string ........ 0.1517 ........ 0.1475 ...... 0.1504 ........ 0.1370
wide char string #1 .. 0.0044 ........ 0.0085 ...... 0.0067 ........ 0.0057
wide char string #2 .. 0.0081 ........ 0.0125 ...... 0.0069 ........ 0.0063
8-bit binary #1 ........... I ............. I ........... F ............. I
8-bit binary #2 ........... I ............. I ........... F ............. I
8-bit binary #3 ........... I ............. I ........... F ............. I
16-bit binary ............. I ............. I ........... F ............. I
32-bit binary ............. I ............. I ........... F ............. I
fix array #1 ......... 0.0014 ........ 0.0059 ...... 0.0132 ........ 0.0055
fix array #2 ......... 0.0146 ........ 0.0156 ...... 0.0155 ........ 0.0148
fix array #3 ......... 0.0211 ........ 0.0229 ...... 0.0179 ........ 0.0180
16-bit array #1 ...... 0.0673 ........ 0.0498 ...... 0.0343 ........ 0.0388
16-bit array #2 ........... S ............. S ........... S ............. S
32-bit array .............. S ............. S ........... S ............. S
complex array ............. I ............. I ........... F ............. F
fix map #1 ................ I ............. I ........... F ............. I
fix map #2 ........... 0.0148 ........ 0.0180 ...... 0.0156 ........ 0.0179
fix map #3 ................ I ............. I ........... F ............. I
fix map #4 ........... 0.0252 ........ 0.0201 ...... 0.0214 ........ 0.0167
16-bit map #1 ........ 0.1027 ........ 0.0836 ...... 0.0388 ........ 0.0510
16-bit map #2 ............. S ............. S ........... S ............. S
32-bit map ................ S ............. S ........... S ............. S
complex map .......... 0.1104 ........ 0.1010 ...... 0.0556 ........ 0.0602
fixext 1 .................. I ............. I ........... F ............. F
fixext 2 .................. I ............. I ........... F ............. F
fixext 4 .................. I ............. I ........... F ............. F
fixext 8 .................. I ............. I ........... F ............. F
fixext 16 ................. I ............. I ........... F ............. F
8-bit ext ................. I ............. I ........... F ............. F
16-bit ext ................ I ............. I ........... F ............. F
32-bit ext ................ I ............. I ........... F ............. F
32-bit timestamp #1 ....... I ............. I ........... F ............. F
32-bit timestamp #2 ....... I ............. I ........... F ............. F
64-bit timestamp #1 ....... I ............. I ........... F ............. F
64-bit timestamp #2 ....... I ............. I ........... F ............. F
64-bit timestamp #3 ....... I ............. I ........... F ............. F
96-bit timestamp #1 ....... I ............. I ........... F ............. F
96-bit timestamp #2 ....... I ............. I ........... F ............. F
96-bit timestamp #3 ....... I ............. I ........... F ............. F
===========================================================================
Total                  0.9642          1.0909        0.8224          0.7213
Skipped                     4               4             4               4
Failed                      0               0            24              17
Ignored                    24              24             0               7

Note that the msgpack extension (v2.1.2) doesn't support ext, bin and UTF-8 str types.

License

The library is released under the MIT License. See the bundled LICENSE file for details.

Author: rybakit
Source Code: https://github.com/rybakit/msgpack.php
License: MIT License

#php 

Chatgpt-api: Node.js client for the official ChatGPT API

ChatGPT API

Node.js client for the official ChatGPT API.

Intro

This package is a Node.js wrapper around ChatGPT by OpenAI. TS batteries included. ✨

Example usage

Updates

March 1, 2023

The official OpenAI chat completions API has been released, and it is now the default for this package! 🔥

Method                      Free?    Robust?    Quality?
ChatGPTAPI                  ❌ No    ✅ Yes     ✅ Real ChatGPT models
ChatGPTUnofficialProxyAPI   ✅ Yes   ☑️ Maybe   ✅ Real ChatGPT

Note: We strongly recommend using ChatGPTAPI since it uses the officially supported API from OpenAI. We may remove support for ChatGPTUnofficialProxyAPI in a future release.

  1. ChatGPTAPI - Uses the gpt-3.5-turbo-0301 model with the official OpenAI chat completions API (official, robust approach, but it's not free)
  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)

CLI

To run the CLI, you'll need an OpenAI API key:

export OPENAI_API_KEY="sk-TODO"
npx chatgpt "your prompt here"

By default, the response is streamed to stdout, the results are stored in a local config file, and every invocation starts a new conversation. You can use -c to continue the previous conversation and --no-stream to disable streaming.

Under the hood, the CLI uses ChatGPTAPI with text-davinci-003 to mimic ChatGPT.

Usage:
  $ chatgpt <prompt>

Commands:
  <prompt>  Ask ChatGPT a question
  rm-cache  Clears the local message cache
  ls-cache  Prints the local message cache path

For more info, run any command with the `--help` flag:
  $ chatgpt --help
  $ chatgpt rm-cache --help
  $ chatgpt ls-cache --help

Options:
  -c, --continue          Continue last conversation (default: false)
  -d, --debug             Enables debug logging (default: false)
  -s, --stream            Streams the response (default: true)
  -s, --store             Enables the local message cache (default: true)
  -t, --timeout           Timeout in milliseconds
  -k, --apiKey            OpenAI API key
  -n, --conversationName  Unique name for the conversation
  -h, --help              Display this message
  -v, --version           Display version number

Install

npm install chatgpt

Make sure you're using node >= 18 so fetch is available (or node >= 14 if you install a fetch polyfill).

Usage

To use this module from Node.js, you need to pick between two methods:

Method                      Free?    Robust?    Quality?
ChatGPTAPI                  ❌ No    ✅ Yes     ✅ Real ChatGPT models
ChatGPTUnofficialProxyAPI   ✅ Yes   ☑️ Maybe   ✅ Real ChatGPT

ChatGPTAPI - Uses the gpt-3.5-turbo-0301 model with the official OpenAI chat completions API (official, robust approach, but it's not free). You can override the model, completion params, and system message to fully customize your assistant.

ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)

Both approaches have very similar APIs, so it should be simple to swap between them.

Note: We strongly recommend using ChatGPTAPI since it uses the officially supported API from OpenAI. We may remove support for ChatGPTUnofficialProxyAPI in a future release.

Usage - ChatGPTAPI

Sign up for an OpenAI API key and store it in your environment.

import { ChatGPTAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTAPI({
    apiKey: process.env.OPENAI_API_KEY
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

You can override the default model (gpt-3.5-turbo-0301) and any OpenAI chat completion params using completionParams:

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  completionParams: {
    temperature: 0.5,
    top_p: 0.8
  }
})

If you want to track the conversation, you'll need to pass the parentMessageId like this:

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

// send a message and wait for the response
let res = await api.sendMessage('What is OpenAI?')
console.log(res.text)

// send a follow-up
res = await api.sendMessage('Can you expand on that?', {
  parentMessageId: res.id
})
console.log(res.text)

// send another follow-up
res = await api.sendMessage('What were we talking about?', {
  parentMessageId: res.id
})
console.log(res.text)

You can add streaming via the onProgress handler:

const res = await api.sendMessage('Write a 500 word essay on frogs.', {
  // print the partial response as the AI is "typing"
  onProgress: (partialResponse) => console.log(partialResponse.text)
})

// print the full text at the end
console.log(res.text)

You can add a timeout using the timeoutMs option:

// timeout after 2 minutes (which will also abort the underlying HTTP request)
const response = await api.sendMessage(
  'write me a really really long essay on frogs',
  {
    timeoutMs: 2 * 60 * 1000
  }
)

If you want to see more info about what's actually being sent to OpenAI's chat completions API, set the debug: true option in the ChatGPTAPI constructor:

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  debug: true
})

We default to a basic systemMessage. You can override this in either the ChatGPTAPI constructor or sendMessage:

const res = await api.sendMessage('what is the answer to the universe?', {
  systemMessage: `You are ChatGPT, a large language model trained by OpenAI. You answer as concisely as possible for each response. If you are generating a list, do not have too many items.
Current date: ${new Date().toISOString()}\n\n`
})

Note that we automatically handle appending the previous messages to the prompt and attempt to optimize for the available tokens (which defaults to 4096).

Usage in CommonJS (Dynamic import)

async function example() {
  // To use ESM in CommonJS, you can use a dynamic import
  const { ChatGPTAPI } = await import('chatgpt')

  const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

Usage - ChatGPTUnofficialProxyAPI

The API for ChatGPTUnofficialProxyAPI is almost exactly the same. You just need to provide a ChatGPT accessToken instead of an OpenAI API key.

import { ChatGPTUnofficialProxyAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTUnofficialProxyAPI({
    accessToken: process.env.OPENAI_ACCESS_TOKEN
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

See demos/demo-reverse-proxy for a full example:

npx tsx demos/demo-reverse-proxy.ts

ChatGPTUnofficialProxyAPI messages also contain a conversationId in addition to parentMessageId, since the ChatGPT webapp can't reference messages across different conversations.

Reverse Proxy

You can override the reverse proxy by passing apiReverseProxyUrl:

const api = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.OPENAI_ACCESS_TOKEN,
  apiReverseProxyUrl: 'https://your-example-server.com/api/conversation'
})

Known reverse proxies run by community members include:

Reverse Proxy URL                                Author        Rate Limits         Last Checked
https://chat.duti.tech/api/conversation          @acheong08    120 req/min by IP   2/19/2023
https://gpt.pawan.krd/backend-api/conversation   @PawanOsman   ?                   2/19/2023

Note: info on how the reverse proxies work is not being published at this time in order to prevent OpenAI from disabling access.

Access Token

To use ChatGPTUnofficialProxyAPI, you'll need an OpenAI access token from the ChatGPT webapp. To do this, you can use any of the following methods which take an email and password and return an access token:

These libraries work with email + password accounts (i.e., they do not support accounts where you auth via Microsoft / Google).

Alternatively, you can manually get an accessToken by logging in to the ChatGPT webapp and then opening https://chat.openai.com/api/auth/session, which will return a JSON object containing your accessToken string.

Access tokens last for days.

Note: using a reverse proxy will expose your access token to a third-party. There shouldn't be any adverse effects possible from this, but please consider the risks before using this method.

Docs

See the auto-generated docs for more info on methods and parameters.

Demos

Most of the demos use ChatGPTAPI. It should be pretty easy to convert them to use ChatGPTUnofficialProxyAPI if you'd rather use that approach. The only thing that needs to change is how you initialize the api with an accessToken instead of an apiKey.

To run the included demos:

  1. clone repo
  2. install node deps
  3. set OPENAI_API_KEY in .env

A basic demo is included for testing purposes:

npx tsx demos/demo.ts

A demo showing on progress handler:

npx tsx demos/demo-on-progress.ts

The on progress demo uses the optional onProgress parameter to sendMessage to receive intermediary results as ChatGPT is "typing".

A conversation demo:

npx tsx demos/demo-conversation.ts

A persistence demo shows how to store messages in Redis for persistence:

npx tsx demos/demo-persistence.ts

Any keyv adaptor is supported for persistence, and there are overrides if you'd like to use a different way of storing / retrieving messages.

Note that persisting messages is required to remember the context of previous conversations beyond the scope of the current Node.js process, since by default, we only store messages in memory. Here's an external demo of using a completely custom database solution to persist messages.

Note: Persistence is handled automatically when using ChatGPTUnofficialProxyAPI because it is connecting indirectly to ChatGPT.

Projects

All of these awesome projects are built using the chatgpt package. 🤯

If you create a cool integration, feel free to open a PR and add it to the list.

Compatibility

  • This package is ESM-only.
  • This package supports node >= 14.
  • This module assumes that fetch is installed.
    • In node >= 18, it's installed by default.
    • In node < 18, you need to install a polyfill like unfetch/polyfill (guide) or isomorphic-fetch (guide); see the sketch after this list.
  • If you want to build a website using chatgpt, we recommend using it only from your backend API.
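
For node < 18, a minimal polyfill setup might look like this (a sketch using isomorphic-fetch; the linked guides cover the details):

// Node < 18 only: register a global fetch polyfill before importing chatgpt.
import 'isomorphic-fetch'

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })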

Credits


Previous Updates

Feb 19, 2023
 

We now provide three ways of accessing the unofficial ChatGPT API, all of which have tradeoffs:

| Method | Free? | Robust? | Quality? |
| --- | --- | --- | --- |
| ChatGPTAPI | ❌ No | ✅ Yes | ☑️ Mimics ChatGPT |
| ChatGPTUnofficialProxyAPI | ✅ Yes | ☑️ Maybe | ✅ Real ChatGPT |
| ChatGPTAPIBrowser (v3) | ✅ Yes | ❌ No | ✅ Real ChatGPT |

Note: I recommend that you use either ChatGPTAPI or ChatGPTUnofficialProxyAPI.

  1. ChatGPTAPI - Uses text-davinci-003 to mimic ChatGPT via the official OpenAI completions API (most robust approach, but it's not free and doesn't use a model fine-tuned for chat)
  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
  3. ChatGPTAPIBrowser - (deprecated; v3.5.1 of this package) Uses Puppeteer to access the official ChatGPT webapp (uses the real ChatGPT, but very flaky, heavyweight, and error prone)

Feb 5, 2023
 

OpenAI has disabled the leaked chat model we were previously using, so we're now defaulting to text-davinci-003, which is not free.

We've found several other hidden, fine-tuned chat models, but OpenAI keeps disabling them, so we're searching for alternative workarounds.

Feb 1, 2023
 

This package no longer requires any browser hacks – it is now using the official OpenAI completions API with a leaked model that ChatGPT uses under the hood. 🔥

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY
})

const res = await api.sendMessage('Hello World!')
console.log(res.text)

Please upgrade to chatgpt@latest (at least v4.0.0). The updated version is significantly more lightweight and robust compared with previous versions. You also don't have to worry about IP issues or rate limiting.

Huge shoutout to @waylaidwanderer for discovering the leaked chat model!

If you run into any issues, we do have a pretty active Discord with a bunch of ChatGPT hackers from the Node.js & Python communities.

Lastly, please consider starring this repo and following me on Twitter to help support the project.

Thanks && cheers, Travis


Download Details:

Author: Transitive-bullshit
Source Code: https://github.com/transitive-bullshit/chatgpt-api 
License: MIT license

#chatgpt #api #node #AI #openai #chatbot 

ERIC  MACUS

ERIC MACUS

1647540000

Substrate Knowledge Map For Hackathon Participants

Substrate Knowledge Map for Hackathon Participants

The Substrate Knowledge Map provides information that you—as a Substrate hackathon participant—need to know to develop a non-trivial application for your hackathon submission.

The map covers 6 main sections:

  1. Introduction
  2. Basics
  3. Preliminaries
  4. Runtime Development
  5. Polkadot JS API
  6. Smart Contracts

Each section contains basic information on each topic, with links to additional documentation for you to dig deeper. Within each section, you'll find a mix of quizzes and labs to test your knowledge as you progress through the map. The goal of the labs and quizzes is to help you consolidate what you've learned and put it into practice with some hands-on activities.

Introduction

One question we often get is why learn the Substrate framework when we can write smart contracts to build decentralized applications?

The short answer is that using the Substrate framework and writing smart contracts are two different approaches.

Smart contract development

Traditional smart contract platforms allow users to publish additional logic on top of some core blockchain logic. Since smart contract logic can be published by anyone, including malicious actors and inexperienced developers, there are a number of intentional safeguards and restrictions built around these public smart contract platforms. For example:

Fees: Smart contract developers must ensure that contract users are charged for the computation and storage they impose on the computers running their contract. With fees, block creators are protected from abuse of the network.

Sandboxed: A contract is not able to directly modify core blockchain storage or the storage items of other contracts. Its power is limited to modifying its own state and making outside calls to other contracts or runtime functions.

Reversion: Contracts can be prone to undesirable situations that lead to logical errors when reverting or upgrading them. Developers need to learn additional patterns, such as splitting their contract's logic and data, to ensure seamless upgrades.

These safeguards and restrictions make running smart contracts slower and more costly. However, it's important to consider the different developer audiences for contract development versus Substrate runtime development.

Building decentralized applications with smart contracts allows your community to extend and develop on top of your runtime logic without worrying about proposals, runtime upgrades, and so on. You can also use smart contracts as a testing ground for future runtime changes, but done in an isolated way that protects your network from any errors the changes might introduce.

In summary, smart contract development:

  • Is inherently safer to the network.
  • Provides economic incentives and transaction fee mechanisms that can't be directly controlled by the smart contract author.
  • Has computational overhead to support graceful failures in logic.
  • Has a low barrier to entry for developers and enables a faster pace of community interaction.

Substrate runtime development

Unlike traditional smart contract development, Substrate runtime development offers none of the network protections or safeguards. Instead, as a runtime developer, you have total control over how the blockchain behaves. However, this level of control also means that there is a higher barrier to entry.

Substrate is a framework for building blockchains, which almost makes comparing it to smart contract development like comparing apples and oranges. With the Substrate framework, developers can build smart contracts but that is only a fraction of using Substrate to its full potential.

With Substrate, you have full control over the underlying logic that your network's nodes will run. You also have full access for modifying and controlling each and every storage item across your runtime modules. As you progress through this map, you'll discover concepts and techniques that will help you to unlock the potential of the Substrate framework, giving you the freedom to build the blockchain that best suits the needs of your application.

You'll also discover how you can upgrade the Substrate runtime with a single transaction instead of having to organize a community hard-fork. Upgradeability is one of the primary design features of the Substrate framework.

In summary, runtime development:

  • Provides low level access to your entire blockchain.
  • Removes the overhead of built-in safety for performance.
  • Has a higher barrier of entry for developers.
  • Provides flexibility to customize full-stack application logic.

To learn more about using smart contracts within Substrate, refer to the Smart Contract - Overview page as well as the Polkadot Builders Guide.

Navigating the documentation

If you need any community support, please join the following channels based on the area where you need help:

Alternatively, you can also look for support on Stack Overflow, where questions are tagged with "substrate", or on the Parity Subport repo.

Use the following links to explore the sites and resources available on each:

Substrate Developer Hub has the most comprehensive all-round coverage of Substrate, from a "big picture" explanation of architecture to specific technical concepts. The site also provides tutorials to guide you as you learn the Substrate framework, as well as API reference documentation. You should check this site first if you want to look up information about Substrate runtime development. The site consists of:

Knowledge Base: Explaining the foundational concepts of building blockchain runtimes using Substrate.

Tutorials: Hands-on tutorials for developers to follow. The first six tutorials cover the fundamentals of Substrate and are recommended for every Substrate learner to go through.

How-to Guides: These resources are like the O'Reilly cookbook series, written in a task-oriented way so readers can get the job done. Some examples of the topics covered include:

  • Setting up proper weight functions for extrinsic calls.
  • Using off-chain workers to fetch HTTP requests.
  • Writing tests for your pallets.

API docs: Substrate API reference documentation.

Substrate Node Template provides a lightweight, minimal Substrate blockchain node that you can set up as a local development environment.

Substrate Front-end template provides a front-end interface built with React using Polkadot-JS API to connect to any Substrate node. Developers are encouraged to start new Substrate projects based on these templates.

If you face any technical difficulties and need support, feel free to join the Substrate Technical matrix channel and ask your questions there.

Additional resources

Polkadot Wiki documents the specific behavior and mechanisms of the Polkadot network. The Polkadot network allows multiple blockchains to connect and pass messages to each other. On the wiki, you can learn about how Polkadot—built using Substrate—is customized to support inter-blockchain message passing.

Polkadot JS API doc: documents how to use the Polkadot-JS API. This JavaScript-based API allows developers to build custom front-ends for their blockchains and applications. Polkadot JS API provides a way to connect to Substrate-based blockchains to query runtime metadata and send transactions.

Quiz #1

👉 Submit your answers to Quiz #1

Basics

Set up your local development environment

Here you will set up your local machine to install the Rust compiler—ensuring that you have both stable and nightly versions installed. Both stable and nightly versions are required because currently a Substrate runtime is compiled to a native binary using the stable Rust compiler, then compiled to a WebAssembly (WASM) binary, which only the nightly Rust compiler can do.

Also refer to:

Lab #1

👉 Complete Lab #1: Run a Substrate node

Interact with a Substrate network using Polkadot-JS apps

Polkadot JS Apps is the canonical front-end to interact with any Substrate-based chain.

You can configure whichever endpoint you want it to connect to, even your locally running node. Refer to the following two diagrams.

  1. Click on the top left side showing your currently connected network:

assets/01-polkadot-app-endpoint.png

  2. Scroll to the bottom of the menu, open DEVELOPMENT, and choose either Local Node or Custom to specify your own endpoint.

assets/02-polkadot-app-select-endpoint.png

Quiz #2

👉 Complete Quiz #2

Lab #2

👉 Complete Lab #2: Using Polkadot-JS Apps

Notes: If you are connecting Apps to a custom chain (or your locally-running node), you may need to specify your chain's custom data types in JSON under Settings > Developer.

Polkadot-JS Apps only receives a series of bytes from the blockchain. It is up to the developer to tell it how to decode and interpret these custom data types. To learn more on this, refer to:

You will also need to create an account. To do so, follow these steps on account generation. You'll learn that you can also use the Polkadot-JS Browser Plugin (a Metamask-like browser extension to manage your Substrate accounts) and it will automatically be imported into Polkadot-JS Apps.

Notes: When you run a Substrate chain in development mode (with the --dev flag), well-known accounts (Alice, Bob, Charlie, etc.) are always created for you.

Lab #3

👉 Complete Lab #3: Create an Account

Preliminaries

You need to know some Rust programming concepts and have a good understanding of how blockchain technology works in order to make the most of developing with Substrate. The following resources will help you brush up on these areas.

Rust

You will need to familiarize yourself with Rust to understand how Substrate is built and how to make the most of its capabilities.

If you are new to Rust, or need to brush up on your Rust knowledge, please refer to The Rust Book. You can still continue learning about Substrate without knowing Rust, but we recommend you come back to this section whenever you're in doubt about what a piece of Rust syntax means. Here are the parts of the Rust Book we recommend you familiarize yourself with:

  • ch 1 - 10: These chapters cover the foundational knowledge of programming in Rust
  • ch 13: On iterators and closures
  • ch 18 - 19: On advanced traits and advanced types. Learn a bit about macros as well. You will not necessarily be writing your own macros, but you'll be using a lot of Substrate and FRAME's built-in macros to write your blockchain runtime.

How blockchains work

Given that you'll be writing a blockchain runtime, you need to know what a blockchain is and how it works. The Web3 Blockchain Fundamentals MOOC YouTube video series provides a good basis for understanding key blockchain concepts and how blockchains work.

The lectures we recommend you watch are: lectures 1 - 7 and lecture 10. That's 8 lectures, or about 4 hours of video.

Quiz #3

👉 Complete Quiz #3

Substrate runtime development

High level architecture

To know more about the high level architecture of Substrate, please go through the Knowledge Base articles on Getting Started: Overview and Getting Started: Architecture.

In this document, we assume you will develop a Substrate runtime with FRAME (v2). This is what a Substrate node consists of.

assets/03-substrate-architecture.png

Each node has many components that manage things like the transaction queue, communicating over a P2P network, reaching consensus on the state of the blockchain, and the chain's actual runtime logic (aka the blockchain runtime). Each aspect of the node is interesting in its own right, and the runtime is particularly interesting because it contains the business logic (aka "state transition function") that codifies the chain's functionality. The runtime contains a collection of pallets that are configured to work together.

On the node level, Substrate leverages libp2p for the p2p networking layer and puts the transaction pool, consensus mechanism, and underlying data storage (a key-value database) on the node level. These components all work "under the hood", and in this knowledge map we won't cover them in detail except for mentioning their existence.

Quiz #4

👉 Complete Quiz #4

Runtime development topics

In our Developer Hub, we have thorough coverage of the various subjects you need to know to develop with Substrate, so here we just list the key topics and link back to the Developer Hub. Please go through the following key concepts and the referenced resources to learn the fundamentals of runtime development.

Key Concept: Runtime, this is where the blockchain state transition function (the blockchain's application-specific logic) is defined. It is about composing multiple pallets (which can be understood as Rust modules) in the runtime and hooking them up to work together.

Runtime Development: Execution, this article describes how a block is produced, and how transactions are selected and executed to reach the next "stage" in the blockchain.

Runtime Development: Pallets, this article describes the basic structure of a Substrate pallet.

Runtime Development: FRAME, this article gives a high level overview of the system pallets Substrate already implements to help you quickly develop as a runtime engineer. Have a quick skim so you have a basic idea of the different pallets Substrate is made of.

Lab #4

👉 Complete Lab #4: Adding a Pallet into a Runtime

Runtime Development: Storage, this article describes how data is stored on-chain and how you can access it.

Runtime Development: Events & Errors, this page describes how external parties learn what has happened on the blockchain, via the events and errors emitted when executing transactions.

Notes: All of the above concepts are defined in code using the #[pallet::*] macros. If you are interested in learning about the other pallet macros that exist, go to the FRAME macro API documentation and this doc on some frequently used Substrate macros.

Lab #5

👉 Complete Lab #5: Building a Proof-of-Existence dApp

Lab #6

👉 Complete Lab #6: Building a Substrate Kitties dApp

Quiz #5

👉 Complete Quiz #5

Polkadot JS API

Polkadot JS API is the JavaScript API for Substrate. By using it you can build a JavaScript front end or utility and interact with any Substrate-based blockchain.

The Substrate Front-end Template is an example of using Polkadot JS API in a React front-end.

  • Runtime Development: Metadata, this article describes the metadata that allows external parties to query what APIs a chain exposes. Polkadot JS API makes use of a chain's metadata to know what queries and functions are available to call.
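
As a rough sketch of what using the API looks like, assuming a local node at ws://127.0.0.1:9944 and the @polkadot/api package (the Alice address below is the well-known development account):

import { ApiPromise, WsProvider } from '@polkadot/api'

async function main() {
  // Connect to a local Substrate node.
  const provider = new WsProvider('ws://127.0.0.1:9944')

  // Custom runtime types (if any) could be passed here via a `types` option,
  // much like the JSON entered under Settings > Developer in Polkadot-JS Apps.
  const api = await ApiPromise.create({ provider })

  // Basic chain information, discovered via the node's metadata.
  const [chain, nodeName, nodeVersion] = await Promise.all([
    api.rpc.system.chain(),
    api.rpc.system.name(),
    api.rpc.system.version()
  ])
  console.log(`Connected to ${chain} using ${nodeName} v${nodeVersion}`)

  // Query on-chain storage, e.g. the balance of the Alice dev account.
  const ALICE = '5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY'
  const { data: balance } = await api.query.system.account(ALICE)
  console.log(`Alice's free balance: ${balance.free.toString()}`)

  await api.disconnect()
}

main().catch(console.error)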

Lab #7

👉 Complete Lab #7: Using Polkadot-JS API

Quiz #6

👉 Complete Quiz #6: Using Polkadot-JS API

Smart contracts

Learn about the difference between smart contract development and Substrate runtime development, and when to use each, here.

In Substrate, you can program smart contracts using ink!.

Quiz #7

👉 Complete Quiz #7: Using ink!

What we do not cover

A lot 😄

On-chain runtime upgrades. We have a tutorial on On-chain (forkless) Runtime Upgrade. This tutorial introduces how to perform and schedule a runtime upgrade as an on-chain transaction.

About transaction weight and fee, and benchmarking your runtime to determine the proper transaction cost.

Off-chain Features

There are certain limits to on-chain logic. For instance, computation cannot be so intensive that it affects the block production time, and computation must be deterministic. This means that computation relying on external data fetching cannot be done on-chain. In Substrate, developers can run these types of computation off-chain and have the result sent back on-chain via extrinsics.

Tightly- and Loosely-coupled pallets, calling one pallet's functions from another pallet via trait specification.

Blockchain Consensus Mechanism, and a guide on customizing it to proof-of-work here.

Parachains: one key feature of Substrate is the capability of becoming a parachain for relay chains like Polkadot. You can develop your own application-specific logic in your chain and rely on the validator community of the relay chain to secure your network, instead of building another validator community yourself. Learn more with the following resources:

Terms clarification

  • Substrate: the blockchain development framework built for writing highly customized, domain-specific blockchains.
  • Polkadot: Polkadot is the relay chain blockchain, built with Substrate.
  • Kusama: Kusama is Polkadot's canary network, used to launch features before these features are launched on Polkadot. You could view it as a beta-network with real economic value where the state of the blockchain is never reset.
  • Web 3.0: the decentralized internet ecosystem in which, instead of apps being centrally hosted on a few servers and managed by a sovereign party, applications run on an open, trustless, and permissionless network and are not controlled by a centralized entity.
  • Web3 Foundation: A foundation set up to support the development of decentralized web software protocols. Learn more about what they do on their website.

Others


Author: substrate-developer-hub
Source Code: https://github.com/substrate-developer-hub/hackathon-knowledge-map
License: 

#blockchain #substrate 

Roberta  Ward

Roberta Ward

1595344320

Wondering how to upgrade your skills in the pandemic? Here's a simple way you can do it.

The coronavirus pandemic has brought the world to a standstill.

Countries are on major lockdowns. Schools, colleges, theatres, gyms, clubs, and all other public places are shut down, economies are suffering, human health is at stake, people are losing their jobs, and nobody knows how much worse it can get.

Since most places are on lockdown and you are working from home or have enough time to nourish your skills, you should use this time wisely! We always complain that we want some 'time' to learn and upgrade our knowledge but don't get it due to our 'busy schedules'. So, now is the time to make a 'list of skills' and learn and upgrade them at home!

And for technology-loving people like us, Knoldus Techhub has already helped us do exactly that in a short span of time!

If you are still not aware of it, don't worry; as Georgia Byng has well said,

“No time is better than the present”

– Georgia Byng, a British children’s writer, illustrator, actress and film producer.

No matter if you are a developer (be it front-end or back-end), a data scientist, a tester, a DevOps person, or a learner who has a keen interest in technology, Knoldus Techhub has brought it all for you under one common roof.

From technologies like Scala, Spark, and Elasticsearch to Angular, Go, and machine learning, it has a total of 20 technologies, with some recently added ones such as DAML, test automation, Snowflake, and Ionic.

How to upgrade your skills?

Every technology in Techhub has a number of templates. Once you click on any specific technology, you'll be able to see all the templates for that technology. Since these templates are downloadable, you need to provide your email address to receive the download link for the template in your mail.

These templates help you learn the practical implementation of a topic with ease. Using them, you can learn and kick-start your development in no time.

Apart from learning, there are some out-of-the-box templates that can help provide a solution to your business problem, with all the basic dependencies and implementations already plugged in. Techhub names these templates xlr8rs (pronounced "accelerators").

xlr8rs make your development really fast: you just add your core business logic to the template.

If you are looking for a template that's not available, you can also request one, whether for learning or for a solution to your business problem, and Techhub will connect with you to provide the solution. Isn't this helpful? 🙂

Confused with which technology to start with?

To keep you updated, the Knoldus Techhub provides you with information on the most trending technology and the most downloaded templates at present. This way, you'll stay informed and can learn the one that's most trending.

Since we believe:

“There’s always a scope of improvement“

If you still feel like it isn’t helping you in learning and development, you can provide your feedback in the feedback section in the bottom right corner of the website.

#ai #akka #akka-http #akka-streams #amazon ec2 #angular 6 #angular 9 #angular material #apache flink #apache kafka #apache spark #api testing #artificial intelligence #aws #aws services #big data and fast data #blockchain #css #daml #devops #elasticsearch #flink #functional programming #future #grpc #html #hybrid application development #ionic framework #java #java11 #kubernetes #lagom #microservices #ml # ai and data engineering #mlflow #mlops #mobile development #mongodb #non-blocking #nosql #play #play 2.4.x #play framework #python #react #reactive application #reactive architecture #reactive programming #rust #scala #scalatest #slick #software #spark #spring boot #sql #streaming #tech blogs #testing #user interface (ui) #web #web application #web designing #angular #coronavirus #daml #development #devops #elasticsearch #golang #ionic #java #kafka #knoldus #lagom #learn #machine learning #ml #pandemic #play framework #scala #skills #snowflake #spark streaming #techhub #technology #test automation #time management #upgrade

Python  Library

Python Library

1639446480

Building APIs with Python & FastApi (A Modern, Fast, Web Framework)

FastAPI

FastAPI framework, high performance, easy to learn, fast to code, ready for production 


Documentation: https://fastapi.tiangolo.com

Source Code: https://github.com/tiangolo/fastapi


FastAPI is a modern, fast (high-performance), web framework for building APIs with Python 3.6+ based on standard Python type hints.

The key features are:

  • Fast: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). One of the fastest Python frameworks available.
  • Fast to code: Increase the speed to develop features by about 200% to 300%. *
  • Fewer bugs: Reduce about 40% of human (developer) induced errors. *
  • Intuitive: Great editor support. Completion everywhere. Less time debugging.
  • Easy: Designed to be easy to use and learn. Less time reading docs.
  • Short: Minimize code duplication. Multiple features from each parameter declaration. Fewer bugs.
  • Robust: Get production-ready code. With automatic interactive documentation.
  • Standards-based: Based on (and fully compatible with) the open standards for APIs: OpenAPI (previously known as Swagger) and JSON Schema.

* estimation based on tests on an internal development team, building production applications.

Sponsors

Other sponsors

Opinions

"[...] I'm using FastAPI a ton these days. [...] I'm actually planning to use it for all of my team's ML services at Microsoft. Some of them are getting integrated into the core Windows product and some Office products."

Kabir Khan - Microsoft (ref)


"We adopted the FastAPI library to spawn a REST server that can be queried to obtain predictions. [for Ludwig]"

Piero Molino, Yaroslav Dudin, and Sai Sumanth Miryala - Uber (ref)


"Netflix is pleased to announce the open-source release of our crisis management orchestration framework: Dispatch! [built with FastAPI]"

Kevin Glisson, Marc Vilanova, Forest Monsen - Netflix (ref)


"I’m over the moon excited about FastAPI. It’s so fun!"

Brian Okken - Python Bytes podcast host (ref)


"Honestly, what you've built looks super solid and polished. In many ways, it's what I wanted Hug to be - it's really inspiring to see someone build that."

Timothy Crosley - Hug creator (ref)


"If you're looking to learn one modern framework for building REST APIs, check out FastAPI [...] It's fast, easy to use and easy to learn [...]"

"We've switched over to FastAPI for our APIs [...] I think you'll like it [...]"

Ines Montani - Matthew Honnibal - Explosion AI founders - spaCy creators (ref) - (ref)


Typer, the FastAPI of CLIs

If you are building a CLI app to be used in the terminal instead of a web API, check out Typer.

Typer is FastAPI's little sibling. And it's intended to be the FastAPI of CLIs. ⌨️ 🚀

Requirements

Python 3.6+

FastAPI stands on the shoulders of giants:

Installation

$ pip install fastapi

---> 100%

You will also need an ASGI server for production, such as Uvicorn or Hypercorn.

$ pip install "uvicorn[standard]"

---> 100%

Example

Create it

  • Create a file main.py with:
from typing import Optional

from fastapi import FastAPI

app = FastAPI()


@app.get("/")
def read_root():
    return {"Hello": "World"}


@app.get("/items/{item_id}")
def read_item(item_id: int, q: Optional[str] = None):
    return {"item_id": item_id, "q": q}

Or use async def...

If your code uses async / await, use async def:

from typing import Optional

from fastapi import FastAPI

app = FastAPI()


@app.get("/")
async def read_root():
    return {"Hello": "World"}


@app.get("/items/{item_id}")
async def read_item(item_id: int, q: Optional[str] = None):
    return {"item_id": item_id, "q": q}

Note:

If you don't know, check the "In a hurry?" section about async and await in the docs.

Run it

Run the server with:

$ uvicorn main:app --reload

INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [28720]
INFO:     Started server process [28722]
INFO:     Waiting for application startup.
INFO:     Application startup complete.

About the command uvicorn main:app --reload...

The command uvicorn main:app refers to:

  • main: the file main.py (the Python "module").
  • app: the object created inside of main.py with the line app = FastAPI().
  • --reload: make the server restart after code changes. Only do this for development.

Check it

Open your browser at http://127.0.0.1:8000/items/5?q=somequery.

You will see the JSON response as:

{"item_id": 5, "q": "somequery"}

You already created an API that:

  • Receives HTTP requests in the paths / and /items/{item_id}.
  • Both paths take GET operations (also known as HTTP methods).
  • The path /items/{item_id} has a path parameter item_id that should be an int.
  • The path /items/{item_id} has an optional str query parameter q.

Interactive API docs

Now go to http://127.0.0.1:8000/docs.

You will see the automatic interactive API documentation (provided by Swagger UI):

Swagger UI

Alternative API docs

And now, go to http://127.0.0.1:8000/redoc.

You will see the alternative automatic documentation (provided by ReDoc):

ReDoc

Example upgrade

Now modify the file main.py to receive a body from a PUT request.

Declare the body using standard Python types, thanks to Pydantic.

from typing import Optional

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    name: str
    price: float
    is_offer: Optional[bool] = None


@app.get("/")
def read_root():
    return {"Hello": "World"}


@app.get("/items/{item_id}")
def read_item(item_id: int, q: Optional[str] = None):
    return {"item_id": item_id, "q": q}


@app.put("/items/{item_id}")
def update_item(item_id: int, item: Item):
    return {"item_name": item.name, "item_id": item_id}

The server should reload automatically (because you added --reload to the uvicorn command above).

Interactive API docs upgrade

Now go to http://127.0.0.1:8000/docs.

  • The interactive API documentation will be automatically updated, including the new body:

Swagger UI

  • Click on the button "Try it out"; it allows you to fill in the parameters and directly interact with the API:

Swagger UI interaction

  • Then click on the "Execute" button; the user interface will communicate with your API, send the parameters, get the results, and show them on the screen:

Swagger UI interaction

Alternative API docs upgrade

And now, go to http://127.0.0.1:8000/redoc.

  • The alternative documentation will also reflect the new query parameter and body:

ReDoc

Recap

In summary, you declare once the types of parameters, body, etc. as function parameters.

You do that with standard modern Python types.

You don't have to learn a new syntax, the methods or classes of a specific library, etc.

Just standard Python 3.6+.

For example, for an int:

item_id: int

or for a more complex Item model:

item: Item

...and with that single declaration you get:

  • Editor support, including:
    • Completion.
    • Type checks.
  • Validation of data:
    • Automatic and clear errors when the data is invalid.
    • Validation even for deeply nested JSON objects.
  • Conversion of input data: coming from the network to Python data and types. Reading from:
    • JSON.
    • Path parameters.
    • Query parameters.
    • Cookies.
    • Headers.
    • Forms.
    • Files.
  • Conversion of output data: converting from Python data and types to network data (as JSON):
    • Convert Python types (str, int, float, bool, list, etc).
    • datetime objects.
    • UUID objects.
    • Database models.
    • ...and many more.
  • Automatic interactive API documentation, including 2 alternative user interfaces:
    • Swagger UI.
    • ReDoc.

Coming back to the previous code example, FastAPI will:

  • Validate that there is an item_id in the path for GET and PUT requests.
  • Validate that the item_id is of type int for GET and PUT requests.
    • If it is not, the client will see a useful, clear error.
  • Check if there is an optional query parameter named q (as in http://127.0.0.1:8000/items/foo?q=somequery) for GET requests.
    • As the q parameter is declared with = None, it is optional.
    • Without the None it would be required (as is the body in the case with PUT).
  • For PUT requests to /items/{item_id}, read the body as JSON:
    • Check that it has a required attribute name that should be a str.
    • Check that it has a required attribute price that has to be a float.
    • Check that it has an optional attribute is_offer, that should be a bool, if present.
    • All this would also work for deeply nested JSON objects.
  • Convert from and to JSON automatically.
  • Document everything with OpenAPI, that can be used by:
    • Interactive documentation systems.
    • Automatic client code generation systems, for many languages.
  • Provide 2 interactive documentation web interfaces directly.

We just scratched the surface, but you already get the idea of how it all works.

Try changing the line with:

    return {"item_name": item.name, "item_id": item_id}

...from:

        ... "item_name": item.name ...

...to:

        ... "item_price": item.price ...

...and see how your editor will auto-complete the attributes and know their types:

editor support

For a more complete example including more features, see the Tutorial - User Guide.

Spoiler alert: the tutorial - user guide includes:

  • Declaration of parameters from other different places, such as: headers, cookies, form fields, and files.
  • How to set validation constraints such as maximum_length or regex.
  • A very powerful and easy to use Dependency Injection system.
  • Security and authentication, including support for OAuth2 with JWT tokens and HTTP Basic auth.
  • More advanced (but equally easy) techniques for declaring deeply nested JSON models (thanks to Pydantic).
  • GraphQL integration with Strawberry and other libraries.
  • Many extra features (thanks to Starlette) such as:
    • WebSockets
    • extremely easy tests based on requests and pytest
    • CORS
    • Cookie Sessions
    • ...and more.

Performance

Independent TechEmpower benchmarks show FastAPI applications running under Uvicorn as one of the fastest Python frameworks available, only below Starlette and Uvicorn themselves (used internally by FastAPI). (*)

To understand more about it, see the section Benchmarks.

Optional Dependencies

Used by Pydantic:

Used by Starlette:

  • requests - Required if you want to use the TestClient.
  • jinja2 - Required if you want to use the default template configuration.
  • python-multipart - Required if you want to support form "parsing", with request.form().
  • itsdangerous - Required for SessionMiddleware support.
  • pyyaml - Required for Starlette's SchemaGenerator support (you probably don't need it with FastAPI).
  • ujson - Required if you want to use UJSONResponse.

Used by FastAPI / Starlette:

  • uvicorn - for the server that loads and serves your application.
  • orjson - Required if you want to use ORJSONResponse.

You can install all of these with pip install "fastapi[all]".

Download Details:
Author: tiangolo
Source Code: https://github.com/tiangolo/fastapi
License: MIT License

#python #api #fastapi