1684619940
A Rust implementation of gRPC, a high-performance, open-source, general RPC framework that puts mobile and HTTP/2 first.
tonic is a gRPC over HTTP/2 implementation focused on high performance, interoperability, and flexibility. This library was created to have first-class support of async/await and to act as a core building block for production systems written in Rust.
tonic is composed of three main components: the generic gRPC implementation, the high-performance HTTP/2 implementation and the codegen powered by prost. The generic implementation can support any HTTP/2 implementation and any encoding via a set of generic traits. The HTTP/2 implementation is based on hyper, a fast HTTP/1.1 and HTTP/2 client and server built on top of the robust tokio stack. The codegen contains the tools to build clients and servers from protobuf definitions.
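As a sketch of how the codegen is typically wired in (assuming tonic-build as a build dependency and a proto/helloworld.proto file, both hypothetical here), a build.rs can compile the definitions at build time:
// build.rs — compiles the protobuf definitions into Rust client/server stubs.
// Assumes `tonic-build` is listed under [build-dependencies] and that
// proto/helloworld.proto exists in the crate.
fn main() -> Result<(), Box<dyn std::error::Error>> {
    tonic_build::compile_protos("proto/helloworld.proto")?;
    Ok(())
}
The generated module can then be pulled into the crate, for example via the tonic::include_proto! macro.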
Examples can be found in examples, and for more complex scenarios interop may be a good resource, as it shows examples of many of the gRPC features.
If you're using rust-analyzer, we recommend you set "rust-analyzer.cargo.buildScripts.enable": true to correctly load the generated code.
For IntelliJ IDEA users, please refer to this and enable the org.rust.cargo.evaluate.build.scripts experimental feature.
tonic's MSRV is 1.60.
$ rustup update
$ cargo build
In order to build tonic >= 0.8.0, you need the protoc Protocol Buffers compiler, along with the Protocol Buffers resource files.
sudo apt update && sudo apt upgrade -y
sudo apt install -y protobuf-compiler libprotobuf-dev
sudo apk add protoc protobuf-dev
Assuming Homebrew is already installed. (If not, see instructions for installing Homebrew on the Homebrew website.)
brew install protobuf
Download the latest version of protoc-xx.y-win64.zip from HERE, extract bin\protoc.exe and put it somewhere in the PATH, then verify the installation by running protoc --version.
The helloworld tutorial provides a basic example of using tonic, perfect for first-time users! The routeguide tutorial provides a complete example of using tonic and all its features.
First, see if the answer to your question can be found in the API documentation. If the answer is not there, there is an active community in the Tonic Discord channel. We would be happy to try to answer your question. If that doesn't work, try opening an issue with the question.
tonic: Generic gRPC and HTTP/2 client/server implementation.
tonic-build: prost-based service codegen.
tonic-types: prost-based gRPC utility types, including support for gRPC Well Known Types.
tonic-health: Implementation of the standard gRPC health checking service. Also serves as an example of both unary and response streaming.
tonic-reflection: A tonic-based gRPC reflection implementation.
examples: Example gRPC implementations showing off TLS, load balancing and bi-directional streaming.
interop: Interop tests implementation.
Thanks for your help improving the project! We are so happy to have you! We have a contributing guide to help you get involved in the Tonic project.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in Tonic by you, shall be licensed as MIT, without any additional terms or conditions.
Examples | Website | Docs | Chat
Author: Hyperium
Source Code: https://github.com/hyperium/tonic
License: MIT license
1684420952
Utility widgets for standard future and stream uses
This project is a starting point for a Dart package, a library module containing code that can be shared easily across multiple Flutter or Dart projects.
For help getting started with Flutter, view our online documentation, which offers tutorials, samples, guidance on mobile development, and a full API reference.
Run this command:
With Flutter:
$ flutter pub add adv_async_widget
This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):
dependencies:
adv_async_widget: ^0.1.0
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Now in your Dart code, you can use:
import 'package:adv_async_widget/adv_async_widget.dart';
import 'package:adv_async_widget/adv_async_widget.dart';
import 'package:flutter/material.dart';
void main() {
runApp(MyApp());
}
const TITLE = "AdvAsyncWidget Demo";
class MyApp extends StatefulWidget {
@override
State<StatefulWidget> createState() => MyAppState();
}
class MyAppState extends State<MyApp> {
late final Future<String> exampleFuture;
@override
void initState() {
super.initState();
exampleFuture = Future.delayed(Duration(seconds: 5), () => "FINISHED");
}
@override
Widget build(BuildContext context) => MaterialApp(
title: TITLE,
home: Scaffold(
appBar: AppBar(
title: Text(TITLE),
),
body: AdvFutureBuilder<String>(
future: exampleFuture,
onWait: (context) => CircularProgressIndicator(),
onData: (context, data) => Text(data!),
),
),
);
}
Download Details:
Author: oznecniV97
Source Code: https://github.com/oznecniV97/adv_async_widget
1679951100
Async is a set of C++ primitives that enables efficient and rapid development in C++17 on Linux systems. The focus is mostly on backend development, data processing, etc.
I will gradually add explanations for the most crucial blocks in this library.
> sudo ./install-dependencies.sh
> ./blaze.sh -ninja -release
> cd build-opt && ninja -j4 echo_server
The third_party folder is checked out under the build directories.
Then, from 2 tabs run:
server> ./echo_server --logtostderr
client> ./echo_server --connect=localhost --n 100000 --c=4
The HTTP handler is implemented using the Boost.Beast library. It's integrated with the io_uring based ProactorPool. Please see http_main.cc, for example. HTTP also provides support for backend monitoring (Varz status page) and for an extensible debugging interface. With monitoring, the C++ backend returns a JSON object that is formatted inside the status page in the browser. To check how it looks, please go to localhost:8080 while echo_server is running.
Every HTTP-powered backend has integrated CPU profiling capabilities using gperftools and pprof. Profiling can be triggered in prod using magic-url commands. Enabled profiling usually has very minimal impact on the CPU performance of the running backend.
Logging is based on Google's glog library. The library is very reliable, performant and solid. It has many features that allow resilient backend development. Unfortunately, Google's version has some bugs, which I fixed (waiting for review...), so I use my own fork. Glog library gives me the ability to control logging levels of a backend at run-time without restarting it.
ASYNC uses googletest+gmock unit-test environment.
Third-party packages have the TRDP:: prefix in CMakeLists.txt. absl libraries have the absl:: prefix.
Author: Romange
Source Code: https://github.com/romange/helio
License: Apache-2.0 license
1679214360
Telegram MTProto API Framework for Python
Elegant, modern and asynchronous Telegram MTProto API framework in Python for users and bots
from pyrogram import Client, filters
app = Client("my_account")
@app.on_message(filters.private)
async def hello(client, message):
await message.reply("Hello from Pyrogram!")
app.run()
Pyrogram is a modern, elegant and asynchronous MTProto API framework. It enables you to easily interact with the main Telegram API through a user account (custom client) or a bot identity (bot API alternative) using Python.
If you'd like to support Pyrogram, you can consider:
pip3 install pyrogram
Author: Pyrogram
Source Code: https://github.com/pyrogram/pyrogram
License: LGPL-3.0, GPL-3.0 licenses found
1676005167
Async and await are an important part of JavaScript and are essential for handling asynchronous programming in a readable and maintainable way. By using async and await, you can write asynchronous code that looks and behaves like synchronous code and handle errors in a familiar way using try and catch.
https://codewithnazam.com/2023/02/09/javascript-async-await-with-examples/
#javascript #async #await #tryandcatch
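A minimal sketch of the pattern described above (the loadUser function and URL are made up for illustration), handling an async failure with try and catch:
async function loadUser(id) {
  try {
    // await unwraps the fetch promise; failures surface as thrown errors
    const response = await fetch(`https://example.com/users/${id}`);
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    const user = await response.json();
    console.log(user.name);
  } catch (error) {
    // network errors and the thrown HTTP error both land here
    console.error("Failed to load user:", error);
  }
}
loadUser(42);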
1675765765
Have you ever dreamed of writing asynchronous code like its synchronous counterpart?
AwaitKit is a powerful Swift library inspired by the Async/Await specification in ES8 (ECMAScript 2017) which provides a powerful way to write asynchronous code in a sequential manner.
Internally it uses PromiseKit v6.10 to create and manage promises.
If you want a quick overview of the project, take a look at this blog post.
Put simply, write this:
let user = try! await(signIn(username: "Foo", password: "Bar"))
try! await(sendWelcomeMailToUser(user))
try! await(redirectToThankYouScreen())
print("All done!")
Instead of:
signIn(username: "Foo", password: "Bar")
.then { user in
return self.sendWelcomeMailToUser(user)
}
.then { _ in
return self.redirectToThankYouScreen()
}
.then { _ in
print("All done!")
}
Or worse, using nested completion blocks (callback-hell style):
signIn(username: "Foo", password: "Bar") { user in
self.sendWelcomeMailToUser(user) { _ in
self.redirectToThankYouScreen() { _ in
print("All done!")
}
}
}
The async method yields execution to its closure, which will run on a background queue, and returns a promise that will be resolved at the end of the block.
Here is a small example:
func setupNewUser(name: String) -> Promise<User> {
return async {
let newUser = try await(self.createUser(name))
let friends = try await(self.getFacebookFriends(name))
newUser.addFriends(friends)
return newUser
}
}
Here setupNewUser returns a promise with a user as its value. If the end of the async block is reached, the promise will be resolved; otherwise, if an error occurred inside the async block, the promise will be rejected with the corresponding error.
The async block will catch any thrown error and reject the promise, so you don't need to manage the await exceptions yourself. But if necessary, you can:
async {
do {
try await(self.loginOrThrown(username: "yannickl"))
}
catch {
print(error)
}
try await(self.clearCache())
}
The await method executes the given promise or block and waits until it is resolved or has failed.
do {
let name: String = try await {
Thread.sleep(forTimeInterval: 0.2)
if Int(arc4random_uniform(2) + 1) % 2 == 0 {
return "yannickl"
}
else {
throw NSError()
}
}
print(name)
}
catch {
print(error)
}
The async and await methods run by default on a background concurrent queue. Of course, you can choose your own queue and call the following methods:
DispatchQueue.global(qos: .default).ak.async {
}
try DispatchQueue.global(qos: .default).ak.await {
}
When you use these methods, be careful not to do the awaiting on the main thread, otherwise you risk ending up in a deadlock.
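For instance, here is a sketch of the anti-pattern (computeValue(), fetchValue() and the label are hypothetical): awaiting on the queue you are currently blocking can never complete.
// Don't do this from the main thread: the closure needs the main queue,
// but await is already blocking it, so this deadlocks.
// let value: Int = try DispatchQueue.main.ak.await { self.computeValue() }

// Instead, keep the awaiting off the main thread and hop back only for UI work:
async {
    let value = try await(self.fetchValue()) // hypothetical promise-returning call
    DispatchQueue.main.async {
        self.label.text = "\(value)"
    }
}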
The recommended approach to use AwaitKit in your project is using the CocoaPods package manager, as it provides flexible dependency management and dead simple installation.
Install CocoaPods if not already available:
$ [sudo] gem install cocoapods
$ pod setup
Go to the directory of your Xcode project, then create and edit your Podfile and add AwaitKit:
$ cd /path/to/MyProject
$ touch Podfile
$ edit Podfile
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '8.0'
pod 'AwaitKit', '~> 5.2.0'
Install into your project:
$ pod install
If CocoaPods did not find the PromiseKit 6.10 dependency, execute this command:
$ pod repo update
Open your project in Xcode from the .xcworkspace file (not the usual project file)
$ open MyProject.xcworkspace
You can use the Swift Package Manager to install AwaitKit by adding the proper description to your Package.swift file:
import PackageDescription
let package = Package(
name: "YOUR_PROJECT_NAME",
dependencies: [
.Package(url: "https://github.com/yannickl/AwaitKit.git")
]
)
Note that the Swift Package Manager is still in early design and development, for more information checkout its GitHub Page.
Carthage is a decentralized dependency manager that builds your dependencies and provides you with binary frameworks.
You can install Carthage with Homebrew using the following command:
$ brew update
$ brew install carthage
To integrate AwaitKit into your Xcode project using Carthage, specify it in your Cartfile:
github "yannickl/AwaitKit" ~> 5.2.0
Run carthage update to build the framework and drag the built AwaitKit.framework into your Xcode project.
Download the project and copy the AwaitKit folder into your project to use it. Note that you also need to download the PromiseKit v6.7 library and import it into your project.
Contributions are welcomed and encouraged.
Yannick Loriot
Author: Yannickl
Source Code: https://github.com/yannickl/AwaitKit
License: MIT license
1673947920
Many languages, such as Kotlin, Go, JavaScript, Python, Rust, C#, C++ and others, already have coroutines support that makes the async/await pattern implementation possible. This feature is not yet supported in Swift, but this can be improved by a framework without the need to change the language.
Asynchronous programming is usually associated with callbacks. It is quite convenient until there are too many of them and they start nesting. Then it's called a pyramid of doom or even callback hell.
Another problem of asynchronous programming is error handling, because Swift's natural error handling mechanism cannot be used.
There are many other frameworks that make it easy to use asynchronous code, such as Combine, RxSwift, PromiseKit and so on. They use other approaches that have some drawbacks:
The async/await pattern is an alternative that allows an asynchronous, non-blocking function to be structured in a way similar to an ordinary synchronous function.
It is already well-established in other programming languages and is an evolution in asynchronous programming. The implementation of this pattern is possible thanks to coroutines.
Let's have a look at an example with a coroutine, inside of which await() suspends it and resumes it when the result is available, without blocking the thread.
//executes coroutine on the main thread
DispatchQueue.main.startCoroutine {
//extension that returns CoFuture<(data: Data, response: URLResponse)>
let dataFuture = URLSession.shared.dataTaskFuture(for: imageURL)
//await CoFuture result that suspends coroutine and doesn't block the thread
let data: Data = try dataFuture.await().data
//create UIImage from the data
guard let image = UIImage(data: data) else { return }
//execute heavy task on global queue and await the result without blocking the thread
let thumbnail: UIImage = try DispatchQueue.global().await { image.makeThumbnail() }
//set image in UIImageView on the main thread
self.imageView.image = thumbnail
}
A coroutine is a computation that can be suspended and resumed at a later time without blocking a thread. Coroutines build upon regular functions and can be executed on any scheduler with a possibility to switch among them during execution.
The coroutines API design is as minimalistic as possible. It consists of the CoroutineScheduler protocol that describes how to schedule coroutines (DispatchQueue already conforms to it), and the Coroutine structure with utility methods. This API is enough to do amazing things.
The following example shows the usage of await() inside a coroutine to wrap asynchronous calls.
//execute coroutine on the main thread
DispatchQueue.main.startCoroutine {
//await URLSessionDataTask response without blocking the thread
let (data, response, error) = try Coroutine.await { callback in
URLSession.shared.dataTask(with: url, completionHandler: callback).resume()
}
. . . use response on the main thread . . .
}
Here's how we can conform NSManagedObjectContext to CoroutineScheduler for launching coroutines on it.
extension NSManagedObjectContext: CoroutineScheduler {
func scheduleTask(_ task: @escaping () -> Void) {
perform(task)
}
}
//execute coroutine on the main thread
DispatchQueue.main.startCoroutine {
let context: NSManagedObjectContext //context with privateQueueConcurrencyType
let request: NSFetchRequest<NSDictionary> //some complex request
//execute request on the context without blocking the main thread
let result: [NSDictionary] = try context.await { try context.fetch(request) }
}
A future is a read-only holder for a result that will be provided later and the promise is the provider of this result. They represent the eventual completion or failure of an asynchronous operation.
The futures and promises approach itself has become an industry standard. It is a convenient mechanism to synchronize asynchronous code. But together with coroutines, it takes the usage of asynchronous code to the next level and has become a part of the async/await pattern. If coroutines are a skeleton, then futures and promises are its muscles.
Futures and promises are represented by the corresponding CoFuture class and its CoPromise subclass.
//wraps some async func with CoFuture
func makeIntFuture() -> CoFuture<Int> {
let promise = CoPromise<Int>()
someAsyncFunc { int in
promise.success(int)
}
return promise
}
This allows you to start multiple tasks in parallel and synchronize them later with await().
//create CoFuture<Int> that takes 2 sec. from the example above
let future1: CoFuture<Int> = makeIntFuture()
//execute coroutine on the global queue and returns CoFuture<Int> with future result
let future2: CoFuture<Int> = DispatchQueue.global().coroutineFuture {
try Coroutine.delay(.seconds(3)) //some work that takes 3 sec.
return 6
}
//execute coroutine on the main thread
DispatchQueue.main.startCoroutine {
let sum: Int = try future1.await() + future2.await() //will await for 3 sec.
self.label.text = "Sum is \(sum)"
}
It's very easy to transform or compose CoFutures into a new one.
let array: [CoFuture<Int>]
//create new CoFuture<Int> with sum of future results
let sum = CoFuture { try array.reduce(0) { try $0 + $1.await() } }
Futures and promises provide a convenient way to transfer a single value between coroutines. Channels provide a way to transfer a stream of values. Conceptually, a channel is similar to a queue that allows suspending a coroutine on receive if it is empty, or on send if it is full.
This non-blocking primitive is widely used in such languages as Go and Kotlin, and it is another instrument that improves working with coroutines.
To create channels, use the CoChannel class.
//create a channel with a buffer which can store only one element
let channel = CoChannel<Int>(capacity: 1)
DispatchQueue.global().startCoroutine {
for i in 0..<100 {
//imitate some work
try Coroutine.delay(.seconds(1))
//sends a value to the channel and suspends coroutine if its buffer is full
try channel.awaitSend(i)
}
//close channel when all values are sent
channel.close()
}
DispatchQueue.global().startCoroutine {
//receives values until closed and suspends a coroutine if it's empty
for i in channel.makeIterator() {
print("Receive", i)
}
print("Done")
}
All launched coroutines, CoFutures and CoChannels usually do not need to be referenced. They are deinitialized after their execution. But often there is a need to complete them earlier, when they are no longer needed. For this, CoFuture and CoChannel have methods for cancelling.
CoScope makes it easier to manage the life cycle of these objects. It allows you to keep weak references to them and cancel them if necessary or on deinit.
You can add coroutines, CoFutures, CoChannels and other CoCancellables to a CoScope to cancel them when they are no longer needed or on deinit.
class ViewController: UIViewController {
let scope = CoScope() //will cancel all objects on `cancel()` or deinit
func performSomeWork() {
//create new `CoChannel` and add to `CoScope`
let channel = makeSomeChannel().added(to: scope)
//execute coroutine and add to `CoScope`
DispatchQueue.main.startCoroutine(in: scope) { [weak self] in
for item in channel.makeIterator() {
try self?.performSomeWork(with: item)
}
}
}
func performSomeWork(with item: Item) throws {
//create new `CoFuture` and add to `CoScope`
let future = makeSomeFuture(item).added(to: scope)
let result = try future.await()
. . . do some work using result . . .
}
}
Author: Belozierov
Source Code: https://github.com/belozierov/SwiftCoroutine
License: MIT license
1673126700
Tame Async Code with Battle-tested Promises
fetchUserId().then { id in
print("UserID : \(id)")
}.onError { e in
print("An error occured : \(e)")
}.finally {
print("Everything is Done :)")
}
let userId = try! awaitPromise(fetchUserId())
Because async code is hard to write, hard to read, and hard to reason about. A pain to maintain.
then is part of the freshOS iOS toolset. Try it in an example app! Download the Starter Project.
By using a then keyword that enables you to write async code that reads like an English sentence.
Async code is now concise, flexible and maintainable.
Promise / Future concept
Async / Await
progress
race
recover
validate
retry
bridgeError
chain
noMatterWhat
...
fetchUserId({ id in
fetchUserNameFromId(id, success: { name in
fetchUserFollowStatusFromName(name, success: { isFollowed in
// The three calls in a row succeeded YAY!
reloadList()
}, failure: { error in
// Fetching user ID failed
reloadList()
})
}, failure: { error in
// Fetching user name failed
reloadList()
})
}) { error in
// Fetching user follow status failed
reloadList()
}
#callbackHell
fetchUserId()
.then(fetchUserNameFromId)
.then(fetchUserFollowStatusFromName)
.then(updateFollowStatus)
.onError(showErrorPopup)
.finally(reloadList)
fetchUserId().then { id in
print("UserID : \(id)")
}.onError { e in
print("An error occured : \(e)")
}.finally {
print("Everything is Done :)")
}
If we want this to be maintainable, it should read like an English sentence.
We can do this by extracting our blocks into separate functions:
fetchUserId()
.then(printUserID)
.onError(showErrorPopup)
.finally(reloadList)
This is now concise, flexible, maintainable, and it reads like an English sentence <3
Mental sanity saved // #goodbyeCallbackHell
Wondering what fetchUserId() is? It is a simple function that returns a strongly typed promise:
func fetchUserId() -> Promise<Int> {
return Promise { resolve, reject in
print("fetching user Id ...")
wait { resolve(1234) }
}
}
Here you would typically replace the dummy wait function with your network request <3
As with then and onError, you can also call a progress block, for things like uploading an avatar for example.
uploadAvatar().progress { p in
// Here update progressView for example
}
.then(doSomething)
.onError(showErrorPopup)
.finally(doSomething)
Our implementation differs slightly from the original JavaScript Promises. Indeed, they do not start right away, on purpose. Calling then, onError, or finally will start them automatically.
Calling then starts a promise if it is not already started. In some cases, we only want to register some code for later. For instance, in the case of JSON to Swift model parsing, we often want to attach parsing blocks to JSON promises, but without starting them.
In order to do that we need to use registerThen instead. It's the exact same thing as then without starting the promise right away.
let fetchUsers:Promise<[User]> = fetchUsersJSON().registerThen(parseUsersJSON)
// Here promise is not launched yet \o/
// later...
fetchUsers.then { users in
// YAY
}
Note that onError and finally also have their non-starting counterparts: registerOnError and registerFinally.
Oftentimes we need to return a rejecting promise, as such:
return Promise { _, reject in
reject(anError)
}
This can be written with the following shortcut:
return Promise.reject(error:anError)
With race, you can send multiple tasks and get the result of the first one to come back:
race(task1, task2, task3).then { work in
// The first result !
}
With .recover, you can provide a fallback value for a failed Promise.
You can:
.recover(with: 12)
.recover(MyError.defaultError, with: 12)
.recover { e in
if e == x { return 32 }
if e == y { return 143 }
throw MyError.defaultError
}
.recover { e -> Promise<Int> in
// Deal with the error then
return Promise<Int>.resolve(56)
// Or
return Promise<Int>.reject(e)
}
.recover(with: Promise<Int>.resolve(56))
Note that in the block version you can also throw your own error \o/
With .validate, you can break the promise chain with an assertion block.
You can:
For instance, checking if a user is allowed to drink alcohol:
fetchUserAge()
.validate { $0 > 18 }
.then { age in
// Offer a drink
}
.validate(withError: MyError.defaultError, { $0 > 18 })
A failed validation will return a PromiseError.validationFailed by default.
With retry, you can restart a failed Promise X number of times.
doSomething()
.retry(10)
.then { v in
// YAY!
}.onError { e in
// Failed 10 times in a row
}
With .bridgeError, you can intercept a low-level error and return your own high-level error. The classic use case is when you receive an API error and bridge it to your own domain error.
You can:
.bridgeError(to: MyError.defaultError)
.bridgeError(SomeError, to: MyError.defaultError)
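For example, here is a sketch with a hypothetical fetchUserProfile() call and a made-up domain error, bridging whatever the network layer throws into our own error before it reaches onError:
enum ProfileError: Error {
    case couldNotLoadProfile
}

fetchUserProfile()
    .bridgeError(to: ProfileError.couldNotLoadProfile)
    .then(showProfile)
    .onError { error in
        // Whatever low-level error occurred, we only ever see our domain error here.
        showErrorPopup(error)
    }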
With .whenAll, you can combine multiple calls and get all the results when all the promises are fulfilled:
whenAll(fetchUsersA(),fetchUsersB(), fetchUsersC()).then { allUsers in
// All the promises came back
}
With chain, you can add behaviours without changing the chain of Promises.
A common use case is adding analytics tracking, like so:
extension Photo {
public func post() -> Async<Photo> {
return api.post(self).chain { _ in
Tracker.trackEvent(.postPicture)
}
}
}
With noMatterWhat, you can add code to be executed in the middle of a promise chain, no matter what happens.
func fetchNext() -> Promise<[T]> {
isLoading = true
call.params["page"] = page + 1
return call.fetch()
.registerThen(parseResponse)
.resolveOnMainThread()
.noMatterWhat {
self.isLoading = false
}
}
With unwrap, you can transform an optional into a promise:
func fetch(userId: String?) -> Promise<Void> {
return unwrap(userId).then {
network.get("/user/\($0)")
}
}
unwrap will fail the promise chain with an unwrappingFailed error in case of a nil value :)
AsyncTask and Async<T> typealiases are provided for those of us who think that Async can be clearer than Promise. Feel free to replace Promise<Void> with AsyncTask and Promise<T> with Async<T> wherever needed.
This is purely for the eyes :)
awaitPromise synchronously waits for a promise to complete and yields the result:
let photos = try! awaitPromise(getPhotos())
async takes a block and wraps it in a background Promise.
async {
let photos = try awaitPromise(getPhotos())
}
Notice how we don't need the ! anymore, because async will catch the errors.
Together, async/awaitPromise enable us to write asynchronous code in a synchronous manner:
async {
let userId = try awaitPromise(fetchUserId())
let userName = try awaitPromise(fetchUserNameFromId(userId))
let isFollowed = try awaitPromise(fetchUserFollowStatusFromName(userName))
return isFollowed
}.then { isFollowed in
print(isFollowed)
}.onError { e in
// handle errors
}
Await comes with the .. shorthand operator. The ..? variant will fall back to a nil value instead of throwing.
let userId = try awaitPromise(fetchUserId())
Can be written like this:
let userId = try ..fetchUserId()
The Swift Package Manager (SPM) is now the official way to install Then. The other package managers are now deprecated as of 5.1.3 and won't be supported in future versions.
Xcode > File > Swift Packages > Add Package Dependency... > Paste https://github.com/freshOS/Then
target 'MyApp'
pod 'thenPromise'
use_frameworks!
github "freshOS/then"
S4cha, Max Konovalov, YannickDot, Damien, piterlouis
Reason - Example - Documentation - Installation
Author: freshOS
Source Code: https://github.com/freshOS/Then
License: MIT license
1672369860
Async PostgreSQL client built with Amp.
This package can be installed as a Composer dependency.
composer require amphp/postgres
Note: pecl-ev is not compatible with ext-pgsql. If you wish to use pecl-ev for the event loop backend, you must use pecl-pq.
Prepared statements and parameterized queries support named placeholders, as well as ? and standard numeric (i.e. $1) placeholders.
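As a small sketch (reusing the pool from the example below), the same kind of query can be written with a positional ? placeholder instead of a named one:
/** @var Postgres\ResultSet $result */
$result = yield $pool->execute("SELECT * FROM test WHERE id = ?", [1337]);

while (yield $result->advance()) {
    // Each row is an array (map) of column values, e.g. $row['column_name']
    $row = $result->getCurrent();
}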
More examples can be found in the examples directory.
use Amp\Postgres;
use Amp\Postgres\ConnectionConfig;
use Amp\Sql\Statement;
Amp\Loop::run(function () {
$config = ConnectionConfig::fromString("host=localhost user=postgres db=test");
/** @var Postgres\Pool $pool */
$pool = Postgres\pool($config);
/** @var Statement $statement */
$statement = yield $pool->prepare("SELECT * FROM test WHERE id = :id");
/** @var Postgres\ResultSet $result */
$result = yield $statement->execute(['id' => 1337]);
while (yield $result->advance()) {
$row = $result->getCurrent();
// $row is an array (map) of column values. e.g.: $row['column_name']
}
});
amphp/postgres follows the semver semantic versioning specification like all other amphp packages.
If you discover any security related issues, please email contact@amphp.org instead of using the issue tracker.
Author: amphp
Source Code: https://github.com/amphp/postgres
License: MIT license
1672362000
amphp/file allows non-blocking access to the filesystem for Amp.
This package can be installed as a Composer dependency.
composer require amphp/file
Extensions allow using threading in the background instead of using multiple processes.
amphp/file works out of the box without any PHP extensions. It uses multi-processing by default, but also comes with a blocking driver that just uses PHP's blocking functions in the current process.
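A minimal sketch of basic usage (assuming the fiber-based API where calls return values directly, as in recent amphp versions):
<?php
require __DIR__ . '/vendor/autoload.php';

use Amp\File;

// Write a file and read it back without blocking the event loop.
File\write(__DIR__ . '/hello.txt', "Hello, amphp/file!\n");

$contents = File\read(__DIR__ . '/hello.txt');
echo $contents;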
amphp/file follows the semver semantic versioning specification like all other amphp packages.
If you discover any security related issues, please email me@kelunik.com instead of using the issue tracker.
Author: amphp
Source Code: https://github.com/amphp/file
License: MIT license
1672353960
This library provides a RequestHandler to easily handle WebSocket connections using amphp/http-server.
This package can be installed as a Composer dependency.
composer require amphp/websocket-server
The documentation for this library is currently a work in progress. Pull requests to improve the documentation are always welcome!
<?php
// Note that this example requires:
// amphp/http-server-router
// amphp/http-server-static-content
// amphp/log
use Amp\Http\Server\HttpServer;
use Amp\Http\Server\Request;
use Amp\Http\Server\Response;
use Amp\Http\Server\Router;
use Amp\Http\Server\StaticContent\DocumentRoot;
use Amp\Log\ConsoleFormatter;
use Amp\Log\StreamHandler;
use Amp\Loop;
use Amp\Promise;
use Amp\Socket\Server;
use Amp\Success;
use Amp\Websocket\Client;
use Amp\Websocket\Message;
use Amp\Websocket\Server\ClientHandler;
use Amp\Websocket\Server\Gateway;
use Amp\Websocket\Server\Websocket;
use Monolog\Logger;
use function Amp\ByteStream\getStdout;
use function Amp\call;
require __DIR__ . '/vendor/autoload.php';
$websocket = new Websocket(new class implements ClientHandler {
private const ALLOWED_ORIGINS = [
'http://localhost:1337',
'http://127.0.0.1:1337',
'http://[::1]:1337'
];
public function handleHandshake(Gateway $gateway, Request $request, Response $response): Promise
{
if (!\in_array($request->getHeader('origin'), self::ALLOWED_ORIGINS, true)) {
return $gateway->getErrorHandler()->handleError(403);
}
return new Success($response);
}
public function handleClient(Gateway $gateway, Client $client, Request $request, Response $response): Promise
{
return call(function () use ($gateway, $client): \Generator {
while ($message = yield $client->receive()) {
\assert($message instanceof Message);
$gateway->broadcast(\sprintf(
'%d: %s',
$client->getId(),
yield $message->buffer()
));
}
});
}
});
Loop::run(function () use ($websocket): Promise {
$sockets = [
Server::listen('127.0.0.1:1337'),
Server::listen('[::1]:1337'),
];
$router = new Router;
$router->addRoute('GET', '/broadcast', $websocket);
$router->setFallback(new DocumentRoot(__DIR__ . '/public'));
$logHandler = new StreamHandler(getStdout());
$logHandler->setFormatter(new ConsoleFormatter);
$logger = new Logger('server');
$logger->pushHandler($logHandler);
$server = new HttpServer($sockets, $router, $logger);
return $server->start();
});
Author: amphp
Source Code: https://github.com/amphp/websocket-server
License: MIT license
1672350180
Catch-all SMTP server for local debugging purposes.
This SMTP server catches all e-mail being sent through it and provides an interface to inspect the e-mails.
Note: this SMTP server is meant to be run locally. As such several security considerations (e.g. SMTP transaction delays) have been omitted by design. Never run this project as a public service.
This project is currently working towards a first stable release version.
The master branch of this project will always be in a functioning state and will always point to the last release.
All active development should be based off the v0.4.0 branch.
Support for the AUTH command.
composer create-project peehaa/mailgrab
Download the latest phar file from the releases page.
./bin/mailgrab will start MailGrab using the default configuration.
See ./bin/mailgrab --help for more configuration options.
Once the MailGrab server is started, you can point your browser to http://localhost:9000 to access the web interface.
If you send a mail to the server over port 9025, it will automatically be displayed in the web interface.
There are example mail scripts available under ./examples (e.g. php examples/full-test.php) which you can run to test the functionality.
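As a quick sketch of pointing any SMTP client at MailGrab, here is an example using PHPMailer (not part of this project, purely an illustration); anything sent to localhost:9025 shows up in the web interface:
<?php
require __DIR__ . '/vendor/autoload.php';

use PHPMailer\PHPMailer\PHPMailer;

$mail = new PHPMailer(true);
$mail->isSMTP();
$mail->Host = 'localhost';
$mail->Port = 9025;      // MailGrab's default SMTP port
$mail->SMTPAuth = false; // no authentication needed locally

$mail->setFrom('sender@example.com', 'Sender');
$mail->addAddress('recipient@example.com', 'Recipient');
$mail->Subject = 'MailGrab test';
$mail->Body = 'Hello from a local test mail!';

$mail->send();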
/path/to/mailgrab.phar will start MailGrab using the default configuration.
See /path/to/mailgrab.phar --help for more configuration options.
To get started, run npm install.
An NPM build script is provided and can be used by running npm run build in the project root.
Currently all active development has to be based off the v0.4.0 branch.
If you want to build a phar, you can run the build script located at ./bin/build, which will create a new build in the ./build directory.
Author: PeeHaa
Source Code: https://github.com/PeeHaa/mailgrab
License: MIT license
1672342380
kelunik/acme is a non-blocking implementation of the ACME protocol based on the amp concurrency framework.
If you're looking for a PHP client, have a look at kelunik/acme-client, which is based on this library.
Required PHP Version
Installation
composer require kelunik/acme
This package follows semantic versioning.
Usage
You should be familiar with promises and amphp/amp. You can always use Amp\Promise\wait to use this async library in synchronous code.
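A minimal sketch of that synchronous escape hatch (the promise-returning fetchCertificateStatus() helper is made up; substitute any promise returned by this library):
<?php
require __DIR__ . '/vendor/autoload.php';

use Amp\Promise;
use function Amp\call;

// Hypothetical stand-in for an async operation returning an Amp\Promise.
function fetchCertificateStatus(): Promise
{
    return call(function () {
        return 'valid';
    });
}

// Amp\Promise\wait() blocks until the promise resolves, so the async API
// can be used from plain synchronous code.
$status = Promise\wait(fetchCertificateStatus());
var_dump($status); // string(5) "valid"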
Author: kelunik
Source Code: https://github.com/kelunik/acme
License: MIT license
1672334220
AMPHP is a collection of event-driven libraries for PHP designed with fibers and concurrency in mind. amphp/sync specifically provides synchronization primitives such as locks and semaphores for asynchronous and concurrent programming.
This package can be installed as a Composer dependency.
composer require amphp/sync
The weak link when managing concurrency is humans, so amphp/sync provides abstractions to hide some of the complexity.
Mutual exclusion can be achieved using Amp\Sync\synchronized() and any Mutex implementation, or by manually using the Mutex instance to acquire a Lock.
As long as the resulting Lock object isn't released, either using Lock::release() or by being garbage collected, the holder of the lock can exclusively run some code, as long as all other parties running the same code also acquire a lock before doing so.
function writeExclusively(Amp\Sync\Mutex $mutex, string $filePath, string $data) {
$lock = $mutex->acquire();
try {
Amp\File\write($filePath, $data);
} finally {
$lock->release();
}
}
function writeExclusively(Amp\Sync\Mutex $mutex, string $filePath, string $data) {
Amp\Sync\synchronized($mutex, fn () => Amp\File\write($filePath, $data));
}
Semaphores are another synchronization primitive in addition to mutual exclusion.
Instead of providing exclusive access to a single party, they provide access to a limited set of N parties at the same time. This makes them great to control concurrency, e.g. limiting an HTTP client to X concurrent requests, so the HTTP server doesn't get overwhelmed.
Similar to Mutex, Lock instances can be acquired using Semaphore::acquire(). Please refer to the Mutex documentation for additional usage documentation, as they're basically equivalent, except for the fact that a Mutex is always a Semaphore with a count of exactly one party.
In many cases you can use amphp/pipeline instead of directly using a Semaphore.
Given you have a list of URLs you want to crawl, let's discuss a few possible approaches. For simplicity, we will assume a fetch function already exists, which takes a URL and returns the HTTP status code (which is everything we want to know for these examples).
$urls = [...];
$results = [];
foreach ($urls as $url) {
$results[$url] = fetch($url);
}
var_dump($results);
Almost the same loop, but awaiting all operations at once; starts all requests immediately. Might not be feasible with too many URLs.
$urls = [...];
$results = [];
foreach ($urls as $url) {
$results[$url] = Amp\async(fetch(...), $url);
}
$results = Amp\Future\await($results);
var_dump($results);
Splitting the jobs into chunks of ten; all requests within a chunk are made concurrently, but each chunk sequentially, so the timing for each chunk depends on the slowest response; starts the eleventh request as soon as the first ten requests completed.
$urls = [...];
$results = [];
foreach (\array_chunk($urls, 10) as $chunk) {
$futures = [];
foreach ($chunk as $url) {
$futures[$url] = Amp\async(fetch(...), $url);
}
$results = \array_merge($results, Amp\Future\await($futures));
}
var_dump($results);
TODO: Link to example of amphp/pipeline
amphp/sync follows the semver semantic versioning specification like all other amphp packages.
If you discover any security related issues, please email me@kelunik.com instead of using the issue tracker.
Author: amphp
Source Code: https://github.com/amphp/sync
License: MIT license
1672330380
amphp/dns provides asynchronous DNS resolution for PHP based on Amp.
composer require amphp/dns
<?php
require __DIR__ . '/examples/_bootstrap.php';
use Amp\Dns;
use Amp\Loop;
Loop::run(function () {
$githubIpv4 = yield Dns\resolve("github.com", Dns\Record::A);
pretty_print_records("github.com", $githubIpv4);
$googleIpv4 = Amp\Dns\resolve("google.com", Dns\Record::A);
$googleIpv6 = Amp\Dns\resolve("google.com", Dns\Record::AAAA);
$firstGoogleResult = yield Amp\Promise\first([$googleIpv4, $googleIpv6]);
pretty_print_records("google.com", $firstGoogleResult);
$combinedGoogleResult = yield Amp\Dns\resolve("google.com");
pretty_print_records("google.com", $combinedGoogleResult);
$googleMx = yield Amp\Dns\query("google.com", Amp\Dns\Record::MX);
pretty_print_records("google.com", $googleMx);
});
Author: Amphp
Source Code: https://github.com/amphp/dns
License: MIT license