
irrustible / async-oneshot

Licence: MPL-2.0
A fast, small, full-featured, no-std compatible oneshot channel

Programming Languages

rust

Projects that are alternatives of or similar to async-oneshot

swift-futures
Demand-driven asynchronous programming in Swift
Stars: ✭ 32 (-41.82%)
Mutual labels:  concurrency, channels, futures
mux-stream
(De)multiplex asynchronous streams
Stars: ✭ 34 (-38.18%)
Mutual labels:  concurrency, async-await, futures
swoole-futures
⏳ Futures, Streams & Async/Await for PHP's Swoole asynchronous run-time.
Stars: ✭ 100 (+81.82%)
Mutual labels:  concurrency, async-await, futures
awesome-dotnet-async
A curated list of awesome articles and resources for learning and practicing async, threading, and channels on the .NET platform. 😉
Stars: ✭ 84 (+52.73%)
Mutual labels:  channels, async-await
java-red
Effective Concurrency Modules for Java
Stars: ✭ 25 (-54.55%)
Mutual labels:  concurrency, futures
futureproof
Bulletproof concurrent.futures
Stars: ✭ 36 (-34.55%)
Mutual labels:  concurrency, futures
Ea Async
EA Async implements async-await methods in the JVM.
Stars: ✭ 1,085 (+1872.73%)
Mutual labels:  concurrency, async-await
await-lock
Mutex locks for async functions
Stars: ✭ 66 (+20%)
Mutual labels:  concurrency, async-await
conquerant
lightweight async/await for Clojure
Stars: ✭ 31 (-43.64%)
Mutual labels:  concurrency, async-await
pygolang
Go-like features for Python and Cython. (mirror of https://lab.nexedi.com/kirr/pygolang)
Stars: ✭ 37 (-32.73%)
Mutual labels:  concurrency, channels
async-enumerable-dotnet
Experimental operators for C# 8 IAsyncEnumerables
Stars: ✭ 32 (-41.82%)
Mutual labels:  concurrency, async-await
Smol
A small and fast async runtime for Rust
Stars: ✭ 2,206 (+3910.91%)
Mutual labels:  concurrency, futures
Brightfutures
Write great asynchronous code in Swift using futures and promises
Stars: ✭ 1,890 (+3336.36%)
Mutual labels:  concurrency, futures
python3-concurrency
Theoretical validation for a Python3 crawler series: first studies I/O models, implementing TCP servers and clients in Python under blocking I/O, non-blocking I/O, and I/O multiplexing; then compares the efficiency of synchronous I/O (sequential download, multi-process concurrency, multi-thread concurrency) against asynchronous I/O (asyncio).
Stars: ✭ 49 (-10.91%)
Mutual labels:  concurrency, futures
Go Concurrency
This repos has lots of Go concurrency, goroutine and channel usage and best practice examples
Stars: ✭ 84 (+52.73%)
Mutual labels:  concurrency, channels
Async-Channel
Python async multi-task communication library. Used by OctoBot project.
Stars: ✭ 13 (-76.36%)
Mutual labels:  concurrency, channels
soabase-stages
A tiny library that makes staged/pipelined CompletableFutures much easier to create and manage
Stars: ✭ 23 (-58.18%)
Mutual labels:  concurrency, futures
P Map
Map over promises concurrently
Stars: ✭ 639 (+1061.82%)
Mutual labels:  concurrency, async-await
Asyncio
asyncio historical repository
Stars: ✭ 952 (+1630.91%)
Mutual labels:  concurrency, async-await
Shift
Light-weight EventKit wrapper.
Stars: ✭ 31 (-43.64%)
Mutual labels:  concurrency, async-await

async-oneshot


A fast, small, full-featured, async-aware oneshot channel.

Features:

  • Blazing fast! See the Performance section below.
  • Tiny code, only one dependency, and a lightning-quick build.
  • Complete no_std support (with alloc for Arc).
  • Unique feature: the sender may wait for a receiver to be waiting (see the sketch after the Usage example).

Usage

use async_oneshot::oneshot;
use futures_lite::future;

#[test]
fn success_one_thread() {
    // Create a channel, send a value, then block on the receiver.
    // As of v0.5.0, send() takes &mut self, so the sender binding is mut.
    let (mut s, r) = oneshot::<bool>();
    assert_eq!((), s.send(true).unwrap());
    assert_eq!(Ok(true), future::block_on(r));
}
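
The sender-wait feature from the list above can be exercised like this. This is a rough sketch under assumptions: it takes wait() to consume the Sender and resolve to Result<Sender, Closed> once the Receiver is being awaited; check the crate documentation for the exact signature.

use async_oneshot::oneshot;
use futures_lite::future;

fn main() {
    let (s, r) = oneshot::<i32>();

    // Sender side: block until a receiver is actually waiting, then send.
    // Assumed: wait() hands the Sender back on success.
    let sender = std::thread::spawn(move || {
        let mut s = future::block_on(s.wait()).expect("receiver dropped");
        s.send(42).unwrap();
    });

    // Receiver side: awaiting the Receiver is exactly what wait() observes.
    assert_eq!(Ok(42), future::block_on(r));
    sender.join().unwrap();
}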

Performance

async-oneshot comes with a benchmark suite which you can run with cargo bench.

All benches are single-threaded and take double-digit nanoseconds on my machine. The async benches use futures_lite::future::block_on as the executor.
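
For a concrete picture of how such a measurement is wired up, here is a sketch of a criterion bench that drives a full create/send/receive round trip through block_on. The bench name, layout, and setup are illustrative assumptions, not copied from the crate's benches directory:

use async_oneshot::oneshot;
use criterion::{criterion_group, criterion_main, Criterion};
use futures_lite::future::block_on;

fn roundtrip(c: &mut Criterion) {
    c.bench_function("roundtrip/success", |b| {
        b.iter(|| {
            // Create, send and receive in one measured iteration.
            let (mut s, r) = oneshot::<u32>();
            s.send(1).unwrap();
            block_on(r).unwrap()
        })
    });
}

criterion_group!(benches, roundtrip);
criterion_main!(benches);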

Numbers from my machine

Here are benchmark numbers from my primary machine, a Ryzen 9 3900X running Alpine Linux 3.12, which I attempted to peg at its maximum CPU frequency:

create_destroy          time:   [51.596 ns 51.710 ns 51.835 ns]
send/success            time:   [13.080 ns 13.237 ns 13.388 ns]
send/closed             time:   [25.304 ns 25.565 ns 25.839 ns]
try_recv/success        time:   [26.136 ns 26.246 ns 26.335 ns]
try_recv/empty          time:   [10.764 ns 11.161 ns 11.539 ns]
try_recv/closed         time:   [27.048 ns 27.159 ns 27.248 ns]
async.recv/success      time:   [30.532 ns 30.774 ns 31.011 ns]
async.recv/closed       time:   [28.112 ns 28.208 ns 28.287 ns]
async.wait/success      time:   [56.449 ns 56.603 ns 56.737 ns]
async.wait/closed       time:   [34.014 ns 34.154 ns 34.294 ns]

In short, we are very fast. Close to optimal, I think.

Compared to other libraries

The oneshot channel in the futures crate isn't very fast by comparison.

Tokio put up an excellent fight and made us work hard to improve. In general I'd say we're slightly faster overall, but it's incredibly tight.

Note on safety

This crate uses UnsafeCell and manually synchronises with atomic bitwise ops for performance. We believe it is correct, but we would welcome more eyes on it.
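
For readers unfamiliar with the pattern, here is a minimal illustrative sketch of the general technique, not async-oneshot's actual code: a bitflag state in an AtomicUsize publishes a write into an UnsafeCell, with Release/Acquire ordering making the value visible to the other side. Waker handling and protection against reading the value twice are omitted.

// Illustrative sketch of the general pattern only; NOT the crate's implementation.
use core::cell::UnsafeCell;
use core::mem::MaybeUninit;
use core::sync::atomic::{AtomicUsize, Ordering};

const SENT: usize = 0b01;   // a value has been written
const CLOSED: usize = 0b10; // one side has gone away

struct Slot<T> {
    state: AtomicUsize,
    value: UnsafeCell<MaybeUninit<T>>,
}

// Sharing across threads is sound because all access is gated by `state`.
unsafe impl<T: Send> Sync for Slot<T> {}

impl<T> Slot<T> {
    fn new() -> Self {
        Slot {
            state: AtomicUsize::new(0),
            value: UnsafeCell::new(MaybeUninit::uninit()),
        }
    }

    // Sender side: write the value, then publish it with a Release store.
    fn send(&self, value: T) -> Result<(), T> {
        if self.state.load(Ordering::Acquire) & CLOSED != 0 {
            return Err(value);
        }
        // Safety: only the sender writes, and only before SENT is set.
        unsafe { (*self.value.get()).write(value); }
        self.state.fetch_or(SENT, Ordering::Release);
        Ok(())
    }

    // Receiver side: an Acquire load of SENT makes the write visible.
    fn try_recv(&self) -> Option<T> {
        if self.state.load(Ordering::Acquire) & SENT != 0 {
            // Safety: SENT guarantees the slot was initialised exactly once;
            // a real implementation must also prevent a second read.
            Some(unsafe { (*self.value.get()).as_ptr().read() })
        } else {
            None
        }
    }
}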

See Also

Note on benchmarking

The benchmarks are synthetic and a bit of fun.

Changelog

v0.5.0

Breaking changes:

  • Sender::send() now takes &mut self rather than consuming the sender.

v0.4.2

Improvements:

  • Added tests covering the repeated-waking fix released in the last version.
  • Inlined more aggressively, for some nice benchmark boosts.

v0.4.1

Fixes:

  • Removed some overzealous debug_asserts that caused crashes during development in the case of repeated waking. Thanks @nazar-pc!

Improvements:

  • Better benchmarks, based on criterion.

v0.4.0

Breaking changes:

  • Sender::wait()'s signature changed to a non-async fn returning an impl Future. This reduces binary size, runtime overhead, and possibly memory usage too. Thanks @zserik!

Fixes:

  • Race condition where the sender closed in a narrow window during a receiver poll and did not wake the Receiver. Thanks @zserik!

Improvements:

  • Static assertions. Thanks @zserik!

v0.3.3

Improvements:

  • Update futures-micro and improve the tests

v0.3.2

Fixes:

  • Segfault when dropping receiver. Caused by a typo, d'oh! Thanks @boardwalk!

v0.3.1

Improvements:

  • Remove redundant use of ManuallyDrop with UnsafeCell. Thanks @cynecx!

v0.3.0

Improvements:

  • Rewrote, benchmarked and optimised.

v0.2.0

  • First real release.

Copyright and License

Copyright (c) 2020 James Laver, async-oneshot contributors.

This Source Code Form is subject to the terms of the Mozilla Public License, v. 2.0. If a copy of the MPL was not distributed with this file, You can obtain one at http://mozilla.org/MPL/2.0/.
