remote-web-streams

Web streams that work across web workers and <iframe>s.

Problem

Suppose you want to process some data that you've downloaded somewhere. The processing is quite CPU-intensive, so you want to do it inside a worker. No problem, the web has you covered with postMessage!

// main.js
(async () => {
  const response = await fetch('./some-data.txt');
  const data = await response.text();
  const worker = new Worker('./worker.js');
  worker.onmessage = (event) => {
    const output = event.data;
    const results = document.getElementById('results');
    results.appendChild(document.createTextNode(output)); // tadaa!
  };
  worker.postMessage(data);
})();

// worker.js
self.onmessage = (event) => {
  const input = event.data;
  const output = process(input); // do the actual work
  self.postMessage(output);
}

All is good: your processing does not block the main thread, so your web page remains responsive. However, it takes quite a long time before the results show up: first all of the data needs to be downloaded, then all that data needs to be processed, and finally everything is shown on the page. Wouldn't it be nice if we could already show something as soon as some of the data has been downloaded and processed?

Normally, you'd tackle this by reading the input as a stream, piping it through one or more transform streams, and finally displaying the results as they come in.

// main.js
(async () => {
  const response = await fetch('./some-data.txt');
  await response.body
    .pipeThrough(new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(process(chunk)); // do the actual work
      }
    }))
    .pipeTo(new WritableStream({
      write(chunk) {
        const results = document.getElementById('results');
        results.appendChild(document.createTextNode(chunk)); // tadaa!
      }
    }));
})();

Now you can see the first results as they come in, but your processing is blocking the main thread again! Can we get the best of both worlds: process data as it comes in, but off the main thread?

Solution

Enter: remote-web-streams. With this library, you can create pairs of linked streams: chunks written to the writable stream in one context can be read from the readable stream in a different context. Functionally, such a pair behaves just like an identity transform stream, and you can use and compose them just like any other stream.

Basic setup

RemoteReadableStream

The basic steps for setting up a pair of linked streams are:

  1. Construct a RemoteReadableStream. This returns two objects:
    • a writablePort: a MessagePort which must be used to construct the linked WritableStream inside the other context
    • a readable: a ReadableStream which will read chunks written by that linked WritableStream
// main.js
const { readable, writablePort } = new RemoteWebStreams.RemoteReadableStream();
  2. Transfer the writablePort to the other context, and instantiate the linked WritableStream in that context using fromWritablePort.
// main.js
const worker = new Worker('./worker.js');
worker.postMessage({ writablePort }, [writablePort]);

// worker.js
self.onmessage = (event) => {
  const { writablePort } = event.data;
  const writable = RemoteWebStreams.fromWritablePort(writablePort);
}
  3. Use the streams as usual! Whenever you write something to the writable inside one context, the readable in the other context will receive it.
// worker.js
const writer = writable.getWriter();
writer.write('hello');
writer.write('world');
writer.close();

// main.js
(async () => {
  const reader = readable.getReader();
  console.log(await reader.read()); // { done: false, value: 'hello' }
  console.log(await reader.read()); // { done: false, value: 'world' }
  console.log(await reader.read()); // { done: true, value: undefined }
})();

RemoteWritableStream

You can also create a RemoteWritableStream. This is the complement to RemoteReadableStream:

  • The constructor (in the original context) returns a WritableStream (instead of a readable one).
  • You transfer the readablePort to the other context, and instantiate the linked ReadableStream with fromReadablePort inside that context.
// main.js
const { writable, readablePort } = new RemoteWebStreams.RemoteWritableStream();
worker.postMessage({ readablePort }, [readablePort]);
const writer = writable.getWriter();
// ...

// worker.js
self.onmessage = (event) => {
  const { readablePort } = event.data;
  const readable = RemoteWebStreams.fromReadablePort(readablePort);
  const reader = readable.getReader();
  // ...
}

Examples

Remote transform stream

In the basic setup, we create one pair of streams and transfer one end to the worker. However, it's also possible to set up multiple pairs and transfer them all to a worker.

This opens up interesting possibilities. We can use a RemoteWritableStream to write chunks to a worker, let the worker transform them using one or more TransformStreams, and then read those transformed chunks back on the main thread using a RemoteReadableStream. This allows us to move one or more CPU-intensive TransformStreams off the main thread, and turn them into a "remote transform stream".

To demonstrate these "remote transform streams", we set one up to solve the original problem statement:

  1. Create a RemoteReadableStream and a RemoteWritableStream on the main thread.
  2. Transfer both streams to the worker. Inside the worker, connect the readable to the writable by piping it through one or more TransformStreams.
  3. On the main thread, write data to be transformed into the writable and read transformed data from the readable. Pro-tip: we can use .pipeThrough({ readable, writable }) for this!
// main.js
const { RemoteReadableStream, RemoteWritableStream } = RemoteWebStreams;
(async () => {
  const worker = new Worker('./worker.js');
  // create a stream to send the input to the worker
  const { writable, readablePort } = new RemoteWritableStream();
  // create a stream to receive the output from the worker
  const { readable, writablePort } = new RemoteReadableStream();
  // transfer the other ends to the worker
  worker.postMessage({ readablePort, writablePort }, [readablePort, writablePort]);

  const response = await fetch('./some-data.txt');
  await response.body
    // send the downloaded data to the worker
    // and receive the results back
    .pipeThrough({ readable, writable })
    // show the results as they come in
    .pipeTo(new WritableStream({
      write(chunk) {
        const results = document.getElementById('results');
        results.appendChild(document.createTextNode(chunk)); // tadaa!
      }
    }));
})();

// worker.js
const { fromReadablePort, fromWritablePort } = RemoteWebStreams;
self.onmessage = async (event) => {
  // create the input and output streams from the transferred ports
  const { readablePort, writablePort } = event.data;
  const readable = fromReadablePort(readablePort);
  const writable = fromWritablePort(writablePort);

  // process data
  await readable
    .pipeThrough(new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(process(chunk)); // do the actual work
      }
    }))
    .pipeTo(writable); // send the results back to main thread
};

With this setup, we achieve the desired goals:

  • Data is transformed as soon as it arrives on the main thread.
  • Transformed data is displayed on the web page as soon as it is transformed by the worker.
  • All of the data processing happens inside the worker, so it never blocks the main thread.

The results are shown as fast as possible, and your web page stays snappy. Great success! 🎉

Behind the scenes

The library works its magic by creating a MessageChannel between the WritableStream and the ReadableStream. The writable end sends a message to the readable end whenever a new chunk is written, so the readable end can enqueue it for reading. Similarly, the readable end sends a message to the writable end whenever it needs more data, so the writable end can release any backpressure.
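This protocol can be sketched in a few lines. The following is an illustration only, not the library's actual wire format: the createIdentityPair helper and the 'chunk' / 'close' / 'pull' message types are invented for this sketch. It runs anywhere MessageChannel and the WHATWG streams globals are available (modern browsers, Node.js 18+).

```javascript
// Sketch: an identity stream pair linked over a MessageChannel.
// NOT the library's real implementation; message shapes are illustrative.
function createIdentityPair() {
  const { port1, port2 } = new MessageChannel();

  // Writable end: forward each chunk over the channel, then wait for a
  // 'pull' message before accepting the next chunk (backpressure).
  let onPull = null;
  port1.onmessage = () => {
    const resolve = onPull;
    onPull = null;
    if (resolve) resolve();
  };
  const writable = new WritableStream({
    write(chunk) {
      port1.postMessage({ type: 'chunk', chunk });
      return new Promise((resolve) => { onPull = resolve; });
    },
    close() {
      port1.postMessage({ type: 'close' });
    }
  });

  // Readable end: enqueue incoming chunks, and signal the writable end
  // whenever the consumer is ready for more data.
  const readable = new ReadableStream({
    start(controller) {
      port2.onmessage = (event) => {
        const message = event.data;
        if (message.type === 'chunk') {
          controller.enqueue(message.chunk);
        } else if (message.type === 'close') {
          controller.close();
          port2.close();
        }
      };
    },
    pull() {
      // Simplified: a real implementation would also track outstanding
      // pulls and propagate errors and cancellation across the channel.
      port2.postMessage({ type: 'pull' });
    }
  });

  return { readable, writable };
}
```

In the library itself, the two ends live in different contexts, so the two MessagePorts are transferred via postMessage rather than kept in one scope as above.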
