Async channels?

The idea is something like this:

  • {reader, writer} = new Channel(bufferSize=Infinity): Create a channel.
    • Infinite buffer size allows any number of entries
    • A buffer size of 1 keeps only the latest value
    • A buffer size of 0 drops values unless there's a pending reader.next() waiter.
  • await writer.send(value): Send a value and return a promise that resolves once the value has been received. Multiple producers can call this concurrently, each getting its own promise.
    • Promises make backpressure easier to manage.
  • writer.close(): Close the channel. No need to wait. After all messages are received, the iterator ends.
  • reader is an async iterable iterator yielding each message sent to the channel, in the order sent. All iterator methods can be called concurrently by any consumer, which allows sharding work across consumers when useful (a rough sketch of the whole API follows this list).
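
For concreteness, here is a minimal, untested sketch of what a channel with the semantics above could look like. The internal names (takers, deliver, and so on) and the choice to resolve dropped/evicted sends immediately are just illustrative assumptions, not part of the proposal:

```js
class Channel {
  constructor(bufferSize = Infinity) {
    const buffer = []; // queued { value, deliver } entries awaiting a consumer
    const takers = []; // resolvers for pending reader.next() calls
    let closed = false;

    const writer = {
      send(value) {
        if (closed) return Promise.reject(new Error('channel is closed'));
        // If a consumer is already waiting, hand the value over directly.
        if (takers.length > 0) {
          takers.shift()({ value, done: false });
          return Promise.resolve();
        }
        // bufferSize === 0: no waiter, so the value is simply dropped.
        if (bufferSize === 0) return Promise.resolve();
        return new Promise(deliver => {
          // Finite buffers keep only the newest values; evicted sends still
          // resolve so producers are never left hanging (an assumption here).
          while (buffer.length >= bufferSize) buffer.shift().deliver();
          buffer.push({ value, deliver });
        });
      },
      close() {
        closed = true;
        // Wake every consumer waiting on an empty channel.
        while (takers.length > 0) takers.shift()({ value: undefined, done: true });
      },
    };

    const reader = {
      next() {
        if (buffer.length > 0) {
          const { value, deliver } = buffer.shift();
          deliver(); // resolve the corresponding send() promise
          return Promise.resolve({ value, done: false });
        }
        if (closed) return Promise.resolve({ value: undefined, done: true });
        return new Promise(resolve => takers.push(resolve));
      },
      [Symbol.asyncIterator]() {
        return this;
      },
    };

    return { reader, writer };
  }
}
```

In this sketch, send() only settles once a consumer takes the value (or the value is dropped or evicted), which is the backpressure signal mentioned above, and buffered messages are still delivered after close() before the iterator ends.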

This abstraction is extremely helpful when working in a pull-based world, and I've created variants of this multiple times for things ranging from CLIs to servers. And a multi-producer, multi-consumer channel is the most general way to do it (and isn't all that complicated to write in JS).

It also makes coordinating concurrency easier and integrates well with the structured concurrency paradigm.
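
As a hypothetical example of that coordination (building on the sketch above, and treating the exact API as an open question), a single producer and consumer could look like this:

```js
// The producer awaits each send for backpressure; the consumer drains the
// channel with for-await-of and finishes once the channel is closed.
const { reader, writer } = new Channel(); // unbounded buffer

async function produce() {
  for (let i = 0; i < 5; i++) {
    await writer.send(i); // resolves once a consumer has taken the value
  }
  writer.close(); // consumers finish after draining remaining messages
}

async function consume() {
  for await (const value of reader) {
    console.log('received', value);
  }
}

await Promise.all([produce(), consume()]);
```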


I am not proposing a SharedChannel here analogous to shared array buffers, though it would be nice to consider at some point. It's far more involved, and is basically a yak worth of proposals.

We have abstractions like that in endo/packages/stream at master · endojs/endo · GitHub and associated packages.

Paging @kriskowal who has a theory of composition based on iterators.


Indeed, thanks for tapping me. The Channel you propose is very similar to the Pipe in @endo/stream, and is a pair of entangled async iterators backed by async promise queues. It’s a very tidy abstraction that stands on top of the existing AsyncIterator. I could see the language eventually having an AsyncIterator.pipe() => { reader, writer } that provides this facility. Please take a look!
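
For readers unfamiliar with the pattern, here is a rough sketch of the general "entangled async iterators over promise queues" idea. It is not the actual @endo/stream code, only an illustration of the shape under my own naming:

```js
// An async queue: put() enqueues a value; get() returns a promise for the
// next value, even if the corresponding put() has not happened yet.
const makeQueue = () => {
  const values = [];
  const takers = [];
  return {
    put(value) {
      if (takers.length > 0) takers.shift()(value);
      else values.push(value);
    },
    get() {
      if (values.length > 0) return Promise.resolve(values.shift());
      return new Promise(resolve => takers.push(resolve));
    },
  };
};

// Two queues, one in each direction, entangle a writer and a reader:
// data flows writer -> reader, acknowledgements flow reader -> writer.
const makePipe = () => {
  const data = makeQueue();
  const acks = makeQueue();

  const writer = {
    // Sending enqueues a result for the reader, then waits for the reader's
    // acknowledgement, which is what gives the writer backpressure.
    async next(value) {
      data.put({ value, done: false });
      await acks.get();
      return { value: undefined, done: false };
    },
    async return() {
      data.put({ value: undefined, done: true });
      return { value: undefined, done: true };
    },
    [Symbol.asyncIterator]() { return this; },
  };

  const reader = {
    // Taking signals readiness to the writer, then awaits the next result.
    next() {
      acks.put(undefined);
      return data.get();
    },
    [Symbol.asyncIterator]() { return this; },
  };

  return { reader, writer };
};
```

Whether something like AsyncIterator.pipe() should return exactly this shape is of course a design question for the proposal.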
