The idea is something like this:
`{reader, writer} = new Channel(bufferSize=Infinity)`
: Create a channel.
  - Infinite buffer size allows any number of entries
  - A buffer size of 1 keeps only the latest value
  - A buffer size of 0 drops values unless there's a pending `reader.next()` waiter.
`await writer.send(value)`
: Send a value, returning a promise that resolves once it's received. Multiple producers can send concurrently, each receiving its own promise.
  - Promises make backpressure easier to manage.
`writer.close()`
: Close the channel. No need to wait: after all messages are received, the iterator ends.

`reader` is an async iterable iterator yielding each message sent to the channel, in the order it was sent. All iterator methods can be called concurrently by any consumer, which allows sharding work across consumers (when useful).
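To pin the semantics down a bit, here's a rough sketch of one way this could be written in plain JS. It assumes a few details the description above leaves open: a full size-1 buffer evicts the older value, and dropped or evicted values simply resolve their `send()` promise right away.

```js
class Channel {
  constructor(bufferSize = Infinity) {
    const buffer = []; // values sent but not yet received, oldest first
    const takers = []; // pending reader.next() waiters
    let closed = false;

    const reader = {
      async next() {
        if (buffer.length > 0) {
          const entry = buffer.shift();
          entry.received(); // resolve the matching send() promise
          return { value: entry.value, done: false };
        }
        if (closed) return { value: undefined, done: true };
        // Nothing buffered yet: wait for the next send() or close().
        return new Promise(resolve => takers.push(resolve));
      },
      [Symbol.asyncIterator]() { return this; },
    };

    const writer = {
      send(value) {
        if (closed) return Promise.resolve();
        // A pending reader receives the value immediately.
        if (takers.length > 0) {
          takers.shift()({ value, done: false });
          return Promise.resolve();
        }
        // bufferSize 0: no waiter, so the value is dropped.
        if (bufferSize === 0) return Promise.resolve();
        return new Promise(received => {
          buffer.push({ value, received });
          // bufferSize 1 keeps only the latest value: evict older entries.
          while (buffer.length > bufferSize) buffer.shift().received();
        });
      },
      close() {
        closed = true;
        // Wake any readers still waiting; buffered values stay until drained.
        while (takers.length > 0) takers.shift()({ value: undefined, done: true });
      },
    };

    return { reader, writer }; // replaces `this` as the construction result
  }
}
```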
This abstraction is extremely helpful when working in a pull-based world, and I've created variants of this multiple times for things ranging from CLIs to servers. And a multi-producer, multi-consumer channel is the most general way to do it (and isn't all that complicated to write in JS).
It also makes coordinating concurrency easier, and it integrates well with the structured concurrency paradigm.
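As a hypothetical usage example (relying on the sketch above, or any channel with the same surface), two producers and one consumer can be coordinated with nothing more than `Promise.all` and `for await`:

```js
async function main() {
  const { reader, writer } = new Channel();

  // Two producers share one writer; `await send` is the backpressure point.
  async function produce(name, count) {
    for (let i = 0; i < count; i++) {
      await writer.send(`${name}:${i}`);
    }
  }

  // One consumer drains the reader; more consumers could share it to shard work.
  async function consume() {
    for await (const message of reader) {
      console.log("got", message);
    }
  }

  const consuming = consume();
  await Promise.all([produce("a", 3), produce("b", 3)]);
  writer.close();  // once the buffer drains, the `for await` loop ends
  await consuming; // everything started here has finished before we return
}

main().catch(console.error);
```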
I am not proposing a `SharedChannel` here analogous to shared array buffers, though it would be nice to consider at some point. It's far more involved, and is basically a yak's worth of proposals.