Building on the `await` syntax from an earlier gist of mine:
```js
const users = ["domenic", "lukehoban", "erights"];

function userUrl(userName) {
  return "http://example.com/users/" + encodeURIComponent(userName);
}

// Returns a Promise<UserData>
function^ getUserData(userName) {
  return JSON.parse(await doXHR(userUrl(userName)));
}
```
You can define a lazy sequence of promises in pure ES6:
```js
// Returns a Generator<Promise<UserData>>
function* getAllUserDataLazily() {
  for (const userName of users) {
    yield getUserData(userName);
  }
}
```
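That laziness is observable in plain ES6 today. A minimal runnable sketch, with a stub `getUserData` (an assumption here, standing in for the XHR-backed version above) that records which users have actually been requested:

```js
// Stub standing in for the XHR-backed getUserData (assumption for illustration).
function getUserData(userName) {
  getUserData.calls.push(userName);
  return Promise.resolve({ name: userName });
}
getUserData.calls = [];

function* getAllUserDataLazily() {
  for (const userName of ["domenic", "lukehoban", "erights"]) {
    yield getUserData(userName);
  }
}

const iter = getAllUserDataLazily();
iter.next(); // starts the request for "domenic" only
console.log(getUserData.calls); // ["domenic"] -- "lukehoban" hasn't been requested yet
```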
This is lazy because `yield` suspends execution, so we don't call `getUserData("lukehoban")` until the consumer calls `.next()` twice. The consumer code could look like
```js
for (const dataPromise of getAllUserDataLazily()) {
  console.log(await dataPromise);
}
```
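Since the `await` in that loop body depends on the hypothetical `function^`, it's worth spelling out what the loop desugars to in plain ES6: pull one promise from the generator, wait for it to settle, then pull the next. A sketch with stubbed-out producers (the stub bodies are assumptions for illustration):

```js
// Stubs standing in for the definitions above (assumptions for illustration).
function getUserData(userName) {
  return Promise.resolve({ name: userName });
}

function* getAllUserDataLazily() {
  for (const userName of ["domenic", "lukehoban", "erights"]) {
    yield getUserData(userName);
  }
}

const logged = [];

// Pull-one-wait-one: the generator isn't advanced until the previous
// promise has settled, preserving the laziness of the await-based loop.
function logAllUserData() {
  const iter = getAllUserDataLazily();
  function step() {
    const { done, value } = iter.next();
    if (done) return Promise.resolve();
    return value.then(data => {
      logged.push(data.name);
      console.log(data.name);
      return step();
    });
  }
  return step();
}

const doneLogging = logAllUserData(); // kicks off the sequential consumption
```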
Maybe we could sugar this to
```js
for (await const data of getAllUserDataLazily()) {
  console.log(data);
}
```
The problem with the "Lazy sequence of promises" example is that we lose the ability to use the `await` keyword inside the generator, since it's a `function*` and not a `function^`. So, for example, if we got the list of users asynchronously, we would have to revert to using promises manually:
```js
// Returns a Promise<Generator<Promise<UserData>>>
function getAllUserDataLazilyForUnknownUsers() {
  return getUserNames().then(function* (users) {
    for (const userName of users) {
      yield getUserData(userName);
    }
  });
}
```
What if we had `function^*` to make this work?
```js
// Returns a Promise<Generator<Promise<UserData>>>
function^* getAllUserDataLazilyForUnknownUsers() {
  const users = await getUserNames();
  for (const userName of users) {
    yield getUserData(userName);
  }
}
```
In both cases, the consuming code would look like
```js
for (const userPromise of await getAllUserDataLazilyForUnknownUsers()) {
  console.log(await userPromise);
}
```
What does this do?
```js
// Returns ???
function^* getAllUserDataEagerly() {
  for (const userName of users) {
    yield await getUserData(userName);
  }
}
```
`await getUserData(userName)` should presumably give back a `UserData`, not a `Promise<UserData>`. So if you `yield` that, the type signature should be `Generator<UserData>`. But of course the asynchronicity introduced by `await` means that you can't actually give it to the consumer the moment they call `.next()`. So I guess that means it should be a `Promise<Generator<UserData>>`?
This is probably no good, because we need to wait for all iterations of the loop before we can fulfill the promise. So there is no real interaction between the consuming code calling `.next()` and the producer code resuming. It could be made to work: you'd need to run the entire body, drain the generator, then re-pack it into an un-drained generator, then fulfill the promise with that un-drained generator. Ick.
And I am not sure these semantics even match up with the ones for the previous example.
So this is kind of sucky.
Per some other thoughts I had, I think readable streams can be made to work as having a `read()` method that returns a promise for an `ArrayBuffer` or `undefined`. But @annevk pointed out that it's annoying to have to call `read()` twice at the end of the stream: once to get the last piece of data, and once again to get the `undefined` EOF signal. He suggested a `{ done, value }` structure, so that the end of the stream would be represented by `{ done: true, value: someData }`. This got me thinking...
What if readable streams were actually just "async iterables", with `next()` methods returning `Promise<{ done, value }>` instead of the usual `{ done, value }`? You could maybe consume them with something like
```js
for (await const data of stream) {
  console.log(data);
}
```
Note that this `for (await ... of ...)` is different from the potential sugar suggested under "Lazy sequence of promises."
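A minimal sketch of such an async iterable in plain ES6 (the stream contents and helper names are made up): `next()` returns `Promise<{ done, value }>`, and a recursive `.then` loop stands in for the `for (await ... of ...)` sugar:

```js
// A toy readable stream whose next() returns Promise<{ done, value }>.
function makeStream(chunks) {
  let i = 0;
  return {
    next() {
      return i < chunks.length
        ? Promise.resolve({ done: false, value: chunks[i++] })
        : Promise.resolve({ done: true, value: undefined });
    }
  };
}

// What the for (await const data of stream) loop would have to do:
// wait for each { done, value } before asking for the next one.
function consume(stream, onData) {
  return stream.next().then(result => {
    if (result.done) return;
    onData(result.value);
    return consume(stream, onData);
  });
}

consume(makeStream(["a", "b", "c"]), data => console.log(data));
```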