
I know that generators paired with iterators can be used to write async code, but what’s the benefit of doing so? I’ve heard it offers more fine-grained control than promises, but so far I haven’t seen a case of that. I also believe I’m right in saying that while generators were considered for the implementation of async/await, all JS engines use promises to implement it. Here’s an example of using generators to handle async operations:

function fetchData() {
  return new Promise((resolve, reject) => {
    // Generator function
    function* fetchGenerator() {
      try {
        console.log("Fetching data...");
        const data1 = yield fetch('https://api.example.com/data1');
        console.log("Data 1 fetched");
        const data2 = yield fetch('https://api.example.com/data2');
        console.log("Data 2 fetched");
        resolve({ data1, data2 });
      } catch (error) {
        reject(error);
      }
    }
    
    // Create generator iterator
    const generator = fetchGenerator();
    
    // Function to handle generator iteration
    function handleAsync(generator, yieldedValue) {
      const { value, done } = generator.next(yieldedValue);
      if (done) {
        resolve(value);
      } else {
        Promise.resolve(value)
          .then(
            result => handleAsync(generator, result),
            error => generator.throw(error)
          );
      }
    }
    
    handleAsync(generator);
  });
}

fetchData().then(result => {
  console.log("All data fetched:", result);
}).catch(error => {
  console.error("Error fetching data:", error);
});
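
For comparison, here is a minimal async/await version of the same flow, which is essentially what the handleAsync driver above does by hand (the function is renamed fetchDataAsync here only so it doesn't clash with the version above; the URLs are the same placeholders, and the raw fetch responses are passed through unchanged, just as in the generator version):

async function fetchDataAsync() {
  console.log("Fetching data...");
  const data1 = await fetch('https://api.example.com/data1');
  console.log("Data 1 fetched");
  const data2 = await fetch('https://api.example.com/data2');
  console.log("Data 2 fetched");
  return { data1, data2 };
}

fetchDataAsync().then(result => {
  console.log("All data fetched:", result);
}).catch(error => {
  console.error("Error fetching data:", error);
});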

2 Answers


  1. I usually use generators to create iterators. A typical scenario is iterating over paged data, which I find very convenient. That said, if you can write simpler, more readable code without generators, you'd better do so. So it depends.

    In async code, using generators usually means async generators combined with for await...of.

    <script type="module">
    
    const fetch = url => new Promise(r => console.log('fetching', url) ?? setTimeout(() => r(Math.random()), 500));
    
    async function* fetchPages(pages) {
      for(let i = 1; i <= pages; i++){
        const num = await fetch('https://api.example.com/data' + i);
        yield {num};
      }
    }
    
    for await(const data of fetchPages(10)){
      console.log('data', JSON.stringify(data));
      // do something with data
      if(data.num < await .3){ // some async code inside for await
        console.log('found condition, breaking');
        break;
      }
    }
    
    </script>

    An interesting scenario is to wrap Promise.race() in an async generator for more convenient race handling:

    <script type="module">
    
    const fetch = url => new Promise(r => console.log('fetching', url) ?? setTimeout(() => r(Math.random()), Math.random() * 1000));
    
    async function* fetchPages(pages) {
      // Resolve each promise to {out, num} so the race winner can be identified
      // and removed only after it has been consumed; splicing as soon as a promise
      // settles could drop a result that settles while no race is pending.
      const promises = Array.from({length: pages}, (_, i) => {
        const out = fetch('https://api.example.com/data' + (i + 1)).then(num => ({out, num}));
        return out;
      });
      while(promises.length){
        const {out, num} = await Promise.race(promises);
        promises.splice(promises.indexOf(out), 1);
        yield {num};
      }
    }
    
    for await(const data of fetchPages(10)){
      console.log('data', JSON.stringify(data));
      // do something with data
      if(data.num < await .3){ // some async code inside for await
        console.log('found condition, breaking');
        break;
      }
    }
    
    </script>

    There’s also the new Array.fromAsync, which can create an array from an async generator, for example:

    <script type="module">
    
    const fetch = url => new Promise(r => console.log('fetching', url) ?? setTimeout(() => r(Math.random()), 500));
    
    async function* fetchPages(pages) {
      for(let i = 1; i <= pages; i++){
        const num = await fetch('https://api.example.com/data' + i);
        yield {num};
      }
    }
    
    console.log(await Array.fromAsync(fetchPages(3)));
    
    </script>
  2. I used it once in a personal project to handle pagination:


        async function* getSubredditIterator<TGetSubredditArgs extends GetSubredditArgs>(
            args: TGetSubredditArgs
        ): AsyncIterator<GetSubredditResponse<TGetSubredditArgs>> {
            let after: string | null = null
    
            // Endless by design: the consumer pulls pages on demand and decides when to stop
            while (true) {
                const res: GetSubredditResponse<TGetSubredditArgs> = await getSubreddit({
                    after,
                    ...args,
                })
    
                after = res.data.after
    
                yield res
            }
        }
    
    

    Consumer:

    const memeSubredditIterator = reddit.getSubredditIterator({
        name: "meme",
        sortMethod: SortingMethod.New,
        limit: 5,
    })
    
    const memeResults0To4 = await memeSubredditIterator.next()
    const memeResults5To9 = await memeSubredditIterator.next()
    

    In my case, I would say that the benefit was that the consumer didn't have to handle the pagination itself.

    Otherwise, it would have to save the "after" cursor itself between each request, roughly as sketched below.
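
    For contrast, here is a rough sketch of what that consumer code might look like without the generator. getSubreddit, SortingMethod, and the data.after field are taken from the snippets above; the rest is just illustrative.

    // Without the generator, the consumer has to carry the "after" cursor between calls.
    let after = null

    const memeResults0To4 = await getSubreddit({
        name: "meme",
        sortMethod: SortingMethod.New,
        limit: 5,
        after,
    })
    after = memeResults0To4.data.after

    const memeResults5To9 = await getSubreddit({
        name: "meme",
        sortMethod: SortingMethod.New,
        limit: 5,
        after,
    })
    after = memeResults5To9.data.after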
