
I have a question about handling promises in resolver functions for a GraphQL client. Traditionally, resolvers are implemented on the server, but I am wrapping a REST API on the client.

Background and Motivation

Given resolvers like:

const resolvers = {
  Query: {
    posts: (obj, args, context) => {
      return fetch('/posts').then(res => res.json());
    }
  },
  Post: {
    author: (obj, args, _, context) => {
      return fetch(`/users/${obj.userId}`)
        .then(res => res.json())
        .then(data => {
          cache.users[data.id] = data;
          return data;
        });
    }
  }
};

If I run the query:

posts {
  author {
    firstName
  }
}

and the /posts endpoint called by Query.posts() returns four post objects:

[
  {
    "id": 1,
    "body": "It's a nice prototyping tool",
    "user_id": 1
  },
  {
    "id": 2,
    "body": "I wonder if he used logo?",
    "user_id": 2
  },
  {
    "id": 3,
    "body": "Is it even worth arguing?",
    "user_id": 1
  },
  {
    "id": 4,
    "body": "Is there a form above all forms? I think so.",
    "user_id": 1
  }
]

the Post.author() resolver will get called four times to resolve the author field.

graphql-js has a very nice feature where the promises returned from the Post.author() resolver execute in parallel.

I’ve further been able to eliminate re-fetching authors with the same userId using Facebook’s DataLoader library. BUT, I’d like to use a custom cache instead of DataLoader.

The Question

Is there a way to prevent the Post.author() resolver from executing in parallel? Inside the Post.author() resolver, I would like to fetch authors one at a time, checking my cache between requests to prevent duplicate HTTP requests.

But right now, the promises returned from Post.author() are all created and run at once, so I cannot check the cache before each request.
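Conceptually, I’m after something like the following sketch, where lookups are serialized through a shared promise chain (`fetchUser` is a hypothetical stand-in for my real fetch-based call, and `cache` is a simple in-memory object):

```javascript
// Sketch of the behavior I want: serialize author lookups through a shared
// promise chain so the cache can be checked before each request.
// `fetchUser` is a hypothetical stand-in for the real HTTP call.
const cache = { users: {} };

let queue = Promise.resolve();

function fetchAuthorSequentially(userId, fetchUser) {
  // Chain every lookup onto the same queue so only one runs at a time.
  queue = queue.then(() => {
    if (cache.users[userId]) {
      return cache.users[userId]; // cache hit: skip the HTTP request
    }
    return fetchUser(userId).then(user => {
      cache.users[userId] = user;
      return user;
    });
  });
  return queue;
}
```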

Thank you for any tips!

2 Answers


  1. I definitely recommend looking at DataLoader, as it’s designed to solve exactly this problem. If you don’t use it directly, you can at least read its implementation (which is not that many lines) and borrow its techniques on top of your custom cache.

    GraphQL and the graphql.js library themselves are not concerned with loading data – they leave that up to you via resolver functions. graphql.js simply calls these resolver functions as eagerly as it can to provide the fastest overall execution of your query. You can absolutely decide to return promises that resolve sequentially (which I wouldn’t recommend), or—as DataLoader implements—deduplicate with memoization (which is what you want here).

    For example:

    const resolvers = {
      Post: {
        author: (obj, args, _, context) => {
          return fetchAuthor(obj.userId)
        }
      }
    };
    
    // Very simple memoization: one promise per author id
    const authorPromises = {};
    function fetchAuthor(id) {
      let author = authorPromises[id];
      if (!author) {
        author = fetch(`/users/${id}`)
          .then(res => res.json())
          .then(data => {
            cache.users[data.id] = data;
            return data;
          });
        authorPromises[id] = author;
      }
      return author;
    }
    
  2. Just for those who use a data source for REST API calls along with DataLoader (in this case DataLoader doesn’t really help, since each load is a single request), here is a simple caching solution/example.

    import { RESTDataSource } from 'apollo-datasource-rest'
    import DataLoader from 'dataloader'
    import cache from 'memory-cache'
    
    export class RetrievePostAPI extends RESTDataSource {
      constructor() {
        super()
        this.baseURL = 'http://localhost:3000/'
      }
    
      postLoader = new DataLoader(async ids =>
        Promise.all(
          ids.map(id => {
            if (cache.keys().includes(id)) {
              return cache.get(id)
            }
            // this.get() already returns a promise; no need to wrap it
            const postPromise = this.get(`posts/${id}`)
            cache.put(id, postPromise, 1000 * 60) // cache for one minute
            return postPromise
          })
        )
      )
    
      async getPost(id) {
        return this.postLoader.load(id)
      }
    }
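
    For completeness, consuming this data source from a resolver would look roughly like the following sketch (the `postAPI` name is just an assumption for how the data source is registered):

```javascript
// Sketch: using the data source from a resolver. Assumes the data source
// is registered under the (hypothetical) name `postAPI` when the Apollo
// server is constructed, e.g.
//   dataSources: () => ({ postAPI: new RetrievePostAPI() })
const resolvers = {
  Query: {
    post: (obj, { id }, { dataSources }) => dataSources.postAPI.getPost(id),
  },
}
```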
    

    Note: I use memory-cache for the caching mechanism here.
    Hope this helps.
