I was trying to see if there was a way to cache a JSON response from an async fetch call, possibly using an LRU cache.

I’ve tried using several packages, such as node-cache and lru-cache, but I don’t think they worked because my function is asynchronous.

This is what my fetch function basically looks like:

const jsonFetch = async (url) => {
    try {
        const response = await fetch(url)
        const json = await response.json();
        return json
    }
    catch (error) {
        console.log(error)
    }
}

For example, if someone hits my route 20 times in a minute, I’d like to return the response from a cache in roughly 0.03 ms instead of 0.3 ms. Currently, it always goes out to the URL to fetch the data.

2 Answers


  1. There’s nothing about async functions that will prevent caching results. It’s possible the libraries you’re looking at can’t handle the promises, but here’s a basic proof of concept that might help to get things started:

    let cache = {}
    const jsonFetch = async (url) => {
        if (url in cache) {                    // return cached result if available
            console.log("cache hit")
            return cache[url]
        }
        try {
            const response = await fetch(url)
            const json = await response.json()
            cache[url] = json                  // cache response keyed to url
            return json
        }
        catch (error) {
            console.log(error)
        }
    }
    
    jsonFetch("https://jsonplaceholder.typicode.com/todos/1").then((user) => console.log(user.id))
    
    // should be cached -- same url
    setTimeout(() => jsonFetch("https://jsonplaceholder.typicode.com/todos/1").then((user) => console.log(user.id)), 2000)
    
    // not in cache
    setTimeout(() => jsonFetch("https://jsonplaceholder.typicode.com/todos/2").then((user) => console.log(user.id)), 2000)

    You will only get cache hits on requests made after the first request has returned a value to cache.
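
    Since the question mentions lru-cache, the same pattern works on top of it as well; the async function is not the obstacle. Here is a minimal sketch, assuming a recent lru-cache release that exposes the LRUCache named export and the max/ttl options (older releases export the class directly):

    const { LRUCache } = require('lru-cache')
    
    const cache = new LRUCache({
        max: 100,          // keep at most 100 responses
        ttl: 1000 * 60     // drop entries after one minute
    })
    
    const jsonFetch = async (url) => {
        if (cache.has(url)) {              // return cached result if available
            console.log("cache hit")
            return cache.get(url)
        }
        try {
            const response = await fetch(url)
            const json = await response.json()
            cache.set(url, json)           // cache parsed response keyed to url
            return json
        }
        catch (error) {
            console.log(error)
        }
    }

    This keeps the same shape as the code above, but lets the library handle eviction and expiry instead of an unbounded plain object.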

  2. This has been here for a while, but I agree with the comment from @sleepy012: to avoid parallel calls, the trick is to cache the promise, not just the resolved value. So something like this should work:

    let cache = {}
    function cacheAsync(loader) {
      return async (url) => {
        if (url in cache) {                    // return cached result if available
            console.log("cache hit")
            return cache[url]
        }
        try {
            const responsePromise = loader(url)
            cache[url] = responsePromise
            return responsePromise
        }
        catch (error) {
            console.log('Error', error.message)
        }
      };
    }
    
    
    function delayedLoader(url) {
      console.log('Loading url: ' + url)
      return new Promise((r) => setTimeout(r, 1000,'Returning ' + url));
    }
    
    const cachedLoader = cacheAsync(delayedLoader);
    
    cachedLoader('url1').then((d) => console.log('First load got: ' + d));
    cachedLoader('url1').then((d) => console.log('Second load got: ' + d));
    cachedLoader('url2').then((d) => console.log('Third load got: ' + d));
    cachedLoader('url2').then((d) => console.log('Fourth load got: ' + d));
    console.log('Waiting for load to complete');
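
    One caveat worth noting with this approach: if the loader rejects, the failed promise stays in the cache and every later call for that url is served the same stale error. A minimal sketch of one way to handle that (the eviction line is an addition, not part of the original answer), dropping the entry when the promise rejects so the next call can retry:

    let cache = {}
    function cacheAsync(loader) {
      return (url) => {
        if (url in cache) {                // return cached promise if available
          console.log("cache hit")
          return cache[url]
        }
        const responsePromise = loader(url)
        cache[url] = responsePromise
        // evict failed loads so the next call retries instead of
        // being served the same cached rejection
        responsePromise.catch(() => { delete cache[url] })
        return responsePromise
      };
    }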