
I’m trying to understand if the node-cache package uses locks for the cache object and can’t find anything.

I looked at the source code and it doesn’t appear to, but this answer suggests otherwise with the quote:

So there is Redis and node-cache for memory locks.

This cache is used in a CRUD server and I want to make sure that GET/UPDATE requests will not create a race condition on the data.

2 Answers


  1. I don’t see any evidence of locking in the code.

    If two requests for the same key arrive one after the other while that key is not yet in the cache, the code will launch two separate fetch() operations, and whichever completes last is the value that remains in the cache. That is usually harmless, but an improved implementation would launch only one fetch() for the key and have the second request wait for the result that is already in flight.
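    That improvement can be sketched by tracking in-flight fetches in a Map and handing the same pending promise to every caller asking for that key. This is a minimal sketch, not node-cache's actual code: `getOrFetch` and the `loadValue` callback are hypothetical names, and a plain `Map` stands in for the cache since it exposes the same `get`/`set` shape.

    ```javascript
    // Promises for fetches that are currently in flight, keyed by cache key.
    const inFlight = new Map();

    async function getOrFetch(cache, key, loadValue) {
      const cached = cache.get(key);
      if (cached !== undefined) return cached;

      // A fetch for this key is already running: reuse its promise
      // instead of launching a second fetch.
      if (inFlight.has(key)) return inFlight.get(key);

      const pending = loadValue(key).then(
        (value) => {
          cache.set(key, value);
          inFlight.delete(key);
          return value;
        },
        (err) => {
          // Drop the entry so a later request can retry the fetch.
          inFlight.delete(key);
          throw err;
        }
      );
      inFlight.set(key, pending);
      return pending;
    }
    ```

    With this, two back-to-back requests for a missing key trigger only one `loadValue` call; the second request simply awaits the promise the first one created.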

    Since the cache itself is entirely in-memory, all access to it is synchronous and therefore serialized by JavaScript’s single-threaded execution. So the only place concurrency issues can arise in the cache code itself is when it launches an asynchronous fetch() operation.

    There are, of course, race conditions waiting to happen in how one uses the code that accesses the data, just as there are with a database interface: the calling code has to be careful about how it uses the interface so that its own call patterns don’t create race conditions.
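    A classic caller-side race is a read-modify-write where an `await` sits between the `get` and the `set`. This is a minimal sketch under stated assumptions: `unsafeIncrement`, `safeIncrement`, and `doSomethingAsync` are hypothetical names, and a plain `Map` stands in for the cache.

    ```javascript
    // Stand-in for real async work (a DB call, an HTTP request, ...).
    const doSomethingAsync = () => Promise.resolve();

    // Not atomic: another request can run during the await, read the same
    // old value, and one of the two increments is silently lost.
    async function unsafeIncrement(cache, key) {
      const current = cache.get(key) ?? 0;
      await doSomethingAsync();
      cache.set(key, current + 1);
    }

    // Safe: the read-modify-write is fully synchronous, so JavaScript's
    // single-threaded event loop guarantees it cannot be interleaved.
    function safeIncrement(cache, key) {
      cache.set(key, (cache.get(key) ?? 0) + 1);
    }
    ```

    Running two `unsafeIncrement` calls concurrently can leave the counter at 1 instead of 2, which is exactly the kind of race the cache itself cannot prevent.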

  2. Unfortunately, no; you can write a unit test to confirm it.

    I have written a library to fix that, and also added a read-through method to simplify usage:
    https://github.com/KhanhPham2411/node-cache-async-lock
