
I am using Redis as a centralized in-memory data store shared between different distributions of my application.

Since all of these distributions operate on the same key, I am afraid a race condition could occur. Take this example:

THIS_DISTRIBUTION_NAME = "Alpha1"

alive_distributions = redis.hmget('alive_distributions') # {"Charlie1": 1623234874}

alive_distributions[THIS_DISTRIBUTION_NAME] = int(time.time()) # {"Charlie1": 1623234874,"Alpha1": 1623234875}

redis.hmset('alive_machines', alive_distributions) 

If Charlie1 updates its own entry in between Alpha1's hgetall and hmset operations, Alpha1's subsequent write would overwrite that update, resulting in data inconsistency (a lost update).

How can I perform a safe, transaction-like update of the hash without running into this race condition?

2 Answers


  1. Redis supports transactions, and I would specifically look into the check-and-set section of the documentation. Optimistic locking is what you need.

    Update #1


    Assuming redis-py as your Redis client, you could do something like this:

    pipe = redis.pipeline()
    pipe.watch('some_key')        # watch key 'some_key'
    value = pipe.get('some_key')  # read the current value (the pipeline is in immediate mode after watch)
    pipe.multi()                  # start the transaction; commands are buffered from here on
    value = 'some_value'          # compute the new value (normally based on the old one)
    pipe.set('some_key', value)   # queue the update of 'some_key'
    pipe.execute()                # raises redis.WatchError if another client wrote 'some_key' in the meantime
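
    Applied to the hash from the question, a minimal sketch could look like this (assuming redis-py and a connected client `r`; the connection settings and the retry loop are illustrative, the key name is taken from the question):

    import time
    import redis

    r = redis.Redis(decode_responses=True)   # assumed connection settings
    KEY = 'alive_distributions'              # key name taken from the question
    THIS_DISTRIBUTION_NAME = 'Alpha1'

    with r.pipeline() as pipe:
        while True:
            try:
                pipe.watch(KEY)                    # abort the transaction if KEY changes after this point
                alive = pipe.hgetall(KEY)          # read the current hash (pipeline is in immediate mode)
                alive[THIS_DISTRIBUTION_NAME] = int(time.time())
                pipe.multi()                       # start buffering the transaction
                pipe.hset(KEY, mapping=alive)      # write the whole hash back
                pipe.execute()                     # succeeds only if nobody else touched KEY
                break
            except redis.WatchError:
                continue                           # another client wrote KEY; retry the read-modify-write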
    
    
  2. The operation you are interested in can be wrapped in a MULTI/EXEC transaction to make sure that either all fields are updated or none are.
    The command itself that you need is ZADD. With ZADD you can store each distribution as a member of a sorted set, scored by its timestamp, so each member's value is updated atomically in a single command, as in the sketch below.
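
    As a sketch of the ZADD idea (assuming redis-py and a sorted set named 'alive_distributions'; both names are illustrative), each distribution's heartbeat becomes a single atomic command, so no read-modify-write cycle is needed:

    import time
    import redis

    r = redis.Redis(decode_responses=True)

    # One atomic command per heartbeat: member = distribution name, score = last-seen timestamp.
    # Concurrent updates from other distributions cannot be lost, because nothing is read back first.
    r.zadd('alive_distributions', {'Alpha1': int(time.time())})

    # Read back all distributions with their last-seen timestamps.
    alive = dict(r.zrange('alive_distributions', 0, -1, withscores=True))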
