
I have a dozen Redis keys of type SET, say

PUBSUB_USER_SET-1-1668985588478915880,
PUBSUB_USER_SET-2-1668985588478915880,
PUBSUB_USER_SET-3-1668988644477632747,
...
PUBSUB_USER_SET-10-1668983464477632083

Each set contains user IDs, and the problem statement is to check whether a given user is present in any of the sets or not.

The solution I tried is to take all the keys, join them with a comma delimiter, and pass the result as an argument to a Lua script, which splits the string with gmatch and runs SISMEMBER against each key until there is a hit.

-- KEYS[1]: comma-delimited string of set keys
-- ARGV[1]: the userId to look for
for key in (KEYS[1] .. ","):gmatch("(.-),") do
    if redis.call('SISMEMBER', key, ARGV[1]) == 1 then
        return 1
    end
end
return 0

Now, as the number of keys grows to PUBSUB_USER_SET-20 or PUBSUB_USER_SET-30, I see latency increase and throughput drop.

Is this a good way to do it? Is it better to batch the Lua script calls, so that instead of passing 30 keys as arguments I pass them in batches of 10 and return as soon as the user is found? Or is there a better way to do this?

2 Answers


  1. You can use a Redis pipeline with batching (10 keys per iteration) to improve performance.
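A minimal sketch of the batched-pipeline idea in Python. It assumes a redis-py-style client whose `pipeline()` object buffers `sismember()` calls and returns all results on `execute()`; the `user_in_any_set` helper name and the in-memory stand-in client are illustrative, not part of any library.

```python
def user_in_any_set(client, keys, user_id, batch_size=10):
    """Check SISMEMBER over `keys` in batches, returning on the first hit.

    Each execute() is a single round trip, so 30 keys cost at most
    three round trips rather than 30 individual SISMEMBER calls.
    """
    for start in range(0, len(keys), batch_size):
        pipe = client.pipeline()
        for key in keys[start:start + batch_size]:
            pipe.sismember(key, user_id)
        if any(pipe.execute()):
            return True
    return False


# Minimal in-memory stand-in for the client, only to make the sketch runnable
# without a Redis server; with a real redis.Redis client the function above
# works unchanged.
class FakePipeline:
    def __init__(self, store):
        self.store, self.calls = store, []

    def sismember(self, key, member):
        self.calls.append((key, member))

    def execute(self):
        return [member in self.store.get(key, set())
                for key, member in self.calls]


class FakeClient:
    def __init__(self, store):
        self.store = store

    def pipeline(self):
        return FakePipeline(self.store)
```

Because the function returns as soon as a batch produces a hit, sets that are checked early short-circuit the remaining batches.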

  2. I would propose a different solution instead of spreading members across many arbitrarily named sets: store each member in a deterministically chosen set, and query only that set to check whether the member is present or not.

    Let's say we have N sets numbered s-0, s-1, s-2, …, s-19.
    You should put each member into one of these sets based on its hash, which means you need to query only one set instead of checking all of them. You can use any hashing algorithm.

    To take it further, you can try consistent hashing.
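A minimal sketch of the hash-based placement, assuming the s-0 … s-19 naming from the answer; `set_for_member` is a hypothetical helper name.

```python
import hashlib

NUM_SETS = 20  # s-0 .. s-19, as in the answer


def set_for_member(member: str, num_sets: int = NUM_SETS) -> str:
    """Deterministically map a member to one of the N sets by hashing it.

    Uses SHA-1 rather than Python's built-in hash(), which is randomized
    per process, so every client picks the same set for a given member.
    """
    digest = hashlib.sha1(member.encode("utf-8")).digest()
    bucket = int.from_bytes(digest[:8], "big") % num_sets
    return f"s-{bucket}"
```

At write time, SADD goes to `set_for_member(user_id)`; the membership check then becomes a single SISMEMBER against that one set, regardless of how many sets exist.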
