
We had a leak in our code which resulted in some keys not getting deleted from Redis (ElastiCache). We only noticed it once the count grew past 3 million. We have fixed the code, but we also need to clean up Redis. We can't run FLUSHALL, as that would delete all the keys; we only want to delete keys older than, say, 15 days. I found a few commands online (listed below), but how can I iterate over 3 million records without getting the system stuck? Please help.

Thank you in advance.

    OBJECT IDLETIME record
    DEL record
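
From what I've read, these are usually combined with SCAN, which walks the keyspace in small cursor-driven batches instead of touching everything at once. Roughly like this (the pattern and key name are just placeholders):

    SCAN 0 MATCH some:pattern:* COUNT 1000    # returns a new cursor plus a small batch of keys
    OBJECT IDLETIME some:pattern:key1         # seconds since the key was last read or written
    UNLINK some:pattern:key1                  # like DEL, but frees the memory in the background (Redis 4.0+)

You then repeat SCAN with the returned cursor until it comes back as 0.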

2 Answers


  1. Chosen as BEST ANSWER

    For anyone facing a similar problem, the code below worked for me. It's not very efficient, but it gets the job done.

        // patternNew, scanLimit, idletimeout and recordsDeleted come from the surrounding code
        Iterable<String> iter = redissonClient.getKeys().getKeysByPattern(patternNew, scanLimit);
        List<String> delList = new ArrayList<>();
        for (String key : iter) {
            RBucket<String> bucket = redissonClient.getBucket(key);
            // OBJECT IDLETIME: seconds since the key was last read or written
            long idletime = bucket.getIdleTime();
            if (idletime > idletimeout) {
                delList.add(key);
            }
        }
        if (!delList.isEmpty()) {
            recordsDeleted += delList.size();
            RFuture<Long> count = redissonClient.getKeys().deleteAsync(delList.toArray(new String[0]));
        }
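
    A possible refinement (not what I actually ran, just a rough sketch): delete in fixed-size batches so the list never holds millions of keys at once, and use UNLINK instead of DEL so memory is reclaimed in the background. The key pattern, batch size and 15-day threshold below are placeholders, not values from my setup.

        import java.util.ArrayList;
        import java.util.List;
        import org.redisson.api.RKeys;
        import org.redisson.api.RedissonClient;

        static long purgeIdleKeys(RedissonClient redissonClient) {
            RKeys keys = redissonClient.getKeys();
            long idleThresholdSeconds = 15L * 24 * 60 * 60;                    // ~15 days
            List<String> batch = new ArrayList<>();
            long deleted = 0;

            // getKeysByPattern iterates with SCAN under the hood, so it doesn't block the server
            for (String key : keys.getKeysByPattern("leaked:*", 1000)) {
                // OBJECT IDLETIME: seconds since last access (only a proxy for age if keys are never read)
                if (redissonClient.getBucket(key).getIdleTime() > idleThresholdSeconds) {
                    batch.add(key);
                }
                if (batch.size() >= 500) {
                    deleted += keys.unlink(batch.toArray(new String[0]));      // non-blocking delete
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                deleted += keys.unlink(batch.toArray(new String[0]));
            }
            return deleted;
        }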
    

  2. You can configure maxmemory and set an eviction policy, which will delete keys according to that policy once maxmemory is reached.

    https://docs.aws.amazon.com/whitepapers/latest/database-caching-strategies-using-redis/evictions.html
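
    Note that on ElastiCache you can't change this with CONFIG SET; maxmemory-policy is set through the cluster's parameter group, for example:

        maxmemory-policy = allkeys-lru

    Keep in mind that eviction only starts once maxmemory is actually reached, and the volatile-* policies only consider keys that have a TTL set.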
