I have this simple recursive routine to scan redis keys:
const redis = require('redis'); // node_redis callback-style client

async function scanRedisKeys(req, res, next) {
  const client = redis.createClient({ host: 'localhost', port: 6379 });
  const run = (cursor) => {
    client.scan(cursor, 'MATCH', '*', 'COUNT', 100, (err, results) => {
      if (err) {
        log.error(err);
        client.quit();
        return res.status(500).end(); // server-side failure, not a client error
      }
      const nextCursor = results[0]; // SCAN replies with [cursor, keys]
      for (const k of results[1]) {
        res.write(k);
        res.write('\n'); // newline delimiter (the original had a bare 'n')
      }
      if (nextCursor !== '0') {
        run(nextCursor); // chain the next page until the cursor wraps to '0'
      } else {
        client.quit();
        res.status(200).end();
      }
    });
  };
  run('0');
}
It just daisy-chains requests in series. I can't see a way to do this in parallel (using something like async.queue with a concurrency limit), since the cursors might "overlap".
It seems to me you simply have to take the cursor result from the previous request; is there any meaningful, non-error-prone way to do these in parallel? I'm trying to stream to the client and want something smoother than big chunks at long intervals.
2 Answers
If you are looking for a way to achieve better performance, note that Redis is mostly a single-threaded server from the point of view of command execution. Even if you parallelize scan requests among multiple clients, it will not necessarily result in better performance, as the Redis server can only process a single scan command at a time.
You might be able, as you've mentioned, to have multiple clients each use a different match pattern and achieve some level of concurrency, since each client will scan a different subset of the keyspace (note that even in this case the Redis server processes one command at a time, but the overall workload is distributed among multiple clients).
You can also try putting the listing logic in an external Lua script and running it through redis-cli, for example in a file named list.lua.
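The answer does not include the script body; as a hypothetical sketch, list.lua might loop SCAN server-side and return all matching keys in one reply. This assumes a Redis version that permits SCAN inside scripts, and note the whole loop blocks the server while it runs:

-- Hypothetical list.lua: collect every key matching a pattern via SCAN.
-- WARNING: the entire loop runs inside one EVAL, blocking the server.
local cursor = '0'
local keys = {}
repeat
  local result = redis.call('SCAN', cursor, 'MATCH', '*', 'COUNT', 100)
  cursor = result[1]
  for _, k in ipairs(result[2]) do
    keys[#keys + 1] = k
  end
until cursor == '0'
return keys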
Then spawn a new process from Node and run the following command (note that EVAL requires a numkeys argument; it is 0 here because the script takes no key names):
redis-cli EVAL "$(cat list.lua)" 0