
I have a basic Web API written in Node.js that writes an object as an HSET to a Redis cache. Both are running in Docker containers.

I have a Python script running on the same VM which needs to watch the Redis cache and then run some code when there is a new HSET or a change to a field in the HSET.

I came across Redis Pub/Sub but I’m not sure if this is really the proper way to use it.

To test, I created two Python scripts. The first subscribes to the messaging system:

import redis
import json

print("Redis Subscriber")

redis_conn = redis.Redis(
    host='localhost',
    port=6379,
    password='xxx',
    encoding="utf-8",
    decode_responses=True)

def sub():
    pubsub = redis_conn.pubsub()
    pubsub.subscribe("broadcast")
    for message in pubsub.listen():
        if message.get("type") == "message":
            data = json.loads(message.get("data"))
            print(data)

if __name__ == "__main__":
    sub()

The second publishes to the messaging system:

import redis
import json

print("Redis Publisher")

redis_conn = redis.Redis(
    host='localhost',
    port=6379,
    password='xxx',
    encoding="utf-8",
    decode_responses=True)

def pub():
    data = {
        "message": "id:3"
    }
    redis_conn.publish("broadcast", json.dumps(data))

if __name__ == "__main__":
    pub()

I will rewrite the publisher in Node.js, and it will simply publish the HSET key, like id:3. The subscriber will run in Python, and when it receives a new message, it will use that key "id:3" to look up the actual HSET and do stuff.

This doesn’t seem like the right way to do this but Redis watch doesn’t support HSET. Is there a better way to accomplish this?

2 Answers


  1. This doesn’t seem like the right way to do this but Redis watch doesn’t support HSET.

    Redis WATCH does support hash keys; what it does not support is watching individual hash fields.

    Is there a better way to accomplish this?

    While I believe your approach may be acceptable for certain scenarios, pub/sub messages are fire-and-forget: your subscriber may disconnect for any reason right after the publisher has published a message but before having the chance to read it, and the notification of your object write will thus be lost forever, even if the subscriber automatically reconnects afterwards.

    You may opt instead for Redis streams, which let you add entries to a given stream (resembling the publishing side of your code) and consume them (akin to your subscriber script) through a process that persists the messages until they are acknowledged.

    As an alternative, perhaps simpler, approach, you may just split your hashes into multiple keys, one per field, so that you can WATCH them.

  2. You might want to take a look at key-space notifications. Key-space notifications can automatically publish messages via Pub/Sub when a key is changed, added, deleted, etc.

    You can choose to consume events, e.g. "HSET was called", and be given the key name it was called upon. Or you can choose to consume keys, e.g. my:awesome:key, and be notified of what event happened. Or both.

    You’ll need to turn key-space notifications on in order to use them:

    redis.cloud:6379> CONFIG SET notify-keyspace-events KEA
    

    You can subscribe to all events and keys like this:

    redis.cloud:6379> PSUBSCRIBE '__key*__:*'
    "pmessage","__key*__:*","__keyspace@0__:foo","set"
    "pmessage","__key*__:*","__keyevent@0__:set","foo"
    

    Hope that helps!
