Given two services, P (Producer) and C (Consumer), my P service needs to:
- Create an object X
- Create an object Y (dependent on X)
- Create an object Z (dependent on Y)
- Notify C about X, Y, and Z (via Redis Streams)
- C needs to use data from Z, Y, and X to do some local data persistence
- Updates to Y are fairly common, but updates to X are rare
From C’s perspective, is there a way to guarantee that it has all the info it needs for successful persistence?
I know that services like Kafka and Redis Streams are not generally built for this stuff, but how does one overcome this?
Idea 1:
- Send X, Y, and Z, in that order, to the same consumer group. But if we scale the number of workers above 1, we lose the ordering guarantee, since messages may be processed concurrently.
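One common way to keep ordering while still scaling workers is to partition by entity key, so every message touching the same root object X is routed to the same worker. This isn't something Redis Streams does for you; it's a sketch of the routing logic you'd add yourself (the key format and worker count are illustrative):

```python
import hashlib

def worker_for(entity_id: str, num_workers: int) -> int:
    """Deterministically map an entity to one worker so that all
    messages for the same entity are processed in order by that worker."""
    digest = hashlib.sha256(entity_id.encode()).hexdigest()
    return int(digest, 16) % num_workers

# All updates rooted at the same X hash to the same worker, so the
# X -> Y -> Z ordering is preserved per entity even with many workers.
route = worker_for("x-123", 4)
```

In practice this usually means one stream (or one consumer) per partition, rather than a single consumer group competing for all entries.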
Idea 2:
- Instead of sending X and Y separately to C, I can send a compound object Z with Y and X embedded. But that seems like overkill, doesn’t it?
Is there any obvious way to handle object dependencies?
Answers
This is a good question covered in this note about Redis Streams.
So if you want to process messages in order while using more than one consumer, you will need to handle the ordering yourself.
I believe Idea 2 is the better solution, because keeping the whole message in one data structure is a good idea.
You could also try using multiple stream keys.
For example:
On service P:
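The original snippet is missing here, so as a hedged sketch: P could publish the compound message as a single stream entry via XADD (with redis-py that call is `client.xadd(stream, fields)`; the stream name and field names below are assumptions). The client is injected so the logic can be shown without a live Redis server:

```python
import json

def publish_compound(client, x: dict, y: dict, z: dict, stream: str = "objects"):
    """Publish Z with X and Y embedded as one stream entry, so consumers
    never see Z without its dependencies."""
    payload = json.dumps({"z": z, "y": y, "x": x})
    # With a real redis-py client this is: client.xadd(stream, {"payload": payload})
    return client.xadd(stream, {"payload": payload})

class FakeStreamClient:
    """In-memory stand-in for a Redis client, for demonstration only."""
    def __init__(self):
        self.entries = []

    def xadd(self, stream, fields):
        self.entries.append((stream, fields))
        return f"{len(self.entries)}-0"  # fake stream entry ID

client = FakeStreamClient()
entry_id = publish_compound(client, {"id": "x1"}, {"id": "y1"}, {"id": "z1"})
```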
On service C, create a new thread:
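The consumer snippet is missing as well; here is a minimal sketch of C draining entries on a background thread and persisting X, then Y, then Z, so dependencies are satisfied at write time. An in-memory `queue.Queue` stands in for reading the stream (in real code this loop would call `XREADGROUP` on the Redis client):

```python
import json
import queue
import threading

persisted = []  # stand-in for local data persistence

def consume(q):
    """Drain compound messages and persist X, Y, Z in dependency order."""
    while True:
        raw = q.get()
        if raw is None:  # sentinel to stop the worker thread
            break
        msg = json.loads(raw)
        for key in ("x", "y", "z"):  # X first, since Y and Z depend on it
            persisted.append(msg[key]["id"])

q = queue.Queue()
worker = threading.Thread(target=consume, args=(q,), daemon=True)
worker.start()
q.put(json.dumps({"x": {"id": "x1"}, "y": {"id": "y1"}, "z": {"id": "z1"}}))
q.put(None)
worker.join()
```

Because each message already carries X, Y, and Z, the consumer never has to wait for (or reorder) earlier messages before it can persist.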