I want to run a job in Spring Boot using Quartz, where multiple threads execute the method. For every item processed, I want to save the result in Redis so I can see how well the job is working.
I want to save the data in Redis in this form:
{
"2020-04-20": [
{
"item_1": {
"success": "true",
"message": ""
}
},
{
"item_2": {
"success": "true",
"message": ""
}
}
]
}
I want to insert all the items under the date key.
Since multiple threads are working, and each thread processes some item, all items should end up under a single key (the date). Is that possible?
One solution is to overwrite the data of the date key again and again: first get the data from Redis, append the item to it, and save the key back to Redis.
Is there another way, perhaps using annotations like @Cacheable or @CachePut, so that I can create a nested key and items are appended under the date key automatically?
2 Answers
I solved it using Redis's set functionality. I am using the Jedis client in my project.
This is exactly what I needed. In my case the date is the key, and the other details (one JSON object) are members of the set. So I convert my JSON data to a string when adding a member to the set, and when getting the data back I convert each member from a string to JSON. This solved my problem.
Note: there is also list functionality that could be used, but the time complexities for lists are not O(1). In my case I am sure I will not have duplicates, so a set works for me.
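A minimal sketch of that set approach with Jedis (assuming a local Redis on the default port; the class name, key name, and `toMember` helper are illustrative, not from the original post):

```java
import java.util.Set;
import redis.clients.jedis.Jedis;

public class JobResultStore {

    // Serialize one item result to a JSON string. A real project would
    // use a JSON library such as Jackson instead of manual formatting.
    static String toMember(String itemName, boolean success, String message) {
        return String.format("{\"%s\":{\"success\":\"%s\",\"message\":\"%s\"}}",
                itemName, success, message);
    }

    public static void main(String[] args) {
        // Assumption: Redis is running on localhost:6379
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            String dateKey = "2020-04-20";

            // SADD is atomic on the server side, so concurrent worker
            // threads can safely add their results to the same key.
            jedis.sadd(dateKey, toMember("item_1", true, ""));
            jedis.sadd(dateKey, toMember("item_2", true, ""));

            // Read back all results for the day and parse each member
            // from String back to JSON as needed.
            Set<String> results = jedis.smembers(dateKey);
            results.forEach(System.out::println);
        }
    }
}
```

Because each member is added with a single `SADD`, there is no read-modify-write cycle, so threads never overwrite each other's results.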
Have you considered RedisJSON?
Something like this (I haven't tested it; I don't have RedisJSON handy): JSON.ARRAPPEND commands are supposed to be atomic.
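A rough sketch of what that could look like in redis-cli (untested, as noted above; assumes the RedisJSON module is loaded, and the key name job:results is illustrative):

```
# Create the document with an empty array for the date
JSON.SET job:results $ '{"2020-04-20":[]}'

# Each worker appends its item atomically to that array
JSON.ARRAPPEND job:results '$["2020-04-20"]' '{"item_1":{"success":"true","message":""}}'
JSON.ARRAPPEND job:results '$["2020-04-20"]' '{"item_2":{"success":"true","message":""}}'

# Read back the whole day's results
JSON.GET job:results '$["2020-04-20"]'
```

Since each append is a single server-side command, concurrent threads would not need the get-modify-set cycle described in the question.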