I'm trying to use Redis to cache time-series data for 270 stocks. Every 2 or 3 seconds I receive an array of stock changes (trades) that have just happened. I want to save this data in Redis, and I'm currently trying to think of the best (and most efficient) way to do so.
First I considered having 270 lists in Redis, where each list corresponds to one stock, and on any update I push the object onto the corresponding list. This has two main problems. Let's say one of the updates contains 10 different stocks that just changed; this means I'd have to communicate with Redis 10 times. The other problem is retrieval: if I want the data for all the stocks, I'd have to communicate with Redis 270 times.
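One thing worth noting about the round-trip concern: most Redis clients support pipelining, which lets you batch all the writes for one update (or all 270 reads) into a single network round trip. Here is a minimal, hedged sketch of the grouping step in pure Python; the trade fields (`symbol`, `price`, `qty`) and the `stock:<symbol>` key scheme are assumptions for illustration, not anything from the question.

```python
import json

def build_rpush_commands(trades):
    """Group a batch of trades into one RPUSH command per stock.

    With a client such as redis-py, the returned commands could all be
    queued on a pipeline and sent in one round trip, instead of one
    round trip per stock. Field names here are illustrative assumptions.
    """
    by_symbol = {}
    for trade in trades:
        by_symbol.setdefault(trade["symbol"], []).append(trade)

    commands = []
    for symbol, updates in by_symbol.items():
        key = f"stock:{symbol}"            # assumed key naming scheme
        values = [json.dumps(u) for u in updates]
        commands.append(("RPUSH", key, values))
    return commands

batch = [
    {"symbol": "AAPL", "price": 190.1, "qty": 50},
    {"symbol": "MSFT", "price": 410.5, "qty": 20},
    {"symbol": "AAPL", "price": 190.2, "qty": 10},
]
cmds = build_rpush_commands(batch)
# One command per distinct symbol in the batch, not one per trade.
```

With redis-py this would become roughly `pipe = r.pipeline()`, one `pipe.rpush(key, *values)` per entry, then a single `pipe.execute()`; the 270-key read could likewise be a pipeline of `LRANGE` calls.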
The other approach would be to have just one key holding a JSON object with 270 keys, where each value in the object is an array of updates.
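For concreteness, the single-key approach amounts to a read-modify-write of one JSON blob on every batch. The sketch below uses a plain dict to stand in for Redis (a real version would be `GET`/`SET` on one key); the key name `stocks:updates` and the trade fields are assumptions.

```python
import json

# A plain dict stands in for Redis here; in practice these would be
# GET/SET calls on a single key such as "stocks:updates" (name assumed).
store = {}
KEY = "stocks:updates"

def append_updates(trades):
    """Read-modify-write the single JSON blob holding all per-stock arrays."""
    blob = json.loads(store.get(KEY, "{}"))
    for trade in trades:
        blob.setdefault(trade["symbol"], []).append(trade)
    store[KEY] = json.dumps(blob)

def all_updates():
    """One read returns every stock's update array at once."""
    return json.loads(store.get(KEY, "{}"))

append_updates([{"symbol": "AAPL", "price": 190.1}])
append_updates([{"symbol": "AAPL", "price": 190.2},
                {"symbol": "MSFT", "price": 410.5}])
snapshot = all_updates()
```

The trade-off is visible in the sketch: retrieval is a single read, but every write rewrites the whole blob, and two concurrent writers doing this read-modify-write would need `WATCH`/`MULTI` or a Lua script to avoid losing updates.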
I'm currently favoring the second approach, but I'm wondering if there's something else I can do that may be better than either of these?
It depends on the QPS (queries per second).

If it's very high, the second solution has a problem: it doesn't scale. Multiple servers will query the same key serially (because Redis is single-threaded). Maybe you can try a timer task that pulls the data into a local cache periodically.
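The timer-task idea can be sketched as a background thread that periodically re-fetches the data into a process-local cache, so request handlers read locally instead of hitting the hot key. This is only a sketch: `fetch` below stands in for the Redis read (e.g. a `GET` of the JSON key), and the class name and interval are made up.

```python
import threading
import time

class LocalCache:
    """Periodically pull data into a process-local cache.

    `fetch` is a stand-in for the Redis read; request handlers call
    snapshot() and never touch Redis directly, so the hot key is read
    once per interval per server instead of once per request.
    """
    def __init__(self, fetch, interval=2.0):
        self._fetch = fetch
        self._interval = interval
        self._lock = threading.Lock()
        self._data = fetch()                  # prime the cache once
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        # wait() doubles as the timer and the stop signal.
        while not self._stop.wait(self._interval):
            data = self._fetch()
            with self._lock:
                self._data = data

    def snapshot(self):
        with self._lock:
            return self._data

    def stop(self):
        self._stop.set()
        self._thread.join()

# Demo: a counter stands in for the Redis read.
calls = []
cache = LocalCache(lambda: calls.append(1) or len(calls), interval=0.05)
time.sleep(0.3)
cache.stop()
value = cache.snapshot()   # has been refreshed in the background
```

The cost is staleness up to one interval, which is usually acceptable for data that already arrives in 2-3 second batches.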