We use Redis with Node in our product to cache API responses, but after thinking it over we suspect we aren't using it in the best way.
We have a function that retrieves a list from the database unless that list is already cached in Redis. The function accepts several parameters that directly change the SQL query, so we cache each result using the SQL query string as the Redis key and the JSON-stringified response as the value, with the following piece of code:
await redisClient.setEx(key, ttl, JSON.stringify(object))
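For context, the whole flow is the classic cache-aside pattern. Here is a minimal sketch of it, assuming node-redis v4+ (`get`/`setEx`); `getCachedList` and `fetchFromDb` are hypothetical names for illustration, not our actual code:

```javascript
// Cache-aside sketch: try Redis first, fall back to the DB on a miss,
// then store the result under the SQL-query key with a TTL.
async function getCachedList(redisClient, sqlKey, ttl, fetchFromDb) {
  const cached = await redisClient.get(sqlKey);
  if (cached !== null) return JSON.parse(cached); // cache hit

  const rows = await fetchFromDb();               // cache miss: run the query
  await redisClient.setEx(sqlKey, ttl, JSON.stringify(rows));
  return rows;
}
```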
The problem is that the cached lists overlap heavily. For example, across 100 cached lists we may have the same 80 items, so each of those items is stored many times.
One possible new approach could be the following: cache every single item in Redis under its own key and, when we need a list, fetch only the ids (from the database or from another Redis key) and then retrieve each item from Redis by id.
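That idea is essentially a normalized cache, and it can be sketched roughly as below, assuming node-redis v4+ (`mGet`, `multi`). The key scheme, `cacheItems`, `getItemsByIds`, and `fetchItemsFromDb` are illustrative names, not an existing API:

```javascript
// Each item lives under its own key, so 100 overlapping lists
// share one copy of every common item.
const ITEM_KEY = (id) => `item:${id}`;

// Store each item individually, pipelined into one round trip.
async function cacheItems(client, items, ttl) {
  const multi = client.multi();
  for (const item of items) {
    multi.setEx(ITEM_KEY(item.id), ttl, JSON.stringify(item));
  }
  await multi.exec();
}

// Resolve a list of ids: read what Redis already has with one MGET,
// fetch only the missing items from the DB, and re-cache those.
async function getItemsByIds(client, ids, ttl, fetchItemsFromDb) {
  if (ids.length === 0) return [];
  const raw = await client.mGet(ids.map(ITEM_KEY));

  const found = new Map();
  const missing = [];
  ids.forEach((id, i) => {
    if (raw[i] !== null) found.set(id, JSON.parse(raw[i]));
    else missing.push(id);
  });

  if (missing.length > 0) {
    const fresh = await fetchItemsFromDb(missing); // one query for the gaps
    await cacheItems(client, fresh, ttl);
    for (const item of fresh) found.set(item.id, item);
  }
  return ids.map((id) => found.get(id)); // preserve the requested order
}
```

One design note: the per-query cache entry shrinks to just an id list, and a single `MGET` keeps the read path at one extra round trip, so the deduplication does not cost N individual `GET` calls.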
Do you have any suggestions, or experience with solving this kind of problem?