Multiple CachePut operations per Method Invocation in Spring Caching


I have the following caching problem:

// imagine we have a method that performs a long-running batch query:
List<Record> fetchRecords(List<String> recordKeys)

We would like to cache the results of this long-running operation using the individual record key as the cache key, instead of the whole list.

The benefit of this is obvious: on the next invocation, fetchRecords(overlappingRecordKeys), the long-running batch query only needs to include those keys that have not been fetched before. We don't get this benefit when the cache key is the entire List, since equality is then tested on the whole list.

This cannot be solved via the keyGenerator interceptor, as that mechanism only allows us to return a single cache key per invocation of the method.

What is the best approach for solving this problem? I can think of two solutions.

Solution 1

Create a CachingAspect

// pseudo code — the pointcut, cache name, and Record.getKey() are placeholders
@Aspect
class CachingAspect {

  CacheManager mgr = ...

  @Around("somePointcut")
  Object checkCache(ProceedingJoinPoint pjp) throws Throwable {

    // grab the args from 'pjp'; the single argument is of type List<String>
    List<String> recordKeys = (List<String>) pjp.getArgs()[0];
    Cache cache = mgr.getCache("records");

    // check against 'mgr': split the keys into cache hits and misses
    List<Record> result = new ArrayList<>();
    List<String> misses = new ArrayList<>();
    for (String key : recordKeys) {
      Cache.ValueWrapper hit = cache.get(key);
      if (hit != null) result.add((Record) hit.get());
      else misses.add(key);
    }

    // proceed with only the uncached keys, capture the output
    // (i.e. new records from the long-running operation), cache it,
    // and return the union with the records retrieved from cache
    List<Record> fresh = misses.isEmpty()
        ? Collections.emptyList()
        : (List<Record>) pjp.proceed(new Object[]{ misses });
    for (Record r : fresh) cache.put(r.getKey(), r);
    result.addAll(fresh);
    return result;
  }
}

Solution 2

Same idea as above without AOP (put the code in fetchRecords() directly). This has the benefit of added type safety at the cost of elegance.
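A minimal, Spring-free sketch of Solution 2: the method itself partitions the keys into hits and misses and only queries the backend for misses. Here a plain ConcurrentHashMap stands in for the Spring Cache, and queryBatch/keyOf are hypothetical stand-ins for the long-running query and key extraction:

```java
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of Solution 2: fetchRecords checks the cache per key itself.
class RecordService {

    final Map<String, String> recordCache = new ConcurrentHashMap<>();
    int batchCalls = 0; // counts backend round-trips, for illustration

    List<String> fetchRecords(List<String> recordKeys) {
        List<String> result = new ArrayList<>();
        List<String> misses = new ArrayList<>();
        for (String key : recordKeys) {
            String hit = recordCache.get(key);
            if (hit != null) result.add(hit);   // served from cache
            else misses.add(key);               // needs the backend
        }
        if (!misses.isEmpty()) {
            // only the uncached keys hit the long-running batch query
            for (String rec : queryBatch(misses)) {
                recordCache.put(keyOf(rec), rec);
                result.add(rec);
            }
        }
        return result;
    }

    // hypothetical long-running batch query
    List<String> queryBatch(List<String> keys) {
        batchCalls++;
        List<String> out = new ArrayList<>();
        for (String k : keys) out.add("record-" + k);
        return out;
    }

    // hypothetical key extraction from a record
    String keyOf(String record) { return record.substring("record-".length()); }
}
```

With this shape, a second call with overlapping keys triggers a batch query only for the keys not seen before, and a fully cached call performs no backend round-trip at all.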

Which solution is better? Or is there a third superior approach?

1 Answer

The question about bulk operations and the Spring cache abstraction pops up quite often. Bulk operations are simply not covered by the abstraction.

... at the cost of elegance

Well, elegance is a matter of taste. Striving for both elegance (via AOP) and efficiency is probably a matter of headaches ;)

The straightforward solution, without annotations, is a read-through configuration with a cache loader. A JCache/JSR-107 compatible cache, for example, provides a Cache.getAll(keys) operation, which is exactly what you ask for.
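The read-through idea can be mimicked in a few self-contained lines. With a real JSR-107 provider you would instead configure the cache via MutableConfiguration.setReadThrough(true) and setCacheLoaderFactory(...), and Cache.getAll(keys) would invoke the loader for the missing keys; the ReadThroughCache class and its loader function below are illustrative stand-ins:

```java
import java.util.*;
import java.util.function.Function;

// Mimic of JCache read-through semantics: getAll(keys) returns cached
// entries and invokes the loader once, in bulk, for all missing keys.
class ReadThroughCache<K, V> {

    private final Map<K, V> store = new HashMap<>();
    private final Function<Set<K>, Map<K, V>> loader; // stand-in for CacheLoader.loadAll

    ReadThroughCache(Function<Set<K>, Map<K, V>> loader) { this.loader = loader; }

    Map<K, V> getAll(Set<K> keys) {
        Map<K, V> result = new HashMap<>();
        Set<K> misses = new LinkedHashSet<>();
        for (K key : keys) {
            V hit = store.get(key);
            if (hit != null) result.put(key, hit);
            else misses.add(key);
        }
        if (!misses.isEmpty()) {
            Map<K, V> loaded = loader.apply(misses); // one bulk backend call
            store.putAll(loaded);
            result.putAll(loaded);
        }
        return result;
    }
}
```

The caller never touches the backend directly; the cache owns the load, which is what makes the configuration "read-through".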

Side note: What actually happens when the bulk request is executed varies from cache implementation to cache implementation and is not trivial. For example, if you want per-key blocking to avoid fetching the value for one key many times in parallel, you also need to protect against deadlocks.
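For a concrete taste of the per-key blocking concern: ConcurrentHashMap.computeIfAbsent computes each absent key at most once, blocking concurrent callers for that key, and its documentation forbids mapping functions that update the same map precisely because of the locking/deadlock issue mentioned above. The class and record format below are a hypothetical sketch:

```java
import java.util.concurrent.ConcurrentHashMap;

// Per-key blocking via ConcurrentHashMap.computeIfAbsent: concurrent
// callers for the same key wait until one of them has loaded the value,
// so the backend is hit at most once per key.
class BlockingPerKeyCache {
    final ConcurrentHashMap<String, String> store = new ConcurrentHashMap<>();
    final ConcurrentHashMap<String, Integer> loads = new ConcurrentHashMap<>();

    String get(String key) {
        return store.computeIfAbsent(key, k -> {
            loads.merge(k, 1, Integer::sum); // count backend calls per key
            return "record-" + k;            // stand-in for the real load
        });
    }
}
```

Note the mapping function only touches a *different* map (loads); calling back into store from inside it would violate the computeIfAbsent contract and risk exactly the deadlock the answer warns about.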