I have a use case where the cache data is loaded with a single bulk call, but I will never use getAll to read from the cache. Is there a way for multiple concurrent get calls to block on one loadAll? I don't want individual gets on different keys to each make a separate call to the data source. Reads always look like:
cache.get(key1);
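For context, the wiring is roughly like the sketch below. BulkLoadedCache, DataSource and getAllData() are placeholder names, not my real classes; the point is that get(key) routes to load(key), and loadAll(keys) would only ever be reached through getAll, which I never call.

import java.util.Map;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutionException;

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

/** Placeholder for my data accessor; getAllData() is the one bulk call to the source. */
interface DataSource {
    ConcurrentMap<String, Double> getAllData();
}

public class BulkLoadedCache {

    private final LoadingCache<String, Double> cache;

    public BulkLoadedCache( final DataSource source ) {
        this.cache = CacheBuilder.newBuilder().build( new CacheLoader<String, Double>() {
            @Override
            public Double load( final String key ) {
                // cache.get(key) lands here: a full bulk fetch per missing key,
                // which is exactly what I want to avoid.
                return source.getAllData().get( key );
            }

            @Override
            public Map<String, Double> loadAll( final Iterable<? extends String> keys ) {
                // Only getAll(keys) routes through here, and I never call getAll.
                return source.getAllData();
            }
        } );
    }

    public Double get( final String key1 ) throws ExecutionException {
        return cache.get( key1 );   // all access is single-key
    }
}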
After digging through LocalCache, I think I will have to implement my own synchronization: something like a local cache in my data accessor that only goes to the source after a certain amount of time has passed, and that swaps in the fresh copy with a single assignment when it does, as sketched below.
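Concretely, I mean something like this hand-rolled version (the class name, the refresh interval, and the reuse of the DataSource placeholder above are assumptions, not real code). The source is only hit when the last load is older than the interval, and the new map is published with one assignment; callers that arrive during a refresh block on the monitor and then read the refreshed map.

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.TimeUnit;

/** Sketch of a throttled accessor that bypasses the Guava cache entirely. */
public class ThrottledDataAccessor {

    private final DataSource source;            // same placeholder interface as above
    private final long refreshIntervalNanos;

    private volatile ConcurrentMap<String, Double> currentData = new ConcurrentHashMap<>();
    private boolean loaded = false;             // guarded by the lock in maybeRefresh()
    private long lastLoadNanos;                 // guarded by the lock in maybeRefresh()

    public ThrottledDataAccessor( final DataSource source, final long interval, final TimeUnit unit ) {
        this.source = source;
        this.refreshIntervalNanos = unit.toNanos( interval );
    }

    public Double get( final String key ) {
        maybeRefresh();
        return currentData.get( key );
    }

    private synchronized void maybeRefresh() {
        final long now = System.nanoTime();
        if ( !loaded || now - lastLoadNanos >= refreshIntervalNanos ) {
            // One bulk call; the fresh map is swapped in with a single assignment.
            currentData = source.getAllData();
            lastLoadNanos = now;
            loaded = true;
        }
    }
}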
Am I missing something from the Guava cache library?
Edit:
I am considering something like the following. However, it can keep returning stale data until loadAll finishes. I would prefer that every caller blocks in load, with only the first request actually calling loadAll, and the others continuing once it completes.
import java.util.Map;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicBoolean;

import com.google.common.cache.Cache;
import com.google.common.cache.CacheLoader;
import com.google.common.collect.Lists;

public class DataCacheLoader extends CacheLoader<String, Double> {

    private final Cache<String, Double> cache;
    private final DataSource source;   // my data accessor (the placeholder interface above)
    private ConcurrentMap<String, Double> currentData;
    private final AtomicBoolean isLoading;

    public DataCacheLoader( final Cache<String, Double> cache, final DataSource source ) {
        this.cache = cache;
        this.source = source;
        this.isLoading = new AtomicBoolean( false );
    }

    @Override
    public Double load( final String key ) throws Exception {
        // Only the first caller wins the CAS and runs the bulk load; concurrent callers
        // skip it and immediately read currentData, which is stale (or null) until
        // loadAll finishes.
        if ( isLoading.compareAndSet( false, true ) ) {
            cache.putAll( loadAll( Lists.newArrayList( key ) ) );
        }
        return currentData.get( key );
    }

    @Override
    public Map<String, Double> loadAll( final Iterable<? extends String> keys ) throws Exception {
        currentData = source.getAllData();
        return currentData;
    }
}
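By contrast, something along these lines in load() would give the blocking behaviour I actually want (an untested sketch, reusing the fields of the class above); I would just rather get it from the library than hand-roll the synchronization:

@Override
public Double load( final String key ) throws Exception {
    synchronized ( this ) {
        // Every caller blocks here while a refresh is in flight. After taking the
        // lock, re-check: if another caller already pulled the data, skip the source.
        // (A key that genuinely does not exist in the source would still re-hit it;
        // ignoring that case in this sketch.)
        if ( currentData == null || !currentData.containsKey( key ) ) {
            cache.putAll( loadAll( Lists.newArrayList( key ) ) );
        }
    }
    return currentData.get( key );
}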