Implementing thread-safe caching of search results

How Search Caching Works

When a user submits a search query:

  • The query is split into an array of tokens.
  • A unique hash is created for this token array (tokens sorted alphabetically, then MD5-hashed). This hash is the unique identifier for the search (a rough sketch of this step follows the list).
  • The cache is checked for results stored under that hash.
  • If there is no cached entry, the search is run and its results are stored in the cache under that hash.
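
For illustration, here is a minimal sketch of what the hashing helper could look like, based purely on the description above. The real Search.Functions.GetUniqueHashOfTokens may well differ; the class and method here are stand-ins.

    using System;
    using System.Linq;
    using System.Security.Cryptography;
    using System.Text;

    internal static class SearchHashSketch
    {
        // Sketch of the hashing step described above: sort the tokens
        // alphabetically, join them, and MD5 the result so it can be used
        // as a cache key.
        public static string GetUniqueHashOfTokens(string[] tokens)
        {
            var joined = string.Join(" ",
                tokens.OrderBy(t => t, StringComparer.OrdinalIgnoreCase));

            using (var md5 = MD5.Create())
            {
                var bytes = md5.ComputeHash(Encoding.UTF8.GetBytes(joined));
                // Hex-encode so the hash is a plain string suitable for a cache key.
                return BitConverter.ToString(bytes).Replace("-", "").ToLowerInvariant();
            }
        }
    }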

The problem I'm trying to solve

If a user performs a search that takes 10 seconds and then eagerly refreshes the page, we don't want the refreshed request to run the query again; it should block until the original search finishes and then use the cached result.

However, while an expensive query is executing, we do not want to block other users who are running cheaper, unrelated searches.

To solve this, I need some kind of per-query locking.

Implementation

Here's my current implementation:

private static readonly object MasterManualSearchLock = new object();
private static readonly Dictionary<string, object> ManualSearchLocks = new Dictionary<string, object>();

/// <summary>
/// Search the manual
/// </summary>
public static SearchResponse DoSearch(string query, Manual forManual)
{
    var tokens = Search.Functions.TokeniseSearchQuery(query);
    var tokenHash = Search.Functions.GetUniqueHashOfTokens(tokens);
    var cacheIndex = Settings.CachePrefix + "SavedManualSearch_" + tokenHash;
    var context = HttpContext.Current;

    if (context.Cache[cacheIndex] == null)
    {
        // Create lock if it doesn't exist
        if (!ManualSearchLocks.ContainsKey(tokenHash))
        {
            lock (MasterManualSearchLock)
            {
                if (!ManualSearchLocks.ContainsKey(tokenHash))
                {
                    ManualSearchLocks.Add(tokenHash, new object());
                }
            }
        }

        lock (ManualSearchLocks[tokenHash])
        {
            if (context.Cache[cacheIndex] == null)
            {
                var searchResponse = new SearchResponse(tokens, forManual, query);
                context.Cache.Add(cacheIndex, searchResponse, null,
                    DateTime.Now.AddMinutes(Settings.Search.SearchResultsAbsoluteTimeoutMins),
                    Cache.NoSlidingExpiration, CacheItemPriority.BelowNormal, null);
            }

            ManualSearchLocks.Remove(tokenHash);
        }
    }

    return (SearchResponse)context.Cache[cacheIndex];
}

Questions

  • Is this a reasonable implementation?
  • Is this thread safe?
  • Is it a problem that I remove the lock entry from the dictionary while still holding that same lock?
1 answer

Your concurrent use of ManualSearchLocks is not safe; a ConcurrentDictionary would be a good replacement. And no, even just reading from a plain Dictionary from multiple threads is not safe, because Dictionary is not documented to be thread-safe for concurrent access.
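
As a rough sketch (not the asker's actual code), the per-key lock dictionary could be swapped for a ConcurrentDictionary like this; the member and parameter names simply mirror the question's code:

    using System.Collections.Concurrent;

    internal static class ManualSearchLocking
    {
        // Thread-safe replacement for the Dictionary + MasterManualSearchLock pair.
        private static readonly ConcurrentDictionary<string, object> ManualSearchLocks =
            new ConcurrentDictionary<string, object>();

        public static object GetLockFor(string tokenHash)
        {
            // GetOrAdd is safe to call from many threads at once. The value factory
            // may run more than once under contention, but an extra throwaway object
            // is harmless; only one is ever stored and handed back to all callers.
            return ManualSearchLocks.GetOrAdd(tokenHash, _ => new object());
        }
    }

DoSearch would then lock on GetLockFor(tokenHash) instead of doing the hand-rolled double-checked dictionary dance.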

I would put a Lazy<T> in the cache instead. Many such Lazy instances may get created, but only one will ever be materialized: all threads that need a particular key call Lazy.Value and are synchronized on that instance automatically, so only one actual search runs per key.

Depending on how you access the cache, there may be a small race window that allows multiple Lazy instances to be created. That probably doesn't matter much in your case. A sketch of this approach follows.
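
This sketch reuses the question's own types (SearchResponse, Manual, Settings, Search.Functions) and cache-key scheme; it is one possible shape of the Lazy<T> approach, not a drop-in tested implementation:

    using System;
    using System.Threading;
    using System.Web;
    using System.Web.Caching;

    public static class ManualSearchWithLazy
    {
        public static SearchResponse DoSearch(string query, Manual forManual)
        {
            var tokens = Search.Functions.TokeniseSearchQuery(query);
            var tokenHash = Search.Functions.GetUniqueHashOfTokens(tokens);
            var cacheIndex = Settings.CachePrefix + "SavedManualSearch_" + tokenHash;
            var cache = HttpContext.Current.Cache;

            // The cache holds Lazy<SearchResponse> rather than the response itself.
            var lazy = (Lazy<SearchResponse>)cache[cacheIndex];
            if (lazy == null)
            {
                var candidate = new Lazy<SearchResponse>(
                    () => new SearchResponse(tokens, forManual, query),
                    LazyThreadSafetyMode.ExecutionAndPublication);

                // Cache.Add returns the entry already stored under this key (or null
                // if there was none), so at most one Lazy instance gets published per key.
                var existing = (Lazy<SearchResponse>)cache.Add(
                    cacheIndex, candidate, null,
                    DateTime.Now.AddMinutes(Settings.Search.SearchResultsAbsoluteTimeoutMins),
                    Cache.NoSlidingExpiration, CacheItemPriority.BelowNormal, null);

                lazy = existing ?? candidate;
            }

            // The first thread to touch Value runs the search; everyone else blocks
            // on the same Lazy instance and then reuses its result.
            return lazy.Value;
        }
    }

Note that several candidate Lazy objects can still be constructed under contention, but only the published one is ever materialized, which is the behaviour described above.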
