Correct use of locking in ASP.NET

I have an ASP.NET site with a rather slow search function, and I want to improve performance by caching the results for one hour, using the query as the cache key:

    using System;
    using System.Web;
    using System.Web.Caching;

    public class Search
    {
        private static object _cacheLock = new object();

        public static string DoSearch(string query)
        {
            string results = "";
            if (HttpContext.Current.Cache[query] == null)
            {
                lock (_cacheLock)
                {
                    if (HttpContext.Current.Cache[query] == null)
                    {
                        results = GetResultsFromSlowDb(query);
                        HttpContext.Current.Cache.Add(query, results, null,
                            DateTime.Now.AddHours(1), Cache.NoSlidingExpiration,
                            CacheItemPriority.Normal, null);
                    }
                    else
                    {
                        results = HttpContext.Current.Cache[query].ToString();
                    }
                }
            }
            else
            {
                results = HttpContext.Current.Cache[query].ToString();
            }
            return results;
        }

        private static string GetResultsFromSlowDb(string query)
        {
            return "Hello World!";
        }
    }

Suppose visitor A searches. The cache is empty, the lock is taken, and the result is queried from the database. Now visitor B arrives with a different search: won't visitor B have to wait at the lock until visitor A's search has completed? What I really want is for B to hit the database right away, because the results will be different and the database can handle multiple requests; I just don't want to repeat expensive, unnecessary queries.

What would be the right approach for this scenario?

3 answers

Unless you are absolutely certain that it is critical to avoid redundant queries, I would avoid the lock altogether. The ASP.NET cache is inherently thread-safe, so the only drawback of the following code is that you might temporarily see a few redundant queries racing each other when their associated cache entry expires:

    public static string DoSearch(string query)
    {
        var results = (string)HttpContext.Current.Cache[query];
        if (results == null)
        {
            results = GetResultsFromSlowDb(query);
            HttpContext.Current.Cache.Insert(query, results, null,
                DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
        }
        return results;
    }

If you decide you really must avoid all redundant queries, you can use a set of more granular locks, one lock per query:

    public static string DoSearch(string query)
    {
        var results = (string)HttpContext.Current.Cache[query];
        if (results == null)
        {
            object miniLock = _miniLocks.GetOrAdd(query, k => new object());
            lock (miniLock)
            {
                results = (string)HttpContext.Current.Cache[query];
                if (results == null)
                {
                    results = GetResultsFromSlowDb(query);
                    HttpContext.Current.Cache.Insert(query, results, null,
                        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
                }

                // Clean up: remove the lock only if it is still the one we took.
                object temp;
                if (_miniLocks.TryGetValue(query, out temp) && (temp == miniLock))
                    _miniLocks.TryRemove(query, out temp);
            }
        }
        return results;
    }

    // Requires: using System.Collections.Concurrent;
    private static readonly ConcurrentDictionary<string, object> _miniLocks =
        new ConcurrentDictionary<string, object>();
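The per-query lock set above relies on `ConcurrentDictionary.GetOrAdd` handing every thread that asks for the same key the same lock object, while unrelated keys get independent locks. A minimal standalone sketch of that behavior (no ASP.NET dependency; the key names are made up for illustration):

```csharp
using System;
using System.Collections.Concurrent;

// Per-query lock table, as in the answer above.
var miniLocks = new ConcurrentDictionary<string, object>();

// Two lookups for the same key must yield the same lock object,
// so threads searching for the same query serialize on it.
object a1 = miniLocks.GetOrAdd("apples", k => new object());
object a2 = miniLocks.GetOrAdd("apples", k => new object());

// A different key gets its own lock, so unrelated queries do not queue.
object b = miniLocks.GetOrAdd("oranges", k => new object());

Console.WriteLine(ReferenceEquals(a1, a2)); // True
Console.WriteLine(ReferenceEquals(a1, b));  // False
```

This is exactly what lets visitor B proceed immediately in the original question: B's query maps to a different lock than A's, so only duplicate queries wait on each other.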

There is a potential race condition in your code:

    if (HttpContext.Current.Cache[query] == null)
    {
        ...
    }
    else
    {
        // When you get here, another thread may have removed the item from
        // the cache, so this lookup can return null and the ToString() call
        // will throw.
        results = HttpContext.Current.Cache[query].ToString();
    }

In general, I would not use a lock, and would do it as follows to avoid the race condition:

    results = (string)HttpContext.Current.Cache[query];
    if (results == null)
    {
        results = GetResultsFromSomewhere();
        HttpContext.Current.Cache.Add(query, results, ...);
    }
    return results;

In the above case, multiple threads may end up loading the data if they detect a cache miss at about the same time. In practice this is likely to be rare, and in most cases inconsequential, because the data they load will be equivalent.
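That benign race can be simulated without ASP.NET. In this sketch a `ConcurrentDictionary` stands in for the thread-safe cache and a deterministic helper stands in for the slow database call (both names are made up for illustration): several racing callers may all compute the value, but whichever write lands last is still the correct, equivalent data.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Stand-in for the thread-safe ASP.NET cache.
var cache = new ConcurrentDictionary<string, string>();

// Stand-in for the slow database call; deterministic per query.
string GetResultsFromSlowDb(string query) => "results for " + query;

string DoSearch(string query)
{
    string results;
    if (!cache.TryGetValue(query, out results))
    {
        // Several racing callers may all get here; each computes
        // equivalent data, so the last write to land is still correct.
        results = GetResultsFromSlowDb(query);
        cache[query] = results;
    }
    return results;
}

// Eight concurrent searches for the same query.
var tasks = new Task<string>[8];
for (int i = 0; i < tasks.Length; i++)
    tasks[i] = Task.Run(() => DoSearch("apples"));
Task.WaitAll(tasks);

foreach (var t in tasks)
    Console.WriteLine(t.Result); // "results for apples" every time
```

The cost of the race is at most a few duplicate database calls right after expiry, never a wrong answer.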

But if you want to use a lock to prevent this, you can do it like this:

    results = HttpContext.Current.Cache[query];
    if (results == null)
    {
        lock (someLock)
        {
            results = HttpContext.Current.Cache[query];
            if (results == null)
            {
                results = GetResultsFromSomewhere();
                HttpContext.Current.Cache.Add(query, results, ...);
            }
        }
    }
    return results;

Your code is correct. You are also using the double-checked locking pattern (if / lock / if), which prevents the race condition that is a common mistake when it is omitted, and it does not block access to items already in the cache.

The only problem is that when many clients miss the cache at the same time, they will queue on the lock; for that reason I would move results = GetResultsFromSlowDb(query); out of the lock:

    public static string DoSearch(string query)
    {
        string results = "";
        if (HttpContext.Current.Cache[query] == null)
        {
            results = GetResultsFromSlowDb(query); // moved out of the lock
            lock (_cacheLock)
            {
                if (HttpContext.Current.Cache[query] == null)
                {
                    HttpContext.Current.Cache.Add(query, results, null,
                        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration,
                        CacheItemPriority.Normal, null);
                }
                else
                {
                    results = HttpContext.Current.Cache[query].ToString();
                }
            }
        }
        else
        {
            results = HttpContext.Current.Cache[query].ToString();
        }
        return results;
    }

If it is still slow after that, your problem is elsewhere.

