Caching tasks when running them in parallel with WhenAll

So, I have this small block of code that will perform several tasks in parallel.

// no wrapping in Task needed, the method is already async
var activityList = await dataService.GetActivitiesAsync();

// Select a good enough tuple
var results = (from activity in activityList
               select new { 
                Activity = activity, 
                AthleteTask = dataService.GetAthleteAsync(activity.AthleteID)
               }).ToList(); // begin enumeration

// Wait for them all to finish, i.e. relinquish control of the thread while they run
await Task.WhenAll(results.Select(t => t.AthleteTask));

// Set the athletes
foreach(var pair in results)
{
  pair.Activity.Athlete = pair.AthleteTask.Result;
}

So I load the athlete data for each given activity, but the same athlete may be requested several times. How can I guarantee that GetAthleteAsync only goes out to the web for fresh data when the athlete is not already in an in-memory cache?

I am currently trying to use a ConcurrentDictionary<int, Athlete> inside the GetAthleteAsync method:

private async Task<Athlete> GetAthleteAsync(int athleteID)
{
    if (cacheAthletes.ContainsKey(athleteID))
        return cacheAthletes[athleteID];

    // else fetch it from the web and add it to the cache
}
+4
3 answers

The trick is to make the ConcurrentDictionary cache Task<Athlete> objects rather than Athlete objects. A Task<T> is a promise of a future T, so it can go into the cache the moment the request is started, and every consumer can simply await it whether it has completed yet or not. Declare the cache as:

ConcurrentDictionary<int, Task<Athlete>> cacheAthletes;

Note that ConcurrentDictionary.GetOrAdd may invoke the value factory more than once if two threads race on the same key, but only one of the resulting tasks is stored and handed back to all callers, so the worst case is a redundant request, never an inconsistent result. Also, the lookup method no longer needs to be async, because it just returns a task:

private Task<Athlete> GetAthleteAsync(int athleteID)
{
  return cacheAthletes.GetOrAdd(athleteID, id => LoadAthleteAsync(id));
}

private async Task<Athlete> LoadAthleteAsync(int athleteID)
{
  // Load from web
}

Every caller that awaits the cached Task<Athlete> observes the same result: while the download is still in flight they all wait on the same task, and once it has completed, later lookups for that ID return the finished task immediately.
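
To illustrate, here is a minimal sketch of how the cached GetAthleteAsync above slots back into the question's WhenAll pipeline (the PopulateAthletesAsync wrapper is a name of mine, not part of the answer); duplicate athlete IDs now resolve to the same cached task, so each athlete is fetched from the web at most once:

// Requires System.Collections.Generic, System.Linq and System.Threading.Tasks.
public async Task PopulateAthletesAsync(IEnumerable<Activity> activities)
{
    // Start (or reuse) one cached task per activity; duplicate IDs share the same task.
    var pairs = activities
        .Select(a => new { Activity = a, AthleteTask = GetAthleteAsync(a.AthleteID) })
        .ToList();

    // Await everything once, exactly as in the original code.
    await Task.WhenAll(pairs.Select(p => p.AthleteTask));

    foreach (var pair in pairs)
        pair.Activity.Athlete = pair.AthleteTask.Result;
}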

+4

Another option is a small generic helper that caches tasks in a MemoryCache under a string key, so any asynchronous operation can be cached the same way. Something like this:

ObjectCache _cache = MemoryCache.Default;
static object _lockObject = new object();

public Task<T> GetAsync<T>(string cacheKey, Func<Task<T>> func, TimeSpan? cacheExpiration = null) where T : class
{
    // Fast path: return the cached task without taking the lock.
    var task = (Task<T>)_cache[cacheKey];
    if (task != null) return task;

    lock (_lockObject)
    {
        // Double-check inside the lock so only one task is started per key.
        task = (Task<T>)_cache[cacheKey];
        if (task != null) return task;

        task = func();
        Set(cacheKey, task, cacheExpiration);

        // Evict tasks that fault or are cancelled so they can be retried later.
        task.ContinueWith(t =>
        {
            if (t.Status != TaskStatus.RanToCompletion)
                _cache.Remove(cacheKey);
        });
    }
    return task;
}
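
The Set call above refers to a helper that is not shown. A minimal assumed implementation of it, together with a sketch of how the question's GetAthleteAsync could sit on top of GetAsync (FetchAthleteFromWebAsync is a hypothetical web call and the ten-minute expiration is arbitrary), might look like this:

// Assumed implementation of the Set helper: wraps ObjectCache.Set with an optional absolute expiration.
private void Set(string cacheKey, object value, TimeSpan? cacheExpiration)
{
    var policy = new CacheItemPolicy();
    if (cacheExpiration.HasValue)
        policy.AbsoluteExpiration = DateTimeOffset.Now.Add(cacheExpiration.Value);
    _cache.Set(cacheKey, value, policy);
}

// Hypothetical wiring of the helper into the question's GetAthleteAsync.
private Task<Athlete> GetAthleteAsync(int athleteID)
{
    return GetAsync(
        cacheKey: "athlete:" + athleteID,
        func: () => FetchAthleteFromWebAsync(athleteID),  // hypothetical web call
        cacheExpiration: TimeSpan.FromMinutes(10));       // arbitrary expiration
}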
+1

Caching the Task objects rather than the resulting values is the right idea, but there are a few things to watch out for:

  • Failed downloads get cached too. If GetAthleteAsync faults, the faulted task stays in the cache and every later request for that athlete keeps seeing the same error, so failed tasks need to be evicted.
  • With a non-generic cache (such as MemoryCache) there is no atomic get-or-add, so two callers that miss at the same time can each start their own download for the same athlete.
  • A plain ConcurrentDictionary never expires or evicts anything, so the cache keeps growing for the lifetime of the application.

I have a blog post about caching task objects, with sample code, that addresses all of the points above and may be useful in your situation. Basically, my solution is to store Lazy<Task<T>> objects in a MemoryCache.
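
This is not the code from that blog post, just a minimal sketch of the Lazy<Task<T>>-in-MemoryCache idea applied to the question (the AthleteCache class, the LoadAthleteAsync web call and the ten-minute expiration are assumptions of mine). AddOrGetExisting returns null when our entry was the one inserted, the Lazy wrapper guarantees the download delegate runs at most once per cache entry, and the continuation evicts faulted tasks so transient errors are not cached:

using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

class AthleteCache
{
    private readonly MemoryCache _cache = MemoryCache.Default;

    public Task<Athlete> GetAthleteAsync(int athleteID)
    {
        string key = "athlete:" + athleteID;

        // Wrap the download in Lazy<Task<T>> so the factory runs at most once per cache entry.
        var newEntry = new Lazy<Task<Athlete>>(() => LoadAthleteAsync(athleteID));

        // AddOrGetExisting returns the existing entry, or null if ours was inserted.
        var existing = (Lazy<Task<Athlete>>)_cache.AddOrGetExisting(
            key, newEntry, DateTimeOffset.Now.AddMinutes(10));  // arbitrary expiration

        var task = (existing ?? newEntry).Value;

        // Evict faulted or cancelled tasks so a transient failure is not cached.
        task.ContinueWith(t =>
        {
            if (t.IsFaulted || t.IsCanceled)
                _cache.Remove(key);
        }, TaskContinuationOptions.ExecuteSynchronously);

        return task;
    }

    private Task<Athlete> LoadAthleteAsync(int athleteID)
    {
        // Hypothetical web call, as in the question.
        throw new NotImplementedException();
    }
}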

0