I have read all the related questions on SO, but I am still a little confused about the best approach for my scenario, where several web service calls are fired.
I have an aggregator service that accepts input, parses and translates it into multiple web requests, issues the web requests (they are unrelated, so they can run in parallel), and combines the responses into a single response sent back to the caller. The following code is in use right now:
list.ForEach((object obj) =>
{
    tasks.Add(Task.Factory.StartNew(
        (object state) => { this.ProcessRequest(state); },
        obj,
        CancellationToken.None,
        TaskCreationOptions.AttachedToParent,
        TaskScheduler.Default));
});
await Task.WhenAll(tasks);
The await Task.WhenAll(tasks) part comes from a Scott Hanselman post, which states:

"The best solution in terms of scalability, says Stephen, is to use asynchronous I/O. When you call out over the network, there is no reason (other than convenience) to block threads while waiting for the response to return."
The existing code consumes too many threads, and CPU usage climbs to 100% under load, which makes me question it.
Another alternative is Parallel.ForEach, which uses a partitioner under the hood but also "blocks" the calling thread; that is acceptable in my scenario, since the caller has to wait for the combined response anyway.
Given that all of this is "async I/O work" rather than "CPU-bound" work, and the web requests are not long-running (they return in no more than 3 seconds), I am inclined to believe the existing code is good enough. But would it provide better throughput than Parallel.ForEach? Parallel.ForEach presumably uses the minimum number of tasks thanks to its partitioning, and therefore makes optimal use of threads (?). I tried Parallel.ForEach in some local tests and it did not seem to be any better.
The goal is to reduce CPU time and increase throughput, and therefore scalability. Is there a better approach for processing the web requests in parallel?
Appreciate any input, thanks.
EDIT: The ProcessRequest method shown in the sample code does indeed use HttpClient and its asynchronous methods to send requests (PostAsync, GetAsync, PutAsync).
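For context, this is roughly what I imagine the fully asynchronous version would look like, with no Task.Factory.StartNew at all. This is only a sketch: ProcessRequestAsync, its string return type, and the assumption that each list item carries a URL are all hypothetical, since the real ProcessRequest signature is not shown above.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

public class Aggregator
{
    // A single shared HttpClient instance, reused across requests.
    private static readonly HttpClient client = new HttpClient();

    // Hypothetical async counterpart of ProcessRequest. Awaiting
    // GetAsync means no thread is blocked while the request is in flight.
    private async Task<string> ProcessRequestAsync(object state)
    {
        var url = (string)state; // assumption: each item is a URL
        var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }

    public async Task<string[]> AggregateAsync(List<object> list)
    {
        // Start all requests, then await them all; the combined
        // results come back as an array in input order.
        var tasks = list.Select(obj => ProcessRequestAsync(obj));
        return await Task.WhenAll(tasks);
    }
}
```

The idea is that since the work is pure I/O, the tasks returned by HttpClient's async methods can be awaited directly, so neither Task.Factory.StartNew nor Parallel.ForEach should be needed to get the fan-out.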