Multiple async awaits in .NET WebApi

We have several asynchronous controllers and services that use the async/await keywords.

A few of the actions look a bit like this:

    public async Task<SomeViewModel> Get(int id)
    {
        var someData = await _service.GetData(id);
        var someOtherData = await _service.GetMoreData(id);

        return new SomeViewModel
        {
            Data = someData,
            OtherData = someOtherData,
        };
    }

It is possible that the service calls themselves contain multiple awaits. The await will usually be against an async call to Entity Framework, a service bus, or a third-party web endpoint.
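
For illustration, a service method behind one of those awaits might look roughly like the sketch below. The class name, the HttpClient usage, and the endpoint URL are assumptions made for this example, not details from our actual code.

    using System.Net.Http;
    using System.Threading.Tasks;

    public class DataService
    {
        private static readonly HttpClient Client = new HttpClient();

        // Hypothetical example: the await is against a third-party web endpoint,
        // so no thread is tied up while the HTTP response is outstanding.
        public async Task<string> GetData(int id)
        {
            var response = await Client.GetAsync("https://thirdparty.example.com/data/" + id);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }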

One of my colleagues today made the assertion that this kind of code is pointless, that it just creates extra work for the runtime in managing threads, and that under load we will in fact generate more work for the runtime and slow the application down as a result.

Are they correct, and if so, what is considered best practice for async/await when you have multiple related I/O calls in a Web API request?

+6
3 answers

One of my colleagues today made the assertion that this kind of code is pointless, that it just creates extra work for the runtime in managing threads, and that under load we will in fact generate more work for the runtime and slow the application down as a result.

This is amusing, because the opposite is true. As the other answers have noted, if you use truly asynchronous operations (i.e. not Task.Run or the like), then fewer threads are used, and the application holds up better under load.
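
A minimal sketch of that distinction (the repository type and methods are hypothetical, included only to contrast the two shapes):

    using System.Threading.Tasks;

    // "Fake" async: Task.Run only moves the blocking call to another thread
    // pool thread, so a thread is still occupied for the whole duration.
    public Task<SomeData> GetDataFakeAsync(int id)
    {
        return Task.Run(() => _repository.GetDataBlocking(id));
    }

    // True async: no thread is consumed while the underlying I/O is in flight.
    public async Task<SomeData> GetDataAsync(int id)
    {
        return await _repository.GetDataAsync(id);
    }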

Some people (not me) have done studies on "average" ASP.NET applications converting to async, and they have seen scalability improvements anywhere from 10x to 100x by switching to async as opposed to blocking calls. The exact benefit you can expect depends on how much asynchronous operation your application performs.

If you look at a single request, and each operation within it runs one at a time, then the asynchronous version is slightly slower. But if you look at the system as a whole - especially under load - the asynchronous version scales better. Another often-overlooked aspect of asynchronous handlers is that they respond to sudden bursts of load more quickly than the thread pool can inject new threads.

In addition, asynchronous code makes it easy to run the independent I/O calls concurrently, which can also speed up individual requests:

    public async Task<SomeViewModel> Get(int id)
    {
        var someDataTask = _service.GetData(id);
        var someOtherDataTask = _service.GetMoreData(id);

        await Task.WhenAll(someDataTask, someOtherDataTask);

        return new SomeViewModel
        {
            Data = await someDataTask,
            OtherData = await someOtherDataTask,
        };
    }
+10

If your application pool only ever handles one request at a time, then blocking will be the more efficient option.

If you have almost any parallelism at all, async/await is likely to be more efficient because it results in fewer threads and less context switching. For that reason, I/O-bound workloads (where blocking would otherwise cause a lot of context switching) are one of the places where async/await is best suited.
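
For comparison, this is the blocking shape being discussed; the synchronous GetDataBlocking/GetMoreDataBlocking methods are assumed counterparts of the service calls in the question, not real APIs:

    // Each call here holds the request thread until its I/O completes,
    // which is fine for a single request but scales poorly under load.
    public SomeViewModel Get(int id)
    {
        var someData = _service.GetDataBlocking(id);
        var someOtherData = _service.GetMoreDataBlocking(id);

        return new SomeViewModel
        {
            Data = someData,
            OtherData = someOtherData,
        };
    }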

As boklucius answered, asynchronous I/O in .NET uses I/O completion ports under the covers, so no thread sits blocked while the I/O is pending; a thread pool thread is only used briefly to complete the I/O. Using async definitely does not inflate your thread count.
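
A quick way to see this in practice (a sketch, not from the answer): log the managed thread ID before and after the await. Under load the two IDs will frequently differ, because the original thread goes back to the pool while the I/O is pending and the continuation runs on whichever thread is available.

    using System.Diagnostics;
    using System.Threading;
    using System.Threading.Tasks;

    public async Task<SomeViewModel> Get(int id)
    {
        Trace.WriteLine("Before await: thread " + Thread.CurrentThread.ManagedThreadId);

        // The request thread is released here while the I/O is in flight.
        var someData = await _service.GetData(id);

        Trace.WriteLine("After await: thread " + Thread.CurrentThread.ManagedThreadId);

        return new SomeViewModel { Data = someData };
    }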

+2

You can take a look at Async Performance: Understanding the Costs of Async and Await by Stephen Toub.

The I/O is done with completion ports, not with extra threads, so the context-switching overhead should be negligible. As always, it depends on the context; measure if in doubt.
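
If you do want to measure it, something along the lines of the sketch below works (the names are illustrative); note that it only captures per-call overhead on a single request, not scalability under load, which is where async pays off.

    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    public static class OverheadCheck
    {
        public static async Task CompareAsync(Func<Task> asyncWork, Action blockingWork, int iterations)
        {
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
                await asyncWork();
            Console.WriteLine("async:    " + sw.ElapsedMilliseconds + " ms");

            sw.Restart();
            for (int i = 0; i < iterations; i++)
                blockingWork();
            Console.WriteLine("blocking: " + sw.ElapsedMilliseconds + " ms");
        }
    }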

My own experience has been that the overhead is not that great and that the ease of use is worth it (our use case: a TCP server for an obscure custom protocol with several slow clients connected at the same time). It was a rewrite of a C++/threads/COM/Win32 application; the .NET version using await/async has 3x the throughput in 1/5 of the lines of code, but as I said, it depends.
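
For what it's worth, the core of that kind of async TCP server boils down to an accept/read loop like the sketch below (protocol handling omitted; all names are illustrative, this is not our actual code):

    using System.Net;
    using System.Net.Sockets;
    using System.Threading.Tasks;

    public class ProtocolServer
    {
        public async Task RunAsync(int port)
        {
            var listener = new TcpListener(IPAddress.Any, port);
            listener.Start();

            while (true)
            {
                // No thread is blocked while waiting for the next client.
                TcpClient client = await listener.AcceptTcpClientAsync();
                Task ignored = HandleClientAsync(client); // each client handled concurrently
            }
        }

        private static async Task HandleClientAsync(TcpClient client)
        {
            using (client)
            using (NetworkStream stream = client.GetStream())
            {
                var buffer = new byte[4096];
                int read;
                // A slow client costs a buffer in memory, not a blocked thread.
                while ((read = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
                {
                    await stream.WriteAsync(buffer, 0, read);
                }
            }
        }
    }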

+1
