This is a very good question, and understanding it is key to understanding why asynchronous I/O is so important. The reason the new async/await feature was added in C# 5.0 was to make writing asynchronous code easier. Support for asynchronous processing on the server is not new; it has existed since ASP.NET 2.0.
As Steve showed you, with synchronous processing each request in ASP.NET (and WCF) takes one thread from the thread pool. The problem he demonstrated is a well-known one called thread pool starvation. If you perform synchronous I/O on your server, the thread pool thread stays blocked (doing nothing) for the duration of the I/O. Since the number of threads in the thread pool is limited, under load this can lead to a situation where all the thread pool threads are blocked waiting on I/O and requests start queuing up, which drives response times up. Because all the threads are just waiting for I/O to complete, you will see CPU usage close to 0% even though response times go through the roof.
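To make the blocking concrete, here is a minimal sketch of what the synchronous pattern looks like in an ASP.NET MVC action. The controller name and the downstream URL are hypothetical; the only point is that the thread pool thread that enters the action is held for the whole duration of the call.

```csharp
using System.Net;
using System.Web.Mvc;

public class ReportController : Controller
{
    // Synchronous action: the thread pool thread that entered this method
    // stays blocked inside DownloadString for the whole duration of the I/O.
    public ActionResult Index()
    {
        using (var client = new WebClient())
        {
            // Blocking network call; the thread does no useful work while it waits.
            string data = client.DownloadString("http://example.com/slow-service");
            return Content(data);
        }
    }
}
```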
What you are asking (why can't we just use a bigger thread pool?) is a very good question. In fact, that is how most people have dealt with thread pool starvation until now: just put more threads in the thread pool. Some Microsoft documentation even points to that as the fix when thread pool starvation can occur. It is an acceptable solution, and before C# 5.0 it was much easier to do than rewriting your code to be fully asynchronous.
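For illustration, a minimal sketch of that "just add more threads" workaround, assuming an ASP.NET application that raises the pool minimums at startup; the value 200 is purely illustrative and would have to be tuned per workload.

```csharp
using System.Threading;
using System.Web;

public class MvcApplication : HttpApplication
{
    protected void Application_Start()
    {
        // Raise the pool's floor so ASP.NET does not have to inject new
        // threads slowly while requests are already queuing under load.
        // 200/200 is an illustrative guess, not a recommendation.
        ThreadPool.SetMinThreads(200, 200); // worker threads, I/O completion threads
    }
}
```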
There are several problems with the approach:
There is no value that works in all situations: the number of thread pool threads you need depends linearly on the I/O duration and the load on your server. Unfortunately, I/O latency is largely unpredictable. Here is an example: suppose your ASP.NET application makes HTTP requests to a third-party web service, and those requests take about 2 seconds to complete. You run into thread pool starvation, so you increase the thread pool size to, say, 200 threads, and everything works fine again. The problem is that next week the web service might have technical problems that push its response time up to 10 seconds. All of a sudden thread pool starvation is back, because the threads are now blocked 5 times longer, so you need 5 times as many of them: 1,000 threads.
Scalability and performance: the second problem is that even if you do this, you are still using one thread per request. Threads are an expensive resource; each managed thread in .NET requires 1 MB of memory for its stack. For a web page whose I/O lasts 5 seconds, under a load of 500 requests per second, you need 2,500 threads in the thread pool, which means 2.5 GB of memory spent on stacks of threads that are doing nothing (a quick back-of-the-envelope calculation follows this list). Then you run into context-switching overhead, which hurts the performance of the whole machine (affecting every service on it, not just your web application). Although Windows does a fairly good job of ignoring waiting threads, it was not designed to handle that many of them. Remember that the highest efficiency is reached when the number of running threads equals the number of logical processors on the machine (usually no more than 16).
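As promised above, here is the back-of-the-envelope sizing math behind those two points, as a tiny sketch. The numbers are the illustrative ones used in this answer, not measurements.

```csharp
using System;

class ThreadPoolSizing
{
    static void Main()
    {
        // Little's-law style estimate: concurrent requests = arrival rate x time in system.
        double requestsPerSecond = 500; // load from the example above
        double ioSeconds = 5;           // how long each request sits blocked on I/O

        double blockedThreads = requestsPerSecond * ioSeconds; // 2,500 threads doing nothing
        double stackGigabytes = blockedThreads * 1.0 / 1024;   // ~1 MB of stack each -> ~2.4 GB

        Console.WriteLine("Threads blocked on I/O: {0}", blockedThreads);
        Console.WriteLine("Memory spent on idle stacks: {0:F1} GB", stackGigabytes);
    }
}
```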
So, while increasing the size of the thread pool is a solution, and people have been doing it for a decade (even in Microsoft's own products), it is less scalable, less efficient in terms of memory and CPU usage, and you are always at the mercy of a sudden increase in I/O latency that triggers starvation. Prior to C# 5.0, the complexity of asynchronous code made it not worth the trouble for many people. async/await changes all of that, because now you can get the scalability of asynchronous I/O and still write simple code.
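For contrast, a minimal async counterpart to the earlier synchronous sketch (same hypothetical URL): the await releases the thread pool thread while the HTTP call is in flight instead of blocking it.

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Mvc;

public class AsyncReportController : Controller
{
    private static readonly HttpClient Client = new HttpClient();

    // Asynchronous action: while the HTTP call is in flight, no thread pool
    // thread is held. The thread goes back to the pool and serves other
    // requests; one is borrowed again only when the response arrives.
    public async Task<ActionResult> Index()
    {
        string data = await Client.GetStringAsync("http://example.com/slow-service");
        return Content(data);
    }
}
```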
More details: http://msdn.microsoft.com/en-us/library/ff647787.aspx "Use asynchronous calls to call web services or remote objects when it is possible to perform additional parallel processing while the web service call proceeds. Where possible, avoid synchronous (blocking) calls to web services, because outgoing web service calls are made using threads from the ASP.NET thread pool. Blocking calls reduce the number of threads available to handle other incoming requests."
— Flavien