I wrote a simple asynchronous load-testing library, and it also has a console interface for running tests from the command line.
Basically, it fires a huge number of requests concurrently, aggregates the results, and shows a summary and a simple histogram. Nothing fancy. But since I run a lot of the tests against the local machine, I wanted the test tool itself to stay out of the way and use as few resources as possible, so the measurements stay reasonably accurate. So it uses the bare asynchronous Begin/End (APM) pattern to keep the overhead as low as possible.
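(For context, here is a minimal sketch of the kind of Begin/End call I mean, using HttpWebRequest against a placeholder URL; this is illustrative, not the library's actual code.)

using System;
using System.Net;

class ApmSketch
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://localhost:8080");

        // BeginGetResponse returns immediately; the callback fires on an
        // IOCP thread pool thread once the response arrives.
        request.BeginGetResponse(asyncResult =>
        {
            using (var response = (HttpWebResponse)request.EndGetResponse(asyncResult))
            {
                Console.WriteLine("Status: {0}", (int)response.StatusCode);
            }
        }, null);

        Console.ReadLine(); // keep the process alive until the callback runs
    }
}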
Everything is fully asynchronous, it works, and it stays out of the way (well, mostly). But the thread count during a typical run was over 40, which is a real waste of resources on a machine with 4 hardware threads, especially given that the server under test is also running on the same machine.
I am already running the program inside an AsyncContext, which is basically just a simple synchronization context that keeps everything on one thread, so all async callbacks come back on the main thread. Perfect.
Now all I had to do was cap the ThreadPool's maximum thread count and see how well it worked. I limited it to the actual core count: 4 worker threads and 4 IOCP threads.
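(A simplified sketch of this setup, assuming the single-threaded AsyncContext from the Nito.AsyncEx package; RunBenchmarkAsync is just a placeholder for my own code.)

using System.Threading;
using System.Threading.Tasks;
using Nito.AsyncEx;

class Program
{
    static void Main()
    {
        // Cap the pool at the core count (4 workers + 4 IOCP on my box).
        // SetMaxThreads returns false if the value is below the processor count.
        ThreadPool.SetMinThreads(4, 4);
        ThreadPool.SetMaxThreads(4, 4);

        // Single-threaded context: every await inside RunBenchmarkAsync
        // resumes on this one main thread.
        AsyncContext.Run(() => RunBenchmarkAsync());
    }

    static async Task RunBenchmarkAsync()
    {
        // ... fire the requests, aggregate results, print the histogram ...
        await Task.Yield();
    }
}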
Result
Exception: "ThreadPool does not have enough free threads to complete the operation.
Well, this is not a new problem, and it comes up all over the internet. But isn't the whole point of the ThreadPool that you can queue work onto it and have it run whenever a thread becomes available?
In fact, the method is even named QueueUserWorkItem, and the documentation says exactly that: "Queues a method for execution. The method executes when a thread pool thread becomes available."
Now, if there are not enough free threads, the expected outcome would be, at worst, a slowdown of the program. IOCP and async work items should simply be queued, so why is it implemented in a way that confusingly fails instead? Increasing the number of threads is not a solution when the whole point of a ThreadPool is that it queues the work.
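To illustrate the expectation: with only synchronous work in the delegate, a capped pool just serializes the items and nothing throws; they wait in the queue until a worker frees up. A minimal sketch (limits mirror the repro further down; SetMaxThreads may return false if the value is below your processor count):

using System;
using System.Threading;

class QueueOnly
{
    static void Main()
    {
        ThreadPool.SetMinThreads(1, 1);
        ThreadPool.SetMaxThreads(2, 2);

        for (int i = 0; i < 5; i++)
        {
            int n = i;
            ThreadPool.QueueUserWorkItem(_ =>
            {
                Console.WriteLine("Item {0} on thread {1}", n, Thread.CurrentThread.ManagedThreadId);
                Thread.Sleep(1000); // purely synchronous "work"
            });
        }

        Console.ReadLine();
    }
}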
Edit - Clarification:
I am fully aware of the purpose of the ThreadPool and why the CLR spins up more threads when it must. I agree that this is the right behavior when there are blocking IO-bound tasks. But the point is that if you do in fact cap the threads in the ThreadPool, the expectation is that work gets queued until a free thread becomes available, not that an exception is thrown. Concurrency may suffer, the whole run may even get slower, but QueueUserWorkItem is meant to queue, not to either run the work when a new thread is available or fail - hence my speculative claim that this is an implementation bug, as stated in the title.
Update 1:
The same problem is described, with an example, in a Microsoft Support (KB) article: http://support.microsoft.com/default.aspx?scid=kb;EN-US;815637
The suggested workaround is, unsurprisingly, to increase the number of threads, since the work cannot be queued.
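(In other words, the workaround boils down to raising the ceilings before kicking off the requests, something along these lines; the numbers are arbitrary.)

using System;
using System.Threading;

class Workaround
{
    static void Main()
    {
        int workers, iocp;
        ThreadPool.GetMaxThreads(out workers, out iocp);

        // Raise the ceilings so the async I/O always finds a free thread.
        ThreadPool.SetMaxThreads(Math.Max(workers, 100), Math.Max(iocp, 100));
    }
}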
Note: the article refers to a very old runtime; a way to reproduce the same problem on runtime 4.5.1 is given below.
Update 2:
I ran the same code snippets on the Mono runtime, and its ThreadPool has no problem with them: the work gets queued and runs. The problem only occurs on the Microsoft CLR.
Update 3:
After @Noseratio rightly pointed out that he could not reproduce the problem with the same code on .NET 4.5.1, here is a code snippet that does reproduce it. To break code that otherwise queues and runs as expected, all you really need to do is add a true asynchronous call inside the queued delegate.
For example, simply adding the line below at the end of the delegate should end in an exception:
(await WebRequest.Create("http://www.google.com").GetResponseAsync()).Close();
Code to reproduce:
Here is the code, slightly modified from the MSKB article; it should fail quickly on .NET 4.5.1 on Windows 8.1.
(Feel free to change the URL and the thread limits.)
public static void Main()
{
    ThreadPool.SetMinThreads(1, 1);
    ThreadPool.SetMaxThreads(2, 2);

    for (int i = 0; i < 5; i++)
    {
        Console.WriteLine("Queued {0}", i);
        ThreadPool.QueueUserWorkItem(PoolFunc);
    }

    Console.ReadLine();
}

private static async void PoolFunc(object state)
{
    int workerThreads, completionPortThreads;
    ThreadPool.GetAvailableThreads(out workerThreads, out completionPortThreads);
    Console.WriteLine(
        "Available: WorkerThreads: {0}, CompletionPortThreads: {1}",
        workerThreads, completionPortThreads);

    Thread.Sleep(1000);

    string url = "http://localhost:8080";
    HttpWebRequest myHttpWebRequest = (HttpWebRequest)WebRequest.Create(url);

    // The truly asynchronous call below is what triggers the exception;
    // without it, the queued items run fine.
    using (var response = await myHttpWebRequest.GetResponseAsync())
    {
        Console.WriteLine("Got response from {0}", url);
    }
}
Any insight into what could lead to this behavior is greatly appreciated. Thanks.