The thread pool manager already does a pretty good job of this. It tries to limit the number of running threads to the number of CPU cores in your machine. When one thread completes, it immediately schedules another eligible thread for execution.
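The answer describes the .NET ThreadPool, but the same idea can be sketched in Java, where `Executors.newFixedThreadPool` gives an analogous pool sized to the core count: at most that many work items run at once, and finishing one immediately frees a slot for the next. This is an illustrative sketch of the concept, not the .NET implementation itself.

```java
import java.util.concurrent.*;

public class PoolDemo {
    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        // A pool sized to the core count, mirroring the thread pool's
        // starting point: no more running threads than cores.
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        CountDownLatch done = new CountDownLatch(cores * 4);
        for (int i = 0; i < cores * 4; i++) {
            pool.submit(() -> {
                // Short, CPU-bound work item; only 'cores' of these
                // execute concurrently, the rest wait in the queue.
                long sum = 0;
                for (int j = 0; j < 1_000_000; j++) sum += j;
                done.countDown();
            });
        }
        done.await(); // every queued item eventually gets a turn
        pool.shutdown();
        System.out.println("done");
    }
}
```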
Twice a second, it re-evaluates the running threads. Threads that have been running too long are assumed to be stalled, and it allows another thread to start. You will now have more running threads than you have cores. This can go all the way up to the maximum number of allowed threads, as set by ThreadPool.SetMaxThreads().
Starting with .NET 2.0 SP1, that default maximum was increased considerably, to 250 times the number of cores. You should never get there. If you do, you will have wasted about two minutes of wall-clock time during which a sub-optimal number of threads was running, and all of those threads must have been blocked for that long, which is not a typical execution pattern for a thread. On the other hand, if those threads are all waiting on the same resource, they will merely take turns, and adding more threads cannot improve throughput.
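A rough Java analogue of that cap is the `maximumPoolSize` of a `ThreadPoolExecutor`: the pool starts at the core count but is allowed to grow, up to a hard limit, when work backs up. The `250 * cores` figure below is borrowed from the .NET default described above purely for illustration; it is not a Java default.

```java
import java.util.concurrent.*;

public class MaxThreadsDemo {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        // Core size = number of cores; hard cap = 250 per core,
        // mirroring the post-SP1 .NET default for illustration only.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                cores, 250 * cores,
                60, TimeUnit.SECONDS,
                new SynchronousQueue<>());
        // The pool will never create more threads than this cap.
        System.out.println(pool.getMaximumPoolSize() == 250 * cores);
        pool.shutdown();
    }
}
```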
In short, the thread pool works well if you run jobs that finish quickly (seconds at most) and don't block for a long time. You should consider creating your own Thread objects when your code doesn't fit that pattern.
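The recommendation above translates directly: give long-running or blocking work a dedicated thread so it never occupies a pool slot. A minimal Java sketch, with `Thread.sleep` standing in for the lengthy blocking operation:

```java
public class DedicatedThreadDemo {
    public static void main(String[] args) throws Exception {
        // Long-running/blocking work gets its own thread instead of
        // tying up a thread-pool slot for its whole duration.
        Thread worker = new Thread(() -> {
            try {
                Thread.sleep(100); // stand-in for lengthy blocking work
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "long-running-worker");
        worker.start();
        worker.join(); // wait for the dedicated thread to finish
        System.out.println("worker finished");
    }
}
```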
Hans Passant