How to set the maximum number of threads when using the TPL

I use the TPL to add new tasks to the system thread pool with Task.Factory.StartNew(). The only problem is that I am adding a lot of tasks, and I think this is too much for my processor to handle. Is there a way to set the maximum number of threads in this thread pool?

+7
4 answers

By default, the TaskScheduler (returned by TaskScheduler.Default ) is of the internal type ThreadPoolTaskScheduler . This implementation uses the ThreadPool class to queue tasks, unless the Task was created with TaskCreationOptions.LongRunning , in which case a new thread is created for each task.
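As a minimal sketch of that last point (the message text is just an illustration), passing TaskCreationOptions.LongRunning asks the default scheduler to use a dedicated thread rather than a ThreadPool worker:

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // LongRunning hints that this task should get its own thread
        // instead of occupying a ThreadPool worker for a long time.
        var task = Task.Factory.StartNew(
            () => Console.WriteLine("Running on a dedicated thread"),
            TaskCreationOptions.LongRunning);
        task.Wait();
    }
}
```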

So, if you want to limit the number of threads available to Task objects created with the default scheduler (for example via new Task(() => Console.WriteLine("In task")) ), you can limit the threads available in the global thread pool as follows:

    // Limit the thread pool size
    int workerThreads, completionPortThreads;
    ThreadPool.GetMaxThreads(out workerThreads, out completionPortThreads);
    workerThreads = 32;
    ThreadPool.SetMaxThreads(workerThreads, completionPortThreads);

ThreadPool.GetMaxThreads() is called first so that the existing completionPortThreads value is preserved rather than reduced.

Please note that this can be a bad idea: all tasks without an explicitly specified scheduler, and any number of other classes, use the ThreadPool by default, so setting the limit too low can cause side effects such as starvation.

+12

Typically, the TPL determines a good default thread pool size on its own. If you really need fewer threads, see the MSDN article "How to: Create a Task Scheduler That Limits the Degree of Concurrency".
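For illustration, the scheduler from that article can be sketched roughly like this (a simplified version modeled on the MSDN LimitedConcurrencyLevelTaskScheduler sample; the class and member names here are illustrative, and the sketch is not production-hardened):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// Simplified sketch of a TaskScheduler that limits the degree of concurrency,
// modeled on the MSDN LimitedConcurrencyLevelTaskScheduler sample.
public class LimitedConcurrencyScheduler : TaskScheduler
{
    private readonly LinkedList<Task> _tasks = new LinkedList<Task>();
    private readonly int _maxDegreeOfParallelism;
    private int _delegatesQueuedOrRunning;

    public LimitedConcurrencyScheduler(int maxDegreeOfParallelism)
    {
        if (maxDegreeOfParallelism < 1)
            throw new ArgumentOutOfRangeException(nameof(maxDegreeOfParallelism));
        _maxDegreeOfParallelism = maxDegreeOfParallelism;
    }

    public override int MaximumConcurrencyLevel => _maxDegreeOfParallelism;

    protected override void QueueTask(Task task)
    {
        lock (_tasks)
        {
            _tasks.AddLast(task);
            // Spin up another worker only if we are below the limit.
            if (_delegatesQueuedOrRunning < _maxDegreeOfParallelism)
            {
                _delegatesQueuedOrRunning++;
                ThreadPool.UnsafeQueueUserWorkItem(_ => ProcessPendingTasks(), null);
            }
        }
    }

    private void ProcessPendingTasks()
    {
        while (true)
        {
            Task item;
            lock (_tasks)
            {
                if (_tasks.Count == 0)
                {
                    _delegatesQueuedOrRunning--;
                    return;
                }
                item = _tasks.First.Value;
                _tasks.RemoveFirst();
            }
            TryExecuteTask(item);
        }
    }

    // Keep the sketch simple: never execute a task inline.
    protected override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued) => false;

    protected override IEnumerable<Task> GetScheduledTasks()
    {
        lock (_tasks) return _tasks.ToArray();
    }
}

class Program
{
    static void Main()
    {
        // At most 2 tasks run at once, regardless of how many are started.
        var factory = new TaskFactory(new LimitedConcurrencyScheduler(2));
        var tasks = Enumerable.Range(0, 8)
            .Select(i => factory.StartNew(() => Thread.Sleep(50)))
            .ToArray();
        Task.WaitAll(tasks);
        Console.WriteLine("All tasks completed");
    }
}
```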

+5

You should first investigate your performance problem. There are various issues that can lead to reduced utilization:

  • Scheduling long-running tasks without the TaskCreationOptions.LongRunning option
  • Trying to open more than two simultaneous connections to the same web address
  • Blocking while accessing the same resource
  • Trying to access the UI thread via Invoke() from multiple threads

In any of these cases, you have a scalability problem that cannot be solved simply by reducing the number of concurrent tasks. Your program may later run on dual-, quad-, or octa-core machines, and limiting the number of scheduled tasks would then simply leave CPU capacity unused.

+3

Normally, the TPL scheduler is good at deciding how many tasks should run at the same time, but if you really want to control it yourself, my blog post shows how to do it with both Tasks and Actions, and provides a sample project that you can download and run to see both in action.

An example of when you might want to explicitly limit the number of concurrent tasks is when you are calling your own services and do not want to overload your server.

For what you are describing, it sounds like you could benefit more from using async/await with your tasks to prevent unnecessary thread consumption. Whether that helps depends on whether your tasks are CPU-bound or IO-bound; if they are IO-bound, async/await is a good fit.
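As a sketch of that idea (Task.Delay stands in here for a real IO call such as an HTTP request, and the counts are arbitrary), IO-bound work can be throttled with a SemaphoreSlim and async/await so that no thread is blocked while the IO is pending:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        // Allow at most 3 operations in flight at once.
        using var throttler = new SemaphoreSlim(3);

        var tasks = Enumerable.Range(0, 10).Select(async i =>
        {
            await throttler.WaitAsync();
            try
            {
                // Stand-in for an IO-bound call (e.g. an HTTP request).
                // While awaiting, no thread is consumed.
                await Task.Delay(100);
                return i;
            }
            finally
            {
                throttler.Release();
            }
        });

        int[] results = await Task.WhenAll(tasks);
        Console.WriteLine($"Completed {results.Length} operations");
    }
}
```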

Regardless, you asked how to limit the number of tasks running at the same time, so here is code that shows how to do that with both Actions and Tasks.

Using Actions

When using Actions, you can use the built-in .NET Parallel.Invoke function. Here we restrict it to running at most 3 threads in parallel.

    var listOfActions = new List<Action>();
    for (int i = 0; i < 10; i++)
    {
        // Note that we create the Action here, but do not start it.
        listOfActions.Add(() => DoSomething());
    }

    var options = new ParallelOptions { MaxDegreeOfParallelism = 3 };
    Parallel.Invoke(options, listOfActions.ToArray());

Using Tasks

Since you are using Tasks, there is no built-in function here, but you can use the one I provide on my blog:

    /// <summary>
    /// Starts the given tasks and waits for them to complete. This will run, at most, the specified number of tasks in parallel.
    /// <para>NOTE: If one of the given tasks has already been started, an exception will be thrown.</para>
    /// </summary>
    /// <param name="tasksToRun">The tasks to run.</param>
    /// <param name="maxTasksToRunInParallel">The maximum number of tasks to run in parallel.</param>
    /// <param name="cancellationToken">The cancellation token.</param>
    public static void StartAndWaitAllThrottled(IEnumerable<Task> tasksToRun, int maxTasksToRunInParallel, CancellationToken cancellationToken = new CancellationToken())
    {
        StartAndWaitAllThrottled(tasksToRun, maxTasksToRunInParallel, -1, cancellationToken);
    }

    /// <summary>
    /// Starts the given tasks and waits for them to complete. This will run, at most, the specified number of tasks in parallel.
    /// <para>NOTE: If one of the given tasks has already been started, an exception will be thrown.</para>
    /// </summary>
    /// <param name="tasksToRun">The tasks to run.</param>
    /// <param name="maxTasksToRunInParallel">The maximum number of tasks to run in parallel.</param>
    /// <param name="timeoutInMilliseconds">The maximum milliseconds we should allow the max tasks to run in parallel before allowing another task to start. Specify -1 to wait indefinitely.</param>
    /// <param name="cancellationToken">The cancellation token.</param>
    public static void StartAndWaitAllThrottled(IEnumerable<Task> tasksToRun, int maxTasksToRunInParallel, int timeoutInMilliseconds, CancellationToken cancellationToken = new CancellationToken())
    {
        // Convert to a list of tasks so that we don't enumerate over it multiple times needlessly.
        var tasks = tasksToRun.ToList();

        using (var throttler = new SemaphoreSlim(maxTasksToRunInParallel))
        {
            var postTaskTasks = new List<Task>();

            // Have each task notify the throttler when it completes so that it decrements the number of tasks currently running.
            tasks.ForEach(t => postTaskTasks.Add(t.ContinueWith(tsk => throttler.Release())));

            // Start running each task.
            foreach (var task in tasks)
            {
                // Increment the number of tasks currently running and wait if too many are running.
                throttler.Wait(timeoutInMilliseconds, cancellationToken);

                cancellationToken.ThrowIfCancellationRequested();
                task.Start();
            }

            // Wait for all of the provided tasks to complete.
            // We wait on the list of "post" tasks instead of the original tasks, otherwise there is a potential race condition
            // where the throttler's using block is exited before some Tasks have had their "post" action completed, which
            // references the throttler, resulting in an exception due to accessing a disposed object.
            Task.WaitAll(postTaskTasks.ToArray(), cancellationToken);
        }
    }

Then, having created your list of tasks, you can call the function to start them running, say, a maximum of 3 at a time, like this:

    var listOfTasks = new List<Task>();
    for (int i = 0; i < 10; i++)
    {
        var count = i;
        // Note that we create the Task here, but do not start it.
        listOfTasks.Add(new Task(() => Something(count)));
    }
    Tasks.StartAndWaitAllThrottled(listOfTasks, 3);
-1
