TaskFactory.StartNew → System.OutOfMemoryException

My application runs about 1,000 tasks, but sometimes the task scheduler throws the following out-of-memory exception. What could be the reason, and how can I avoid it?

    System.Threading.Tasks.TaskSchedulerException: An exception was thrown by a TaskScheduler. ---> System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
       at System.Threading.Thread.StartInternal(IPrincipal principal, StackCrawlMark& stackMark)
       at System.Threading.Thread.Start(StackCrawlMark& stackMark)
       at System.Threading.Thread.Start(Object parameter)
       at System.Threading.Tasks.ThreadPoolTaskScheduler.QueueTask(Task task)
       at System.Threading.Tasks.Task.ScheduleAndStart(Boolean needsProtection)
       --- End of inner exception stack trace ---
       at System.Threading.Tasks.Task.ScheduleAndStart(Boolean needsProtection)
       at System.Threading.Tasks.Task.InternalStartNew(Task creatingTask, Object action, Object state, CancellationToken cancellationToken, TaskScheduler scheduler, TaskCreationOptions options, InternalTaskOptions internalOptions, StackCrawlMark& stackMark)
       at System.Threading.Tasks.TaskFactory.StartNew(Action action, CancellationToken cancellationToken, TaskCreationOptions creationOptions, TaskScheduler scheduler)
       at App.StartReadSocketTask()
4 answers

In your (non-x64) application, the address space is limited to 2 GB. Each thread reserves a minimum of 1 MB for its stack, so you can typically expect an OutOfMemoryException before you reach 2,000 threads.

The Task class normally avoids this by running work on the ThreadPool. But when your tasks run for too long (> 500 ms), the pool slowly injects additional threads, and after a few minutes (or longer) thread creation fails.

The simplest solution is to look at the code where this unbounded creation of tasks happens and see whether you can bound it in a way consistent with your design. For example, if you use a producer/consumer queue, make it bounded.
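A bounded producer/consumer queue can be sketched with BlockingCollection; the capacity of 100, the item type, and the class name below are arbitrary choices for illustration, not anything from the question:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class BoundedQueueDemo
{
    public static void Main()
    {
        // Bounded to 100 items: producers block in Add() once the queue is
        // full, so the number of outstanding work items can never explode.
        using (var queue = new BlockingCollection<int>(boundedCapacity: 100))
        {
            var consumer = Task.Run(() =>
            {
                foreach (var item in queue.GetConsumingEnumerable())
                {
                    // ... process item ...
                }
            });

            for (int i = 0; i < 1000; i++)
                queue.Add(i);        // blocks instead of growing without bound

            queue.CompleteAdding();  // signal the consumer that no more items come
            consumer.Wait();
        }
        Console.WriteLine("done");
    }
}
```

Because Add blocks once 100 items are queued, memory stays bounded no matter how fast the producer runs.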

Otherwise, cap the pool with ThreadPool.SetMaxThreads, but that is a blunt, application-wide tool.


I believe you have run into an interesting behavior of the ThreadPool , where it decides to add more worker threads because your current tasks appear to be starving (blocked in waits). Ultimately, this causes your application to run out of memory.

I suggest adding the TaskCreationOptions.LongRunning flag when creating the tasks. This lets the scheduler know that the task may warrant oversubscription (in practice it gets its own dedicated thread rather than occupying a pool thread).
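A minimal sketch of what that looks like with the default scheduler; the IsThreadPoolThread check is only there to make the effect visible (a LongRunning task runs on a dedicated thread, so it prints False):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class LongRunningDemo
{
    public static void Main()
    {
        // LongRunning hints that the delegate will not finish quickly, so
        // the default scheduler starts it on its own dedicated thread
        // instead of consuming (or forcing growth of) the ThreadPool.
        var task = Task.Factory.StartNew(
            () => Console.WriteLine("on pool thread: " + Thread.CurrentThread.IsThreadPoolThread),
            CancellationToken.None,
            TaskCreationOptions.LongRunning,
            TaskScheduler.Default);
        task.Wait();   // prints "on pool thread: False"
    }
}
```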

From the book Parallel Programming with Microsoft .NET:

As a last resort, you can use the SetMaxThreads method of the ThreadPool class to establish an upper limit on the number of worker threads, usually equal to the number of cores (the Environment.ProcessorCount property)...
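As a sketch of that last-resort call (capping at ProcessorCount follows the book's suggestion; note that SetMaxThreads affects the whole process and returns false if the request is below the pool's current minimum):

```csharp
using System;
using System.Threading;

class MaxThreadsDemo
{
    public static void Main()
    {
        ThreadPool.GetMaxThreads(out int worker, out int io);
        Console.WriteLine($"defaults: {worker} worker, {io} completion-port");

        // Cap worker threads at the core count, leaving I/O threads as-is.
        bool applied = ThreadPool.SetMaxThreads(Environment.ProcessorCount, io);
        Console.WriteLine("applied: " + applied);
    }
}
```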

The same book also points to "How to: Create a Task Scheduler That Limits the Degree of Concurrency".
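That scheduler from the book runs a few dozen lines; a lighter-weight sketch of the same idea uses a SemaphoreSlim to cap how many tasks run at once (the limit of ProcessorCount and all names here are my own illustrative choices):

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ThrottleDemo
{
    // At most ProcessorCount tasks may run the guarded section at once;
    // the rest wait asynchronously instead of each demanding a thread.
    static readonly SemaphoreSlim Gate = new SemaphoreSlim(Environment.ProcessorCount);

    static async Task RunThrottledAsync(Action work)
    {
        await Gate.WaitAsync();
        try { work(); }
        finally { Gate.Release(); }
    }

    public static void Main()
    {
        object sync = new object();
        int current = 0, peak = 0;

        var tasks = Enumerable.Range(0, 500)
            .Select(_ => RunThrottledAsync(() =>
            {
                lock (sync) { current++; if (current > peak) peak = current; }
                Thread.Sleep(1);
                lock (sync) { current--; }
            }))
            .ToArray();

        Task.WaitAll(tasks);
        Console.WriteLine(peak <= Environment.ProcessorCount); // prints True
    }
}
```

Queued work waits in the semaphore rather than on a dedicated thread, so 500 pending tasks never translate into 500 simultaneous threads.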


I ran into this problem myself while experimenting with the limits of a parallel system. oleksii's comment is spot on (1k threads ≈ 1 GB of memory). It is important to note that this is reserved virtual address space, not memory actually in use. The OutOfMemoryException occurs when the system cannot reserve a contiguous chunk of virtual address space large enough to satisfy the request (insert the usual rhetoric about memory fragmentation here). If you watch the process in the Windows Task Manager around the time it dies, you will see only 80-120 MB of "used" memory. To see how much virtual address space is actually reserved, enable the "Memory - Commit Size" column in Task Manager.

To keep this short: I was able to break through the ~1k thread limit by switching the build configuration from x86 to 64-bit. This increases the available virtual address space from (approximately) 2 GB to 6 TB+ (depending on the OS version), and my OutOfMemoryException went away.
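For reference, the switch is a one-line project setting; this is a sketch of the relevant .csproj fragment (in Visual Studio it is Project Properties > Build > Platform target, plus clearing "Prefer 32-bit" for AnyCPU builds):

```xml
<!-- .csproj: build the assembly as 64-bit so the process gets the
     large x64 virtual address space instead of the 2 GB x86 limit -->
<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```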

Here is a simple program I wrote that illustrates the artifact. Run it as x86 and watch it die somewhere between 1,000 and 1,500 threads; then switch it to 64-bit and it should run to completion without failures.

    using System;
    using System.Collections.Generic;
    using System.Threading;
    using System.Threading.Tasks;

    namespace TaskToy
    {
        class Program
        {
            static void Main( string[] args )
            {
                List<Task> lTasks = new List<Task>();
                int lWidth = 0;

                for ( int i = 0; i < 5000; i++ )
                {
                    lTasks.Add( new Task( ( o ) =>
                    {
                        Console.WriteLine( "B " + Interlocked.Increment( ref lWidth ) + " tid " + Thread.CurrentThread.ManagedThreadId );
                        Thread.Sleep( 60000 );
                        Console.WriteLine( "E " + Interlocked.Decrement( ref lWidth ) + " tid " + Thread.CurrentThread.ManagedThreadId );
                    }, null, TaskCreationOptions.LongRunning ) );
                }

                Parallel.For( 0, lTasks.Count, ( i ) => { lTasks[i].Start(); } );
                Task.WaitAll( lTasks.ToArray() );

                Console.WriteLine( "DONE - press any key..." );
                Console.ReadKey( true );
            }
        }
    }

P.S. The variable 'lWidth' tracks the current level of concurrency, i.e. how many tasks are actually executing at the same time.

Overall this was a fun academic experiment, but it will probably be "a few" years before running thousands of threads is actually profitable. It is advisable to limit the number of threads you spawn to something more practical, probably an order of magnitude less than "thousands".


You are probably running too many tasks at the same time.

Each task potentially represents a separate thread, and the CLR reserves an independent stack for each thread; I believe a typical stack is 1024 KB on x64 Windows. Just by spawning 1,000 threads you commit 1 GB of memory exclusively to thread stacks. That does not include heap memory or the large object heap, and on top of that you have other processes consuming memory as well.
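If the per-thread stack reservation itself is the bottleneck, one mitigation is the Thread constructor overload that takes maxStackSize; the 256 KB figure below is an arbitrary illustrative value (too small a stack risks a StackOverflowException in deep call chains):

```csharp
using System;
using System.Threading;

class SmallStackDemo
{
    public static void Main()
    {
        // Reserve only 256 KB of address space for this thread's stack
        // instead of the ~1 MB default, so the same address space can
        // hold roughly four times as many thread stacks.
        var worker = new Thread(() => Console.WriteLine("worker done"),
                                maxStackSize: 256 * 1024);
        worker.Start();
        worker.Join();
    }
}
```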

