C# .NET Garbage collection not working?

I am working on a relatively large solution in Visual Studio 2010. It contains several projects, one of which is an XNA game project and another an ASP.NET MVC 2 project.

In both projects I face the same problem: after starting in debug mode, memory usage keeps growing. They start at roughly 40 MB and 100 MB of memory respectively, but both climb to about 1.5 GB fairly quickly (in roughly 10 and 30 minutes respectively). After that, memory is sometimes released and usage drops back closer to the original level, and in other cases the application just throws OutOfMemoryExceptions.

Of course, this would suggest a serious memory leak, so that is what I first tried to find. After searching for leaks unsuccessfully, I tried calling GC.Collect() regularly (about once every 10 seconds). After introducing this hack, memory usage stayed at 45 MB and 120 MB respectively for 24 hours (until I stopped the test).

.NET garbage collection is supposed to be "very good", but I cannot help suspecting that it simply is not doing its job. I used the CLR Profiler in an attempt to pin down the problem, and it showed that the XNA project seemed to be keeping a lot of byte arrays that I had indeed been using, but whose references should already have been removed, so they should have been garbage collected.

Again, when I call GC.Collect() regularly, the memory usage problems seem to be gone. Does anyone know what could be causing this excessive memory usage? Could it be related to running in debug mode?

+7
6 answers

... after searching for leaks unsuccessfully ...

Try harder =)

Memory leaks in managed languages can be difficult to track down. I have had a good experience with the Redgate ANTS Memory Profiler. It is not free, but they give you a 14-day, fully functional trial. It has a nice UI and shows you where memory is allocated and why those objects are being kept in memory.

As Alex says, event handlers are a very common source of memory leaks in a .NET application. Consider this:

    public static class SomeStaticClass
    {
        public static event EventHandler SomeEvent;
    }

    private class Foo
    {
        public Foo()
        {
            SomeStaticClass.SomeEvent += MyHandler;
        }

        private void MyHandler(object sender, EventArgs e) { /* whatever */ }
    }

I used a static class to make the problem as obvious as possible here. Let's say that over the lifetime of your application many Foo objects are created, each subscribing to the SomeEvent event of the static class.

Foo objects may go out of scope at any time, but the static class still holds a reference to each of them via the event handler delegate. So they live on indefinitely. In this case, the event handler simply needs to be "detached" (unsubscribed) when you are done with the object.
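For illustration, a minimal sketch of that detach step, building on the Foo class above (making Foo disposable is just one convenient place to unsubscribe, not necessarily how your code has to do it):

    private class Foo : IDisposable
    {
        public Foo()
        {
            SomeStaticClass.SomeEvent += MyHandler;
        }

        public void Dispose()
        {
            // Detach the handler so the static class no longer references this instance,
            // allowing the GC to collect it.
            SomeStaticClass.SomeEvent -= MyHandler;
        }

        private void MyHandler(object sender, EventArgs e) { /* whatever */ }
    }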

... the XNA project seemed to be keeping a lot of byte arrays that I had indeed been using ...

You may be running into Large Object Heap (LOH) fragmentation. If you allocate large objects very frequently, they can cause this kind of problem. The total size of those objects may be much smaller than the total memory allocated to the runtime, but because of fragmentation there is a lot of unused memory allocated to your application.

The profiler linked above will tell you whether this is a problem. If it is, you can probably track it down to an object leaking somewhere. I just fixed a problem in my application that showed the same behavior, and it was caused by a MemoryStream not releasing its internal byte[] even after Dispose() was called on it. Wrapping the stream in a dummy stream and nulling out the reference fixed the problem.
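As a hedged sketch of that wrapping idea (illustrative only, not the exact class I used; the name and members here are assumptions): a Stream wrapper can drop its reference to the inner MemoryStream on Dispose, so the large internal buffer becomes collectible even while the wrapper itself is still referenced somewhere.

    using System;
    using System.IO;

    // Illustrative wrapper: on Dispose it releases the inner MemoryStream reference
    // so the large internal byte[] can be garbage collected.
    public sealed class ReleasingStreamWrapper : Stream
    {
        private MemoryStream _inner;

        public ReleasingStreamWrapper(MemoryStream inner) { _inner = inner; }

        public override bool CanRead  { get { return _inner != null && _inner.CanRead; } }
        public override bool CanSeek  { get { return _inner != null && _inner.CanSeek; } }
        public override bool CanWrite { get { return _inner != null && _inner.CanWrite; } }
        public override long Length   { get { return _inner.Length; } }
        public override long Position
        {
            get { return _inner.Position; }
            set { _inner.Position = value; }
        }

        public override void Flush() { _inner.Flush(); }
        public override int Read(byte[] buffer, int offset, int count) { return _inner.Read(buffer, offset, count); }
        public override long Seek(long offset, SeekOrigin origin) { return _inner.Seek(offset, origin); }
        public override void SetLength(long value) { _inner.SetLength(value); }
        public override void Write(byte[] buffer, int offset, int count) { _inner.Write(buffer, offset, count); }

        protected override void Dispose(bool disposing)
        {
            if (disposing && _inner != null)
            {
                _inner.Dispose();
                _inner = null; // let the GC reclaim the underlying buffer
            }
            base.Dispose(disposing);
        }
    }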

Also, to state the obvious, make sure you Dispose() of objects that implement IDisposable. Native resources may be lurking there. Again, a good profiler will catch this.

My suggestion: it is not the GC; the problem is in your application. Use a profiler, get the application into a high memory consumption state, take a memory snapshot, and start analyzing.

+11

First of all, the GC works, and works well. You have not just discovered a bug in it.

Now, with that out of the way, some thoughts:

  • Are you using too many threads?
  • Remember that the GC is not deterministic; it runs whenever it thinks it needs to (even if you call GC.Collect()).
  • Are you sure all your references go out of scope?
  • What are you loading into memory in the first place? Large images? Large text files?

Your profiler should tell you what is using so much memory. Start whittling down the biggest offenders as far as you can.

Also, calling GC.Collect() every X seconds is a bad idea and is unlikely to solve your real problem.

+5

Analyzing memory problems in .NET is not a trivial task; you should definitely read some good articles and try various tools. I ended up with the following article as a result of my research: http://www.alexatnet.com/content/net-memory-management-and-garbage-collector You might also read some of Jeffrey Richter's articles, for example: http://msdn.microsoft.com/en-us/magazine/bb985010.aspx

In my experience, these are the two most common causes of memory problems:

  • Event handlers: they can keep an object alive even when no other objects reference it. So ideally you need to unsubscribe event handlers when the object is no longer needed, so that it can be collected.
  • The finalizer thread is blocked by another thread in an STA apartment. For example, when an STA thread is doing a lot of work, the other threads are blocked and objects in the finalization queue cannot be freed.
+4

Edit: added a link about Large Object Heap fragmentation.

Edit: since it looks like the problem is with allocating and discarding textures, could you use Texture2D.SetData to reuse the large byte[]s?
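A rough sketch of what I mean, assuming a Color-format texture (the field and helper names here are made up for illustration): keep one texture and one byte[] alive and push new pixel data into them with SetData, rather than allocating fresh ones each time.

    using Microsoft.Xna.Framework.Graphics;

    // Sketch: reuse one buffer and one texture instead of allocating per update.
    // Assumes a Color-format texture (4 bytes per pixel).
    private Texture2D _texture;   // created once, then reused
    private byte[] _pixelBuffer;  // created once, then reused

    private void UpdateTexture(GraphicsDevice device, int width, int height)
    {
        if (_texture == null)
        {
            _texture = new Texture2D(device, width, height, false, SurfaceFormat.Color);
            _pixelBuffer = new byte[width * height * 4];
        }

        FillPixels(_pixelBuffer);       // hypothetical helper: writes new pixel data into the existing array
        _texture.SetData(_pixelBuffer); // no new byte[] or Texture2D allocations
    }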

First, you need to determine whether it is managed or unmanaged memory that is leaking.

  • Use perfmon to watch what happens to your process's .NET CLR Memory\# Bytes in all Heaps and Process\Private Bytes counters. Compare the numbers as memory grows. If the increase in private bytes exceeds the growth of the managed heaps, then the growth is in unmanaged memory (a small programmatic sketch of this check follows this list).

  • Unmanaged memory growth points to objects that are not being disposed (but are eventually collected when their finalizers execute).

  • If it is managed memory growth, then we need to see which generation/LOH is growing (there are performance counters for each heap generation's bytes as well).

  • If it is the Large Object Heap, you will want to reconsider how the large byte arrays are used and discarded. Perhaps the byte arrays can be reused instead of being thrown away. Also, consider allocating large byte arrays in power-of-2 sizes; that way, when one is freed it leaves a "hole" in the large object heap that can later be filled by another object of the same size.

  • The last case is a memory problem I have no advice on, since I have never come across it myself.
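If you prefer to check the first point from code rather than the perfmon UI, here is a minimal sketch using the same two counters (the "MyApp" instance name is a placeholder for your process's actual instance name):

    using System;
    using System.Diagnostics;

    // Sketch: sample the two counters from code instead of the perfmon UI.
    class CounterCheck
    {
        static void Main()
        {
            var managed = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", "MyApp");
            var privateBytes = new PerformanceCounter("Process", "Private Bytes", "MyApp");

            float managedMb = managed.NextValue() / (1024 * 1024);
            float privateMb = privateBytes.NextValue() / (1024 * 1024);

            // If private bytes grow much faster than the managed heaps over time,
            // the growth is in unmanaged memory.
            Console.WriteLine("Managed heaps: {0:F1} MB, Private bytes: {1:F1} MB", managedMb, privateMb);
        }
    }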

+3

I would also add that if you do any file access, make sure you close and/or dispose of any readers or writers. There should be a 1-to-1 correspondence between opening a file and closing it.
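For example, a trivial sketch (the file name is a placeholder): a using block guarantees the reader is closed exactly once, even if an exception is thrown.

    using System.IO;

    // Sketch: every opened file is closed deterministically by the using block.
    using (var reader = new StreamReader("data.txt")) // "data.txt" is a placeholder
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // process the line
        }
    }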

In addition, I usually put resources in a using block, for example a SqlConnection:

    using (var connection = new SqlConnection())
    {
        // Do SQL connection work in here.
    }

Do you implement IDisposable on any of your own objects, and perhaps do something there incorrectly that is causing problems? I would double-check all of your IDisposable code.

+1

The GC does not take the unmanaged heap into account. If you create many objects that are just C# wrappers around large chunks of unmanaged memory, your memory gets consumed, but the GC cannot make rational decisions about it, since it only sees the managed heap.

You end up in a situation where the GC does not think you are short of memory, because most of the objects on your gen 1 heap are 8-byte references, when in reality they are like icebergs at sea: most of the memory is below the surface!

You can use these GC calls:

    System::GC::AddMemoryPressure(sizeOfField);
    System::GC::RemoveMemoryPressure(sizeOfField);

These methods let the garbage collector see the unmanaged memory (if you give it the right numbers).
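(The :: syntax above is C++/CLI; in C# these are GC.AddMemoryPressure and GC.RemoveMemoryPressure.) As a hedged illustration of the usual pattern, a small managed wrapper around a large native allocation (the class name and allocation here are placeholders) can report its true size to the GC:

    using System;
    using System.Runtime.InteropServices;

    // Sketch: a small managed wrapper around a large native allocation.
    // AddMemoryPressure/RemoveMemoryPressure keep the GC's accounting honest.
    public sealed class NativeBuffer : IDisposable
    {
        private IntPtr _buffer;
        private readonly long _size;

        public NativeBuffer(long size)
        {
            _size = size;
            _buffer = Marshal.AllocHGlobal((IntPtr)size);
            GC.AddMemoryPressure(size);   // tell the GC this object is really this big
        }

        public void Dispose()
        {
            if (_buffer != IntPtr.Zero)
            {
                Marshal.FreeHGlobal(_buffer);
                _buffer = IntPtr.Zero;
                GC.RemoveMemoryPressure(_size); // and that the memory is gone again
            }
            GC.SuppressFinalize(this);
        }

        ~NativeBuffer() { Dispose(); }
    }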

0
