Large Object Heap fragmentation, problems with arrays

I am writing a C# analysis application that has to deal with large amounts of memory. I am using ANTS Memory Profiler 7.4 to optimize memory management. While profiling, I noticed that all of the double[,] arrays I use (and I do need them) are placed on the Large Object Heap (LOH), even though the largest of these arrays is only about 24,000 bytes. As far as I know, objects smaller than 85,000 bytes should not end up there. Since I have several thousand instances of these double[,] arrays, this causes heavy memory fragmentation (about 25% of total memory usage is free memory that I cannot use). Some of the arrays stored on the LOH are only 1,036 bytes in size. The problem is that when I have to run larger analyses, I get an OutOfMemoryException because of the address space lost to LOH fragmentation.

Does anyone know why this happens, even though by definition these should not be large objects?

[Memory snapshot of my application in ANTS Memory Profiler]

Small double arrays are also affected by this (only 70 elements in the array).

1 answer

The threshold size for placing double arrays on the LOH is much lower than for other types. The reason for this is that objects on the LOH are always 64-bit aligned, and doubles benefit from 64-bit alignment, so the runtime routes even relatively small double arrays there.

Note that this only affects programs running as 32-bit processes. In 64-bit processes, objects are always aligned on a 64-bit boundary anyway, so this LOH heuristic is not used.

The threshold is 1,000 doubles (8,000 bytes of element data).
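
A minimal sketch of how you could verify this yourself, assuming a 32-bit process and relying on the fact that objects on the LOH report generation 2 from GC.GetGeneration immediately after allocation, while fresh small-object-heap allocations report generation 0:

```csharp
using System;

class LohThresholdDemo
{
    static void Main()
    {
        // The double-array heuristic only applies to 32-bit processes.
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);

        // Objects on the small object heap start in generation 0;
        // objects placed on the LOH report generation 2 right away.
        double[] below   = new double[999];  // 999 doubles: expected on the SOH
        double[] atLimit = new double[1000]; // 1,000 doubles: expected on the LOH (32-bit)

        Console.WriteLine("999 doubles  -> gen " + GC.GetGeneration(below));   // expected: 0
        Console.WriteLine("1000 doubles -> gen " + GC.GetGeneration(atLimit)); // expected: 2
    }
}
```

In a 64-bit process both arrays should report generation 0 right after allocation, since the heuristic is not applied there.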

Also see https://connect.microsoft.com/VisualStudio/feedback/details/266330/
