"System.OutOfMemoryException" was thrown when there was still a lot of free space

This is my code:

int size = 100000000;
double sizeInMegabytes = (size * 8.0) / 1024.0 / 1024.0; // 762 MB
double[] randomNumbers = new double[size];

Exception: An exception of type "System.OutOfMemoryException" was thrown.

I have 4 GB of memory on this machine, with 2.5 GB free when I run this, so there should be enough room on the PC to hold the 762 MB of 100,000,000 random numbers. I need to store as many random numbers as possible given the available memory. When I go to production there will be 12 GB on the box, and I want to make use of it.

Does the CLR constrain me to a default maximum amount of memory to start with? And how do I request more?

Update

I thought that breaking it into smaller chunks and incrementally growing toward my memory requirement would help if the issue were memory fragmentation, but it doesn't: the total size of the List can't get past about 256 MB, no matter what blockSize I use.

private static IRandomGenerator rnd = new MersenneTwister();
private static IDistribution dist = new DiscreteNormalDistribution(1048576);
private static List<double> ndRandomNumbers = new List<double>();

private static void AddNDRandomNumbers(int numberOfRandomNumbers)
{
    for (int i = 0; i < numberOfRandomNumbers; i++)
    {
        ndRandomNumbers.Add(dist.ICDF(rnd.nextUniform()));
    }
}

From my main method:

int blockSize = 1000000;

while (true)
{
    try
    {
        AddNDRandomNumbers(blockSize);
    }
    catch (System.OutOfMemoryException ex)
    {
        break;
    }
}

double arrayTotalSizeInMegabytes = (ndRandomNumbers.Count * 8.0) / 1024.0 / 1024.0;
13 answers

You may want to read this: "Out of Memory" Does Not Refer to Physical Memory, by Eric Lippert.

In short, and very simplified, "out of memory" does not really mean that the amount of available memory is too small. The most common reason is that, within the current address space, there is no contiguous portion of memory large enough to serve the requested allocation. If you have 100 blocks, each 4 MB in size, that will not help you when you need one 5 MB block.

Key points:

  • the data storage that we call "process memory" is, in my opinion, best visualized as a massive file on disk.
  • RAM can be seen as merely a performance optimization.
  • The total amount of virtual memory consumed by your program does not really matter much for its performance.
  • "lack of RAM" rarely leads to an "out of memory" error. Instead of an error, this leads to poor performance, because the full cost of the fact that the storage actually on disk suddenly becomes relevant.

You do not have a contiguous block of memory in which to allocate 762 MB - your address space is fragmented, and the allocator cannot find a hole large enough to satisfy the allocation.

  • You can try working with the /3GB boot switch (as others have suggested).
  • Or switch to a 64-bit OS.
  • Or change the algorithm so that it does not need one large chunk of memory: allocate a few (relatively) smaller blocks instead, as in the sketch below.
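A minimal sketch of that last bullet, assuming the goal is simply to hold the 100,000,000 doubles without one contiguous 762 MB allocation; the chunk size of 1,000,000 is an arbitrary choice, and the random numbers here come from plain System.Random rather than the generator used in the question:

// requires: using System; using System.Collections.Generic;
var chunks = new List<double[]>();
var rng = new Random();
int total = 100000000;   // same count as in the question
int chunkSize = 1000000; // ~7.6 MB per chunk, much easier to place than one 762 MB block

for (int allocated = 0; allocated < total; allocated += chunkSize)
{
    int thisChunk = Math.Min(chunkSize, total - allocated);
    var chunk = new double[thisChunk];
    for (int i = 0; i < thisChunk; i++)
    {
        chunk[i] = rng.NextDouble();
    }
    chunks.Add(chunk);
}

// Element i is then chunks[i / chunkSize][i % chunkSize].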

Make sure you are building a 64-bit process, not a 32-bit one, which is what Visual Studio compiles to by default. To do this, right-click your project, then Properties → Build → Platform target: x64. As with any 32-bit process, applications compiled as 32-bit have a 2 GB virtual memory limit.

64-bit processes do not have this limitation, since they use 64-bit pointers, so their theoretical maximum address space (the size of their virtual memory) is 16 exabytes (2^64). In practice, Windows x64 limits the virtual memory of a process to 8 TB. The solution to the memory-limit problem, then, is to compile for 64-bit.
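As a quick sanity check (a sketch assuming .NET 4.0 or later, where these Environment properties exist), the process can report at runtime whether it really is running as 64-bit:

// Is64BitProcess is true only when the executable actually runs as x64
// (Platform target x64, or AnyCPU without "Prefer 32-bit" on a 64-bit OS).
Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
Console.WriteLine("Pointer size:   " + IntPtr.Size); // 8 in a 64-bit process, 4 in a 32-bit one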

However, object size in .NET is still limited to 2 GB by default. You will be able to create several arrays whose combined size is greater than 2 GB, but by default you cannot create a single array bigger than 2 GB. If you still want to create arrays bigger than 2 GB, you can do so by adding the following to your app.config file:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
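With that setting in place, and only in a 64-bit process on .NET 4.5 or later, an allocation whose total size exceeds 2 GB should succeed; a rough sketch (the count of 300 million is just an illustration):

// ~2.4 GB of doubles: more than 2 GB in total size, but still well under the
// per-dimension element limit of roughly 2.1 billion entries that remains in force.
double[] big = new double[300000000];
Console.WriteLine(big.Length);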

As you have probably figured out, the problem is that you are trying to allocate one large contiguous block of memory, which does not work due to memory fragmentation. If I needed to do what you are doing, I would do the following:

int sizeA = 10000, sizeB = 10000;
double sizeInMegabytes = (sizeA * sizeB * 8.0) / 1024.0 / 1024.0; // 762 MB
double[][] randomNumbers = new double[sizeA][];
for (int i = 0; i < randomNumbers.Length; i++)
{
    randomNumbers[i] = new double[sizeB];
}

Then, to get a specific index, you would use randomNumbers[i / sizeB][i % sizeB].
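If carrying that indexing arithmetic around gets awkward, it can be hidden behind a small helper. The ChunkedDoubleArray class below is only an illustrative sketch of the idea, not part of the original answer:

// Hypothetical helper that presents a jagged double[][] as one flat, indexable sequence.
public class ChunkedDoubleArray
{
    private readonly double[][] chunks;
    private readonly int chunkSize;

    public ChunkedDoubleArray(long length, int chunkSize)
    {
        this.chunkSize = chunkSize;
        Length = length;

        int chunkCount = (int)((length + chunkSize - 1) / chunkSize);
        chunks = new double[chunkCount][];
        for (int i = 0; i < chunkCount; i++)
        {
            long remaining = length - (long)i * chunkSize;
            chunks[i] = new double[Math.Min(chunkSize, remaining)];
        }
    }

    public long Length { get; private set; }

    public double this[long index]
    {
        get { return chunks[index / chunkSize][index % chunkSize]; }
        set { chunks[index / chunkSize][index % chunkSize] = value; }
    }
}

Usage might then look like: var numbers = new ChunkedDoubleArray(100000000, 10000); numbers[12345678] = 0.5; - each underlying allocation stays small.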

Another option, if you always access the values in order, might be to use the overloaded constructor to specify the seed. You would get a semi-random seed (such as DateTime.Now.Ticks), store it in a variable, and then whenever you start going through the values you create a new Random instance using that original seed:

private static int randSeed = (int)DateTime.Now.Ticks; // must stay the same unless you want to get different random numbers
private static Random GetNewRandomIterator()
{
    return new Random(randSeed);
}
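To illustrate the point (reusing the GetNewRandomIterator helper above), two generators built from the same stored seed replay the exact same sequence, so a pass over the data can simply be restarted instead of the numbers being cached; a small sketch:

Random first = GetNewRandomIterator();
Random second = GetNewRandomIterator();

for (int i = 0; i < 5; i++)
{
    double a = first.NextDouble();
    double b = second.NextDouble();
    Console.WriteLine("{0} == {1} : {2}", a, b, a == b); // prints True every time
}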



It is important to note that while the blog linked in Fredrik Mörk's answer explains that the issue is a lack of address space, it does not cover a number of other problems, such as the 2 GB CLR object-size limit (mentioned in a comment from ShuggyCoUk on the same blog), it glosses over memory fragmentation, and it does not mention the impact of page file size (and how that can be addressed with CreateFileMapping).

The 2 GB limitation means that randomNumbers must take up less than 2 GB. Since arrays are classes and have some overhead of their own, this means an array of double needs to total less than 2^31 bytes. I am not sure exactly how much less than 2^31 it has to be, but "Overhead of a .NET array?" indicates 12 to 16 bytes.
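A rough back-of-the-envelope check of that limit (a sketch that treats 2 GB as 2^31 bytes and ignores the small per-array overhead mentioned above):

long objectLimitBytes = 2L * 1024 * 1024 * 1024;     // 2^31 bytes, the default per-object cap
long maxDoubles = objectLimitBytes / sizeof(double); // ≈ 268,435,456 elements
Console.WriteLine(maxDoubles);
// The question's 100,000,000 doubles (~762 MB) fit under this cap, so the failure
// there comes from address-space fragmentation rather than the object-size limit.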

Memory fragmentation is very similar to hard-disk fragmentation. You may have 2 GB of address space, but as objects are created and destroyed there will be gaps between the values. If these gaps are too small for your large object, and additional space cannot be requested, you will get a System.OutOfMemoryException.

For example, if you create 2 million 1024-byte objects, you are using 1.9 GB. If you then delete every object whose address is not a multiple of 3, you will be using 0.6 GB of memory, but it will be spread across the address space with 2048-byte open blocks in between. If you then need to create an object that is 0.2 GB, you will not be able to do it, because there is no block large enough to fit it and additional space cannot be obtained (assuming a 32-bit environment).

Possible solutions include using smaller objects, reducing the amount of data you keep in memory, or using a memory-management algorithm to limit or prevent memory fragmentation. It should be noted that unless you are developing a large program that uses a large amount of memory, this will not be a problem. Also, the issue can still occur on 64-bit systems, since Windows is then limited mostly by the page file size and the amount of RAM in the system.

Since most programs request working memory from the OS and do not request a file mapping, they will be limited by the system's RAM and page file size. As noted in the comment by Néstor Sánchez on that blog, with managed code like C# you are stuck with the RAM/page-file limitation and the address space of the operating system.




That was much longer than expected. Hopefully it helps someone. I posted it because I ran into a System.OutOfMemoryException running an x64 program on a system with 24 GB of RAM, even though my array only held about 2 GB of data.


I would advise against using the Windows /3GB boot option. Apart from anything else (it is overkill to do this for just one badly behaved application, and it probably will not solve your problem anyway), it can cause a lot of instability.

Many Windows drivers are not tested with this option, so many of them assume that user-mode pointers always point into the lower 2 GB of the address space. This means they may break badly with /3GB.

That said, Windows does normally limit a 32-bit process to a 2 GB address space. But that does not mean you should expect to be able to allocate 2 GB!

The address space is already littered with all sorts of allocated data. There is the stack, and all the loaded assemblies, static variables, and so on. There is no guarantee that there will be 800 MB of contiguous unallocated memory anywhere.

Allocating 2 chunks of 400 MB would probably fare better. Or 4 chunks of 200 MB. Smaller allocations are much easier to find room for in a fragmented memory space.

In any case, if you are going to deploy this to a 12 GB machine anyway, you will want to run it as a 64-bit application, which should solve all of these problems.


Changing from 32-bit to 64-bit worked for me - worth a try if you are on a 64-bit PC and the application does not need to be ported elsewhere.


If you need such large structures, perhaps you could use memory-mapped files. This article may be useful: http://www.codeproject.com/KB/recipes/MemoryMappedGenericArray.aspx
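As a minimal sketch of the idea using System.IO.MemoryMappedFiles (available since .NET 4.0; the file name, map name, and element count below are made up for illustration):

using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MemoryMappedRandoms
{
    static void Main()
    {
        const long count = 100000000;                 // 100 million doubles
        const long capacity = count * sizeof(double); // ~762 MB backed by a file, not by one heap block

        using (var mmf = MemoryMappedFile.CreateFromFile(
                   "randoms.bin", FileMode.Create, "randoms", capacity))
        using (var accessor = mmf.CreateViewAccessor())
        {
            var rng = new Random();
            for (long i = 0; i < count; i++)
            {
                accessor.Write(i * sizeof(double), rng.NextDouble());
            }

            // Random access still works; read one value back as a check.
            Console.WriteLine(accessor.ReadDouble(12345 * sizeof(double)));
        }
    }
}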

Best regards, Dejan


32-bit Windows has a 2 GB per-process memory limit. The /3GB boot option that others have mentioned raises this to 3 GB, leaving only the remaining 1 GB for the OS kernel. Realistically, if you want to use more than 2 GB without the hassle, you need a 64-bit OS. This also overcomes the problem whereby, although you may have 4 GB of physical memory, the address space required by the video card can make a sizeable chunk of that memory unusable - usually around 500 MB.


Rather than allocating a massive array, could you try using an iterator? These are lazily evaluated, meaning values are generated only as they are requested in a foreach statement; you shouldn't run out of memory this way:

private static IEnumerable<double> MakeRandomNumbers(int numberOfRandomNumbers)
{
    for (int i = 0; i < numberOfRandomNumbers; i++)
    {
        yield return randomGenerator.GetAnotherRandomNumber();
    }
}

...

// Hooray, we won't run out of memory!
foreach (var number in MakeRandomNumbers(int.MaxValue))
{
    Console.WriteLine(number);
}

The above will generate as many random numbers as you wish, but only as they are requested via the foreach statement. That way you will not run out of memory.

Alternatively, if you must have them all in one place, save them in a file, not in memory.
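A sketch of that suggestion (assuming a plain binary file is acceptable; the file name is a placeholder), streaming each value straight to disk so nothing large ever accumulates in memory:

// requires: using System; using System.IO;
var rng = new Random();
using (var writer = new BinaryWriter(File.Open("randoms.bin", FileMode.Create)))
{
    for (int i = 0; i < 100000000; i++)   // 100 million doubles, ~762 MB on disk
    {
        writer.Write(rng.NextDouble());    // 8 bytes each, written incrementally
    }
}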


Well, I had a similar problem with large data sets, and trying to force the application to hold that much data is not really the right option. The best advice I can give you is to process your data in small chunks if at all possible, because with that much data the problem will come back sooner or later. Besides, you cannot know the configuration of every machine that will run your application, so there is always a risk that the exception will occur on some other PC.


I had a similar problem; in my case it was caused by a StringBuilder.ToString();
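If that is the situation, one workaround (a sketch, assuming the final text is only needed outside of memory, such as in a file) is to write the content incrementally to a TextWriter instead of materializing one huge string with ToString():

// requires: using System.IO;
// Instead of: string giant = builder.ToString(); // needs one huge contiguous string
string[] pieces = { "part1", "part2", "part3" };  // stands in for whatever was being appended
using (var writer = new StreamWriter("output.txt"))
{
    foreach (string piece in pieces)
    {
        writer.Write(piece); // written out as it goes; no single giant allocation
    }
}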


Convert the solution to x64. If you still face the problem, set the maximum length for whatever is throwing the exception, similar to the example below:

var jsSerializer = new JavaScriptSerializer();
jsSerializer.MaxJsonLength = Int32.MaxValue;

Increase the Windows process limit to 3 GB (via boot.ini or the Vista boot manager).
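For what it is worth (and bearing in mind the warnings about /3GB in an earlier answer): on Windows XP/Server 2003 this means appending the /3GB switch to the operating-system line in boot.ini, while on Vista and later the boot manager is configured with BCDEdit from an elevated prompt, along the lines of:

bcdedit /set IncreaseUserVa 3072

The executable typically also needs to be marked large-address-aware (for example with editbin /LARGEADDRESSAWARE) before it can benefit from the extra user address space.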



