std::bad_alloc in C++ with 200 GB of RAM

I am new to C++ and I am studying compressed sensing, so I need to work with huge matrices. MATLAB turned out to be too slow, so I implemented my algorithm in C++.

The issue is that I store large arrays (about 100 MB to 1 GB each), roughly 20 of them in total. Everything works fine while the process uses around 30 GB of memory, but as soon as it needs more than 40 GB it simply crashes. I think this is a memory problem. I tested it on both Linux and Windows (64-bit MinGW-w64, 200 GB of RAM, Intel Xeon). Are there any restrictions?

size_t tm = n * m * l;
double *x = new double[tm];

I use about 20 arrays like this one. Typical sizes are n, m ≈ 1000 and l ≈ 30.
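As a side note, a minimal sketch (not from the original code; the helper name is illustrative) that logs the requested size before each allocation can help pin down exactly which new[] call throws std::bad_alloc:

#include <cstddef>
#include <iostream>
#include <new>

// Hypothetical helper: allocate a double array of n*m*l elements and
// report the requested size, so the failing allocation can be identified.
double* alloc_block(std::size_t n, std::size_t m, std::size_t l)
{
    std::size_t count = n * m * l;                // number of doubles
    std::size_t bytes = count * sizeof(double);   // requested bytes
    std::cerr << "allocating " << count << " doubles (" << bytes << " bytes)\n";
    try {
        return new double[count];
    } catch (const std::bad_alloc&) {
        std::cerr << "new[] threw bad_alloc at " << bytes << " bytes\n";
        throw;
    }
}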

Thanks.

c++ mingw-w64 dynamic-arrays bad-alloc
1 answer

Twenty arrays and a failure once total memory use passes 40 GB: that suggests the program fails when a single array exceeds 2 GB. That should not happen; in a 64-bit address space, size_t is 64 bits and object sizes can use all of them. It looks as if MinGW is mishandling the size and limiting it to 31 bits (i.e., losing a bit to the sign).
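A quick way to test this hypothesis (a sketch, not part of the original answer) is to request a single allocation just above 2^31 bytes on the affected toolchain and see whether it throws:

#include <cstddef>
#include <iostream>
#include <new>

int main()
{
    // Just over 2^31 bytes: this should succeed on a correct 64-bit
    // toolchain with enough RAM, but fails if the allocator truncates
    // the requested size to 31 bits.
    const std::size_t bytes = (std::size_t(1) << 31) + 1024;
    try {
        char *p = new char[bytes];
        std::cout << "allocation of " << bytes << " bytes succeeded\n";
        delete[] p;
    } catch (const std::bad_alloc&) {
        std::cout << "allocation of " << bytes << " bytes threw bad_alloc\n";
    }
    return 0;
}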

I do not know exactly how you allocate the memory, but you can probably work around this by bypassing the broken allocation routine and going straight to the OS allocator. On Windows, for example, you can call VirtualAlloc (skip HeapAlloc; it is not designed for allocations this large).
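A minimal sketch of that approach, assuming a 64-bit Windows build (the wrapper names are illustrative, not from the answer). VirtualAlloc takes the size as a SIZE_T, so the runtime allocator's size handling is never involved:

#include <windows.h>
#include <cstddef>

// Reserve and commit a zero-initialized block directly from the OS,
// bypassing the C++ runtime's operator new.
double* os_alloc_doubles(std::size_t count)
{
    void *p = VirtualAlloc(nullptr, count * sizeof(double),
                           MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    return static_cast<double*>(p);   // nullptr on failure
}

void os_free(void *p)
{
    // The size argument must be 0 when releasing with MEM_RELEASE.
    VirtualFree(p, 0, MEM_RELEASE);
}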

