Redefining the global operator new to track huge memory allocations?

I am trying to create a custom build of a large monolithic application. The problem I'm trying to solve is tracking down hard-to-reproduce huge memory allocations (30-80 gigabytes, judging by what the OS reports). I believe the cause is a std::vector size being set from a negative 32-bit integer value. The only platform exhibiting this behavior is Solaris (perhaps it is the only platform that successfully allocates such chunks of contiguous memory). Can I globally replace std::vector with my own class that delegates all calls to a real vector while watching for suspicious allocations (size > 0x7FFFFFFFu)? Maybe selectively replace just the constructor that takes a size_t, and resize()? Maybe even intercept the global operator new?
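
To make the idea concrete, a minimal sketch of the kind of delegating wrapper I have in mind (checked_vector is a hypothetical name, not an existing class; a real wrapper would need to forward the whole std::vector interface):

    #include <cstddef>
    #include <vector>

    // Hypothetical illustration only; not a drop-in std::vector replacement.
    template <typename T>
    class checked_vector {
        std::vector<T> v_;
        static void check(std::size_t n) {
            if (n > 0x7FFFFFFFu) {
                // Suspicious size: set a breakpoint or log here.
            }
        }
    public:
        checked_vector() {}
        explicit checked_vector(std::size_t n) { check(n); v_.resize(n); }
        void resize(std::size_t n) { check(n); v_.resize(n); }
        std::size_t size() const { return v_.size(); }
        // ... delegate the rest of the interface to v_ ...
    };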

+6
c++
4 answers

Why not do something like this?

    void *operator new(size_t size) {
        // if (size > MAX_SIZE) ...
        return malloc(size);
    }

    void *operator new[](size_t size) {
        // if (size > MAX_SIZE) ...
        return malloc(size);
    }

Setting a breakpoint on the if will catch the problem immediately.
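
For reference, a slightly more complete sketch along these lines (MAX_SIZE and the log message are assumptions taken from the question's 0x7FFFFFFFu threshold; a real replacement should also handle allocation failure and pair the operators with matching deletes):

    #include <cstdio>
    #include <cstdlib>
    #include <new>

    static const std::size_t MAX_SIZE = 0x7FFFFFFFu; // threshold from the question

    void* operator new(std::size_t size) {
        if (size > MAX_SIZE) {
            // Set a breakpoint here, or log the suspicious request.
            std::fprintf(stderr, "huge allocation: %zu bytes\n", size);
        }
        void* p = std::malloc(size);
        if (!p) throw std::bad_alloc();
        return p;
    }

    void* operator new[](std::size_t size) {
        return operator new(size); // delegate to the checked version above
    }

    void operator delete(void* p) noexcept { std::free(p); }
    void operator delete[](void* p) noexcept { std::free(p); }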

+5

You can provide a custom allocator for your vector when it is created.

It can simply delegate to std::allocator, acting as a firewall that checks the requested memory size first.
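
A minimal sketch of such a delegating allocator (checking_allocator is a hypothetical name, and the 0x7FFFFFFFu byte threshold is taken from the question as an assumption):

    #include <cstddef>
    #include <memory>
    #include <vector>

    template <typename T>
    struct checking_allocator {
        typedef T value_type;

        checking_allocator() {}
        template <typename U>
        checking_allocator(const checking_allocator<U>&) {}

        T* allocate(std::size_t n) {
            // Compare element count against the byte threshold without overflow.
            if (n > 0x7FFFFFFFu / sizeof(T)) {
                // Suspicious request: set a breakpoint or log here.
            }
            return std::allocator<T>().allocate(n); // delegate the real work
        }
        void deallocate(T* p, std::size_t n) {
            std::allocator<T>().deallocate(p, n);
        }
    };

    template <typename T, typename U>
    bool operator==(const checking_allocator<T>&, const checking_allocator<U>&) { return true; }
    template <typename T, typename U>
    bool operator!=(const checking_allocator<T>&, const checking_allocator<U>&) { return false; }

    // Usage: std::vector<int, checking_allocator<int> > v;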

+2

Take a look at the implementation of the std::vector class on the problematic platform. Each implementation manages memory differently (for example, some double the currently allocated space when you add an element beyond the vector's current capacity). If your objects are large enough, or you append a large number of them, you may be asking for more contiguous memory than the machine has available. If so, you need to look into a custom allocator for this vector.
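
To see the growth policy on your platform (growth factors vary by implementation, e.g. 1.5x or 2x), a minimal sketch that prints each capacity jump:

    #include <cstddef>
    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v;
        std::size_t last_cap = 0;
        for (int i = 0; i < 1000000; ++i) {
            v.push_back(i);
            if (v.capacity() != last_cap) { // capacity changed: a reallocation happened
                last_cap = v.capacity();
                std::cout << "size " << v.size() << " -> capacity " << last_cap << '\n';
            }
        }
        return 0;
    }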

If you store many large elements in a vector, you may want to look at another collection (for example, std::list), or store pointers instead of the objects themselves.

0

You can specify your own allocator type for std::vector to track its allocations. But I doubt the diagnosis. First, judging by the sizes (30-80 GB), I conclude this is 64-bit code. How would a negative 32-bit integer value become the size of a vector whose size type is 64 bits? Wouldn't it be sign-extended to 64 bits, preserving the value, and so produce a far larger number than 30-80 GB? Second, the fact that this problem occurs only on Solaris may point to a different cause. As far as I remember, Solaris is the only OS that commits memory at allocation time; other operating systems merely reserve the address space and commit pages only when they are first touched. Therefore, I would look for allocations that are made but never used.
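
To illustrate the first point, a tiny sketch of what that conversion actually yields (assuming a 64-bit size_t; the value -1000000 is arbitrary):

    #include <cstddef>
    #include <cstdint>
    #include <cstdio>

    int main() {
        std::int32_t n = -1000000;  // a negative 32-bit count
        std::size_t as_size = n;    // sign-extended, then reinterpreted as unsigned
        // Prints 18446744073708551616: roughly 16 exabytes, far beyond 30-80 GB.
        std::printf("%zu\n", as_size);
        return 0;
    }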

0
