How to find out the real maximum size of a vector? max_size()? Apparently not

When I push many elements into a std::vector, an "out of memory" error appears. To avoid it, I check max_size() first, then reserve() or push_back(). If max_size() is greater than the number of elements I reserve, this should be fine, but it is not! So what does the value of max_size() actually mean?

I am compiling the demo below on Windows 7 with Visual Studio 2010; my computer has 4 GB of RAM. When the reserved count is 1/2 of max_size(), the reservation fails:

max_size() = 2^32 / sizeof(CPoint3D) - 1 = 268435455

Reserving 1/4 of max_size() works in the demo; in my real project, only 1/10 works.

What is the real maximum size of a vector, and how can I increase it?
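(The demo code itself did not survive in this copy of the question; the following is a minimal sketch of the pattern described. The CPoint3D layout is an assumption: a 16-byte struct matches the quoted max_size() on a 32-bit build.)

    #include <cstddef>
    #include <iostream>
    #include <new>
    #include <vector>

    // Hypothetical layout: a 16-byte struct gives
    // max_size() = 2^32 / 16 - 1 = 268435455 on a 32-bit build.
    struct CPoint3D { float x, y, z, pad; };

    int main()
    {
        std::vector<CPoint3D> v;
        const std::size_t maxSize = v.max_size();
        std::cout << "max_size() = " << maxSize << '\n';

        try {
            v.reserve(maxSize / 2); // the failing case described above
            std::cout << "reserved " << v.capacity() << " elements\n";
        } catch (const std::bad_alloc&) {
            std::cout << "out of memory at max_size() / 2\n";
        }
        return 0;
    }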


I got a "out of memory" error when I push a lot of elements in std::vector . To avoid the error, I first checked with vector::max_size() and used vector::reserve() to preallocate memory. However, this does not work. In a demo project, an error occurs when I reserve 1/4 of max_size . In a real project, an error occurs when I reserve 1/10. I am running Windows 7 and compiling with Visual Studio 2010. My computer has 4 GB of RAM.

If max_size does not work, how can I find out the maximum number of elements that I can allocate for vector ?

+2
4 answers

max_size() tells you the design limit of the class, but running out of memory can limit the actual size to something much smaller. There is usually no way to find out that lower limit in advance, since it can change from one moment to the next depending on how much memory other programs are using.
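If you need a number at runtime anyway, trial and error is about the only option. The following sketch (an illustration added here, not part of the original answer) binary-searches for the largest count that reserve() will currently accept:

    #include <cstddef>
    #include <new>
    #include <vector>

    // Sketch: find the largest element count that vector::reserve()
    // will accept right now. The result is only a snapshot; it changes
    // as other allocations come and go.
    template <typename T>
    std::size_t largest_reservable()
    {
        std::size_t lo = 0;                           // reserve(0) always works
        std::size_t hi = std::vector<T>().max_size(); // upper bound by design
        while (lo < hi) {
            std::size_t mid = lo + (hi - lo + 1) / 2; // round up so mid > lo
            try {
                std::vector<T> probe;
                probe.reserve(mid); // throws std::bad_alloc on failure
                lo = mid;           // mid fits: search higher
            } catch (const std::bad_alloc&) {
                hi = mid - 1;       // mid does not fit: search lower
            }
        }
        return lo;
    }

Calling this immediately before the real reservation gives a usable estimate, but the value is stale the moment any other code allocates.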

+4

max_size() returns the maximum number of elements the vector could ever hold: the absolute limit once you take into account things like the addressing restrictions of the integral types it uses internally and the address-space limits of the operating system.

This does not mean that you can actually make a vector hold that many elements; it only means that you can never store more. And just because you have 4 GB of RAM does not mean you can create a single contiguous buffer that occupies 4 GB, or anywhere near it. Other factors, such as memory fragmentation, mean that the largest single block you can allocate may be much smaller.

If you truly need that many elements in a container, a contiguous sequence is probably a poor choice. For data sets that large, you want something that can be paged in bits and pieces, such as std::deque (see the sketch below).
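As an illustration (a sketch added here, not part of the original answer; the CPoint3D layout is assumed as above), a deque stores the same elements in many small blocks, so it never asks the OS for one huge contiguous allocation:

    #include <cstddef>
    #include <deque>
    #include <iostream>

    struct CPoint3D { float x, y, z, pad; }; // hypothetical 16-byte layout

    int main()
    {
        std::deque<CPoint3D> points;
        // 10 million elements (~160 MB) arrive in small independent
        // blocks, so address-space fragmentation matters far less.
        for (std::size_t i = 0; i < 10000000; ++i) {
            CPoint3D p = { static_cast<float>(i), 0.0f, 0.0f, 0.0f };
            points.push_back(p);
        }
        std::cout << points.size() << " elements stored\n";
        return 0;
    }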

+4

vector::capacity() gives the maximum number of elements that can be stored in the vector without a reallocation, and a reallocation may fail with std::bad_alloc .

vector::max_size() has a different meaning: it is roughly (INT_MAX / sizeof(element)) .
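A short sketch (illustrative, not from the original answer) of how the three quantities differ:

    #include <iostream>
    #include <vector>

    int main()
    {
        std::vector<int> v;
        v.reserve(100); // room for 100 elements, no reallocation until then
        v.push_back(1);
        std::cout << "size()     = " << v.size()     << '\n'  // 1
                  << "capacity() = " << v.capacity() << '\n'  // >= 100
                  << "max_size() = " << v.max_size() << '\n'; // design limit
        return 0;
    }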

For more information about Windows memory management, see the MSDN article.

+3

The problem is that vector tries to allocate a single contiguous block of memory, which may not be available at that moment, even though the total free memory may be much larger.

I would suggest using std::deque , since it does not require a contiguous block of memory.
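Because std::deque supports push_back, operator[], size() and iterators just like vector, the switch is often a one-line change. A hedged sketch (PointStore and CPoint3D are hypothetical names):

    #include <deque>

    struct CPoint3D { float x, y, z, pad; }; // hypothetical, as above

    // Old: typedef std::vector<CPoint3D> PointStore; // one contiguous block
    typedef std::deque<CPoint3D> PointStore;          // many small blocks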

+2
