How large can the members of a class object be? How do I determine the stack / heap limit?

I have a class that requires a large amount of memory.

class BigClass {
public:
    BigClass() { bf1[96000000-1] = 1; }
    double bf1[96000000];
};

I can only instantiate the class with new, on the heap.

 BigClass *c = new BigClass();
 assert( c->bf1[96000000-1] == 1 );
 delete c;

If I instantiate it without new, I get a segmentation fault at runtime.

 BigClass c; // SIGSEGV! 

How can I determine the memory limit? Or is it better for me to always use new?

+4
6 answers

The stack has a fixed size, which depends on compiler and linker options. See your toolchain's documentation for how to change the stack size of your executable.

In any case, for large objects, prefer new or, better still, smart pointers such as shared_ptr (from Boost, from std::tr1, or from std:: if you have a very recent compiler).
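
For example, a minimal sketch, assuming a compiler that provides std::shared_ptr (C++11, or std::tr1 on older compilers) and the BigClass from the question:

    #include <cassert>
    #include <memory>

    // assumes the BigClass definition from the question is visible
    void useBigClass()
    {
        std::shared_ptr<BigClass> c(new BigClass());   // BigClass lives on the heap
        assert(c->bf1[96000000 - 1] == 1);             // only the small shared_ptr is on the stack
    }   // no explicit delete: the shared_ptr frees the object here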

+2

First of all, since you tagged this C++, not C, why are you using raw arrays? May I suggest std::vector<double> instead or, if contiguous memory is causing problems, std::deque<double>, which relaxes the requirement for contiguous memory without giving up almost-constant-time lookup.

Using vector or deque can also alleviate other segfault issues that could affect your project later, such as overrunning the bounds of your array. If you switch to vector or deque, you can use the .at(x) member function to get and set values in your collection; if you try to access out of bounds, it throws an exception instead of silently corrupting memory.
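
A minimal sketch of what that looks like (illustrative only, not the questioner's exact code):

    #include <iostream>
    #include <stdexcept>
    #include <vector>

    int main()
    {
        std::vector<double> bf1(96000000);   // the buffer itself is heap-allocated
        bf1.at(96000000 - 1) = 1.0;          // in-bounds, bounds-checked access

        try {
            bf1.at(96000000) = 2.0;          // out of bounds: throws instead of crashing
        } catch (const std::out_of_range& e) {
            std::cerr << "out of range: " << e.what() << '\n';
        }
        return 0;
    }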

+3

There is no platform-independent way to determine the memory limit. For "large" amounts of memory, you are much safer allocating on the heap (i.e. using new); you can verify success by comparing the resulting pointer against NULL (with the nothrow form of new) or by catching the std::bad_alloc exception.
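
A rough sketch of both checks, assuming the BigClass from the question is in scope:

    #include <iostream>
    #include <new>

    // assumes the BigClass definition from the question is visible
    int main()
    {
        // nothrow form: failure yields a NULL pointer instead of an exception
        BigClass* a = new (std::nothrow) BigClass();
        if (a == NULL)
            std::cerr << "allocation failed (nothrow)\n";
        delete a;   // deleting a NULL pointer is safe

        // default form: failure throws std::bad_alloc
        try {
            BigClass* b = new BigClass();
            delete b;
        } catch (const std::bad_alloc& e) {
            std::cerr << "allocation failed: " << e.what() << '\n';
        }
        return 0;
    }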

+1

You should not play this game. Your code may be called from another function, or on a thread with a smaller stack size limit, and then your code will break. See this closely related question.

If in doubt, use heap allocation (new), either directly via smart pointers (e.g. auto_ptr) or indirectly via std::vector.

+1

Your class, as designed, is quite fragile, as you have found. Instead of requiring that your objects always be allocated on the heap, have your class allocate the huge block of memory on the heap itself, preferably with std::vector, or perhaps with a shared_ptr if vector does not work for some reason. Then you do not need to worry about how your clients use the object; it is safe to put it on either the stack or the heap.
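
A rough sketch of that redesign, keeping the 96000000-element buffer from the question:

    #include <cassert>
    #include <vector>

    class BigClass {
    public:
        BigClass() : bf1(96000000) { bf1[96000000 - 1] = 1; }
        std::vector<double> bf1;   // the buffer lives on the heap; the object itself stays small
    };

    int main()
    {
        BigClass c;                          // now safe on the stack
        assert(c.bf1[96000000 - 1] == 1);
        return 0;
    }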

+1

On Linux, in the Bash shell, you can check the stack size with ulimit -s. Variables with automatic storage duration will be allocated on the stack. As others have said, there are better ways of approaching this:

  • Use std::vector to store your data inside BigClass.
  • Allocate the memory for bf1 inside the BigClass constructor and free it in the destructor.
  • If you must have a large double[] member, allocate the BigClass instance with new and manage it with a smart pointer; if you do not need shared ownership, something as simple as std::auto_ptr will let you safely construct / destroy your object:

     std::auto_ptr<BigClass> myBigClass(new BigClass);
     myBigClass->bf1; // your array
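
One more option, not mentioned above: on POSIX systems the same limit that ulimit -s reports can also be queried from inside the program with getrlimit. A rough, Linux/Unix-only sketch:

    #include <sys/resource.h>
    #include <iostream>

    int main()
    {
        struct rlimit rl;
        if (getrlimit(RLIMIT_STACK, &rl) == 0) {
            if (rl.rlim_cur == RLIM_INFINITY)
                std::cout << "stack size: unlimited\n";
            else
                std::cout << "stack size: " << rl.rlim_cur << " bytes\n";
        }
        return 0;
    }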
0
