Creating an object in a loop

    std::vector<double> C(4);
    for (int i = 0; i < 1000; ++i)
        for (int j = 0; j < 2000; ++j) {
            C[0] = 1.0; C[1] = 1.0; C[2] = 1.0; C[3] = 1.0;
        }

is much faster than

    for (int i = 0; i < 1000; ++i)
        for (int j = 0; j < 2000; ++j) {
            std::vector<double> C(4);
            C[0] = 1.0; C[1] = 1.0; C[2] = 1.0; C[3] = 1.0;
        }

I understand that this happens because the std::vector is repeatedly created and destroyed inside the loop, but I expected the compiler to optimize that away.

Is it wrong to keep variables local to a loop when possible? I was under the (possibly false) impression that this would give the compiler optimization opportunities.

Or maybe this only applies to POD types, not to std::vector.

EDIT: I used VC++ 2005 (release mode) with full optimization (/Ox) on Windows XP.

+4
c++ optimization
7 answers

Is it wrong to keep variables local to a loop when possible? I was under the (possibly false) impression that this would give the compiler optimization opportunities.

No, it's a good rule of thumb. But it is only a rule of thumb. Minimizing a variable's scope gives the compiler more freedom for register allocation and other optimizations, and, at least as importantly, it usually gives more readable code. But it also depends on whether the repeated construction/destruction is cheap or can be optimized away entirely. Often it can be... but not always.

So, as you have discovered, sometimes this is a bad idea.

+3

The problem is the heap allocation. Replace std::vector<double> C(4); with std::array<double, 4> C; and it should no longer make any difference where you declare the variable.

+2

I was under the (possibly false) impression that this would provide optimization opportunities for the compiler.

This is probably true for built-in types such as int or double.

The problem is that you are using a vector, whose constructor has to run when you enter the loop body and whose destructor has to run when you exit it. Since both of these are nontrivial, the compiler cannot optimize them away: if it did, your program would no longer behave correctly.

As a comparison, imagine what such an optimization would do if you had used a file object instead of a vector.

+2

The second version allocates new memory every time (in your case 1000 * 2000 times). Each allocation is a fresh block on the heap (though not necessarily at a new address; it may land in the same place repeatedly). Allocating memory takes longer than simply changing the values in memory that is already allocated.

The first version allocates one array and simply changes the values in it. Even if some compilers can optimize the second version into the first (which is not always the case), it is better not to leave this to the compiler when you, as the programmer, can choose to allocate once yourself.

+1

Creating the vector is expensive in this case, since it allocates an array of size 4 on the heap.

If you know the size of the "local" vector in advance, you can use an automatic array instead:

    for (int i = 0; i != 2000; ++i) {
        int C[4]; // no initialization
        C[0] = 1;
        // ...
    }

This way you avoid the cost of the free-store allocation.

+1

The object's scope will be limited to the loop, so you cannot use it once the loop has finished. On top of that, the object gets constructed and then destroyed as many times as the loop iterates. In the end nothing is gained; time is simply wasted constructing and destroying the object.

0

The first thing you need to do is make sure the design is in order. This means:

  • Is the code easy to understand?
  • Does the code protect against errors?
  • Is the code easily extensible?

I think that in this case this really means it is better to define the variable inside the loop.

Only if you have real performance problems should you optimize your code (if the compiler hasn't already done it for you), e.g. by moving the variable declaration outside the loop.

0
