Logical operators and branch misprediction

Consider the following loops:

while ((expressionA) & (expressionB))  { /* do something */ }
while ((expressionA) && (expressionB)) { /* do something */ }

where expressionA and expressionB are expressions of type bool, and expressionB has no side effects. Under these conditions, the two forms are equivalent (right?).
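For concreteness, here is a minimal self-contained sketch of that situation; the predicates below_limit and passes_check are made up for illustration. Both loops compute the same result because both operands are of type bool and neither has side effects:

#include <cstdio>

// Hypothetical side-effect-free predicates of type bool.
bool below_limit(int x)  { return x < 1000; }
bool passes_check(int x) { return (x % 7) != 0; }

int main() {
    int a = 1;
    // Short-circuit form: passes_check(a) runs only when below_limit(a) is true.
    while (below_limit(a) && passes_check(a)) { ++a; }

    int b = 1;
    // Bitwise form: both predicates are always evaluated; the result is the same
    // because both operands are bool and neither has side effects.
    while (below_limit(b) & passes_check(b)) { ++b; }

    std::printf("%d %d\n", a, b);   // prints "7 7"
    return 0;
}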

A (hypothetical) compiler that naively followed the source code would emit a branch for the && version, and we would end up paying for branch mispredictions.

With a modern compiler (for example, current GCC), are there any conditions under which the & version gives a significant performance improvement over the && version?


I think not, because:

  • If expressionB is cheap enough, the compiler will recognize this and will not emit a short-circuit branch.
  • If expressionB is expensive enough, the compiler will emit a short-circuit branch, because:
    • if the probability of expressionA being true is not close to 1.0, short-circuiting gives a significant average performance gain;
    • if the probability of expressionA being true is close to 1.0, we will not pay much, because branch prediction will usually succeed. (A crude way to measure this is sketched below.)
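One way to probe this empirically is a microbenchmark along the following lines. This is only a sketch under stated assumptions: cheapA and costlyB are made-up, side-effect-free predicates (cheapA is deliberately unpredictable, roughly 50% true), the loop body is a simple counter rather than the original while loop, and any difference you observe depends on the compiler, optimization level, and CPU. Comparing the generated assembly of the two variants (e.g. with g++ -O2 -S) is usually more telling than the raw timings.

#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

// Made-up predicates: cheapA is cheap but unpredictable (about 50% true),
// costlyB is somewhat more expensive; both return bool and have no side effects.
static bool cheapA(unsigned x)  { return (x & 1u) != 0; }
static bool costlyB(unsigned x) {
    unsigned h = x * 2654435761u;
    h ^= h >> 13;
    h *= 2246822519u;
    return (h & 3u) != 0;
}

template <typename Loop>
static double time_ms(Loop loop) {
    auto t0 = std::chrono::steady_clock::now();
    loop();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    std::vector<unsigned> data(1 << 22);
    std::mt19937 rng(42);
    for (auto& v : data) v = rng();

    volatile unsigned long long sink = 0;  // keeps the compiler from removing the loops

    double t_logical = time_ms([&] {
        unsigned long long n = 0;
        for (unsigned x : data)
            if (cheapA(x) && costlyB(x)) ++n;   // short-circuit form
        sink = sink + n;
    });

    double t_bitwise = time_ms([&] {
        unsigned long long n = 0;
        for (unsigned x : data)
            if (cheapA(x) & costlyB(x)) ++n;    // non-short-circuit form
        sink = sink + n;
    });

    std::printf("&&: %.2f ms\n& : %.2f ms\n", t_logical, t_bitwise);
    return 0;
}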