Do “if” expressions affect time complexity analysis?

From my analysis, the running time of this algorithm should be N², because each of the loops iterates once through all the elements. What I'm not sure about is whether the presence of the if changes the time complexity.

 for (int i = 0; i < N; i++) {
     for (int j = 1; j < N; j++) {
         System.out.println("Yayyyy");
         if (i <= j) {
             System.out.println("Yayyy not");
         }
     }
 }
+7
6 answers
  • Tp: the time it takes to print constant text to standard output.
  • Ti: the time required for all other operations inside the inner loop (evaluating the predicate, etc.).
  • To: the time required for all operations inside the outer loop except the inner loop (initializing counters, etc.).
  • Tc: the time required to set up the process and any other bookkeeping.

The total runtime will be Tc + N × (To + N × Ti + (N/2) × Tp).

This is equal to Tc + N × To + (N²/2) × (2Ti + Tp), which is bounded by K × N² for any K > Ti + Tp/2 as N goes to infinity. That bound means the time complexity is still O(N²).

No, the if statement does not change the time complexity in this example.
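
To make this concrete, here is a minimal sketch (hypothetical instrumentation, not from the original post) that replaces the prints with counters and verifies the counts the formula above assumes:

    // Sketch: count how often each statement in the question's loops runs.
    public class LoopCounts {
        public static void main(String[] args) {
            int N = 1000;
            long firstPrint = 0;   // stands in for the unconditional println
            long secondPrint = 0;  // stands in for the println guarded by the if

            for (int i = 0; i < N; i++) {
                for (int j = 1; j < N; j++) {
                    firstPrint++;        // runs on every inner iteration
                    if (i <= j) {
                        secondPrint++;   // runs on about half of them
                    }
                }
            }

            System.out.println(firstPrint);        // 999000 = N * (N - 1)
            System.out.println(secondPrint);       // 500499, roughly N^2 / 2
            System.out.println((long) N * N / 2);  // 500000, for comparison
        }
    }

For N = 1000 the guarded statement executes 500,499 times, matching the (N/2) × Tp term above: the if halves a constant factor, not the order of growth.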

+6

No. Bear in mind that time complexity describes time asymptotically: lower-order terms are absorbed into higher-order ones.

O(n²) really means k × n² + c, where it is assumed that c is so small that we don't care about it.

That is, there is a constant effect: some overhead for the whole thing (c) and some cost per unit of our complexity measure (k). One algorithm will beat another if it has a lower k, or, if those are equal, a lower c. Also, an O(n²) algorithm can beat an O(1) algorithm at sufficiently low values of n (we usually don't care, but either k can be massive, and if we execute each m times, then comparing O(n² × m) with O(m) at small n is not really the comparison it appears to be).

In any case, this is a deliberate over-simplification, because k and c may not be true constants, only near enough to constants. So if something was really O(n² + log(n)), we would call it O(n²), because who cares about that little log(n) when we're worrying about the n²?

So. Looking at your case. We run the outer loop n times. For each of those, we run the inner loop n - 1 times. For each of those inner iterations we do the first print (any variation in its cost is unrelated to n, so it is essentially constant) and a test. The test passes in about half the cases, and when it does, it incurs the cost of a second print.

Thus, the total cost:

 cost of setting everything up + n × cost of the outer loop + (n - 1) × cost of the inner loop + (n × (n - 1)) × cost of the first print + (n × (n - 1)) × cost of the test + ((n × (n - 1)) / 2) × cost of the second print 

Assigning constants to the above costs, we get:

 k + n × a + (n - 1) × b + (n × (n - 1)) × c + (n × (n - 1)) × d + ((n × (n - 1)) / 2) × e
 = k + n × a + (n - 1) × b + (n × (n - 1)) × (c + d + (e / 2)) 

Now, since c + d + e/2 is itself a constant, we can relabel it c, and this becomes:

 n × a + (n - 1) × b + (n × (n - 1)) × c + k 

Or, reordered with the highest-order term first:

 (n × (n - 1)) × c + (n - 1) × b + n × a + k 

If n is big enough for us to give a damn, then n is proportionally so close to n - 1 that we may as well treat them as the same (another aspect of time complexity describing things asymptotically: as n approaches ∞, the relative difference between n² and n × (n - 1) approaches 0). Consequently:

 n² × c + n × b + n × a + k = n² × c + n × (b + a) + k 

Again, b + a is just a constant, so after relabeling the constants this is equivalent to:

 n² × k + n × c + a 

And now we do what was mentioned earlier about absorbing lower orders: who cares about n × c, never mind a? If n is high enough for us to care at all, it's the n² term that matters. By comparison, we may as well treat the lower-order terms, together with the overall overhead, as noise and regard the whole thing as:

 n² × k + c 

Or, in other words, as:

 O(n²) 

So yes, you had it right, and the if statement does not affect the complexity.

That said, it's worth noting that time complexity can hide what we actually care about. If, for example, we had an O(n²) algorithm for which an analysis like the one above gave a running time of n² × k + n × c, and k was 200 μs while c was 15 s, then until n exceeds 75,000 it is really the per-n part that costs, not the per-n² part. At lower n, the behavior is closer to what we would expect of O(n) than of O(n²).
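
As a quick arithmetic check on those numbers (a sketch assuming the same hypothetical constants, k = 200 μs and c = 15 s):

    // Sketch: where n^2 * k overtakes n * c for the constants quoted above.
    public class CrossoverDemo {
        public static void main(String[] args) {
            double k = 200e-6; // 200 microseconds per n^2 unit (hypothetical)
            double c = 15.0;   // 15 seconds per n unit (hypothetical)

            // n^2 * k exceeds n * c exactly when n > c / k.
            System.out.println("break-even n = " + (c / k)); // 75000.0

            for (long n : new long[] {1_000, 75_000, 1_000_000}) {
                double quadraticPart = (double) n * n * k;
                double linearPart = n * c;
                System.out.printf("n=%9d  n^2*k=%12.1f s  n*c=%12.1f s%n",
                        n, quadraticPart, linearPart);
            }
        }
    }

The two terms cross exactly at n = c / k = 75,000; below that, the "lower-order" linear term is the one paying the bills.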

The reason time complexity is still useful is that such a large mismatch in constants is rare, and that we care about time, and hence about time complexity, more as n gets higher (you could hide some terrible O(n!) monster in your code that you call once in a blue moon, with three elements, and simply not care). So to get real improvements we ideally want to reduce the time complexity, or failing that, to reduce the size of the constant k on the highest-order term (in other words, if you can do something n times instead of n² times, do that; failing that, reduce the cost of the thing you do n² times rather than the thing you do only n times).

In other words, it helps us focus on what usually matters.

+3

Of course it can matter: when you are working with a comparison-based algorithm, the comparison is the operation you count.

So in your case you are looking at O(n²), because your if statement is executed almost n² times.

For non-comparison algorithms, you count whatever your basic operation is instead.

0

No, the if does not affect the complexity.

0

The time complexity is the same: you will still print "Yayyyy" roughly N² times.

0

Short answer: the runtime is still O(N²).

Long answer: the cost of the conditional check can be "added" to the cost of the println operation, just as if you had two println() calls instead of one. (For perspective, keep in mind that the cost of I/O such as println far exceeds that of a simple integer comparison.)
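
To get a feel for that gap, here is a crude sketch (rough timing only; JIT warm-up and output buffering make this an illustration, not a benchmark):

    import java.io.OutputStream;
    import java.io.PrintStream;

    // Sketch: crude comparison of integer-comparison cost vs println cost.
    public class CostGap {
        public static void main(String[] args) {
            int N = 100_000;

            long t0 = System.nanoTime();
            long hits = 0;
            for (int i = 0; i < N; i++) {
                if (i <= N / 2) hits++;   // N integer comparisons
            }
            long cmpNanos = System.nanoTime() - t0;

            // Send output to a sink so the console itself isn't measured.
            PrintStream sink = new PrintStream(OutputStream.nullOutputStream());
            long t1 = System.nanoTime();
            for (int i = 0; i < N; i++) {
                sink.println("Yayyyy");   // N println calls
            }
            long ioNanos = System.nanoTime() - t1;

            System.out.println("comparisons: " + cmpNanos + " ns (hits=" + hits + ")");
            System.out.println("println:     " + ioNanos + " ns");
        }
    }

Even with the console taken out of the picture, the println loop is typically far slower, which is the point of the parenthetical above.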

Perhaps you could say that calling println() costs "1 operation" and the comparison costs "0.0001 operations", so the total cost would be (1.0001 × N)² operations, not just N². You have also cut the number of println() calls in half, so we could say you are at (1.0001 × N)² / 2 operations. It is still O(N²), although for a given value of N you did roughly halve the execution time by printing only half the records.

In general, the cost of a comparison should be added to the cost of the operations in the branch (or branches) executed as a result of that comparison. When both an if () {} and an else {} are present, the runtime is harder to estimate. The tactic is either to estimate the running time as if the more expensive branch were taken every time, or, if the cost of a single branch is not easy to pin down, as if both branches were executed on every iteration of the loop. As long as those operations are O(1), the order of the runtime remains O(N²), since you are only scaling by a constant.
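
For example (a hypothetical if/else, not taken from the question), the bound can be read straight off the more expensive branch:

    // Sketch: bounding a loop that branches, by charging the dearer branch
    // on every iteration.
    public class BranchBound {
        public static void main(String[] args) {
            int N = 2_000;
            long acc = 0;

            for (int i = 0; i < N; i++) {
                for (int j = 0; j < N; j++) {
                    if ((i + j) % 2 == 0) {
                        acc += i;             // cheap branch: one addition
                    } else {
                        acc += (long) i * j;  // dearer branch: multiply + add
                    }
                }
            }

            // Upper bound: pretend the multiply-and-add branch runs all N^2
            // times. Both branches are O(1), so the nested loops stay O(N^2);
            // the if/else only moves the constant factor.
            System.out.println(acc);
        }
    }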

0
