How can an algorithm have two worst-case complexities?

Steven Skiena's The Algorithm Design Manual poses this question in chapter 1:

Let P be a problem. The worst-case time complexity of P is O(n^2). The worst-case time complexity of P is also Ω(n log n). Let A be an algorithm that solves P. Which subset of the following statements is consistent with this information about the complexity of P?

  • A has worst-case time complexity O(n^2).
  • A has worst-case time complexity O(n^(3/2)).
  • A has worst-case time complexity O(n).
  • A has worst-case time complexity Θ(n^2).
  • A has worst-case time complexity Θ(n^3).

How can an algorithm have two worst-case time complexities? Is the author trying to say that for some values of n (say, 300), the upper bound of an algorithm that solves P is O(n^2), while for other values of n (say, 3000), the worst case of the same algorithm is Ω(n log n)?

1 answer

The answer to your specific question

is the author trying to say that for some values of n (say, 300), the upper bound of the algorithm that solves P is O(n^2), while for other values of n (say, 3000), the worst case of the same algorithm is Ω(n log n)?

No. That is not how complexity functions work. :) We are not talking about different complexity classes for different values of n. Complexity describes the algorithm as a whole, not the algorithm at particular input sizes. An algorithm has a single time-complexity function T(n), which counts how many steps the computation takes for an input of size n.
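As a hypothetical illustration (not from the book), a simple double loop has one step-count function T(n) that is the same formula at every input size:

```python
def count_pairs(n):
    """Count the iterations of a double loop over n items."""
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1  # one "step" per inner iteration
    return steps

# T(n) = n^2 for every n: a single function of n,
# not different complexity classes for different sizes.
print(count_pairs(10))  # 100
```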

In this task, you are given two pieces of information:

  • The worst-case complexity is O(n^2)

  • The worst-case complexity is Ω(n log n)

All this means is that we can choose constants c1, c2, N1, and N2 such that for our algorithm's function T(n) we have

  • T(n) ≤ c1 * n^2 for all n ≥ N1

  • T(n) ≥ c2 * n log n for all n ≥ N2

In other words, our T(n) is "asymptotically bounded below" by some constant times n log n and "asymptotically bounded above" by some constant times n^2. It can be any function "between" an n log n style and an n^2 style. It can even be n log n itself (since that is bounded above by n^2), or it can be n^2 (since that is bounded below by n log n). It can also be something in between, such as n (log n)(log log n).
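To make those definitions concrete, here is a sketch (my own illustration, not from the book) that numerically checks both bounds for a hypothetical step count T(n) = n^(3/2), using sample constants c1 = 1, c2 = 0.5, and N1 = N2 = 2:

```python
import math

def T(n):
    # hypothetical step count that sits "between" n log n and n^2
    return n ** 1.5

# T(n) <= c1 * n^2  and  T(n) >= c2 * n log n  for all n >= 2
for n in range(2, 10_000):
    assert T(n) <= 1.0 * n * n                  # upper bound, c1 = 1
    assert T(n) >= 0.5 * n * math.log2(n)       # lower bound, c2 = 0.5
print("both bounds hold on the sampled range")
```

The constants are not unique; any c1 ≥ 1 works for the upper bound, and any c2 small enough (here 0.5) works for the lower bound, which is exactly what the O and Ω definitions require.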

It is not that the algorithm has "multiple worst-case complexities" in the sense of behaving differently. What you are seeing are an upper bound and a lower bound, and those can of course differ.

Now it’s possible that you have some kind of "weird" function:

 import math

 def p(n):
     if n % 2 == 0:
         print('*' * int(n * math.log2(n)))  # even n: about n log n stars
     else:
         print('*' * (n * n))                # odd n: n^2 stars

This crazy algorithm has the bounds stated in the exercise from Skiena's book, and it has no Θ complexity. That may be what you were thinking of, but note that a complexity function does not need to be this strange for the upper and lower bounds to differ. The thing to remember is that upper and lower bounds are not tight unless explicitly stated.
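A step-count version of such a parity-dependent algorithm (my own sketch; the name `weird_steps` is made up) shows why no single Θ(f(n)) can sandwich it:

```python
import math

def weird_steps(n):
    # even n: about n log n steps; odd n: n^2 steps
    if n % 2 == 0:
        return int(n * math.log2(n))
    return n * n

# At odd sizes the count tracks n^2, while at the neighboring even
# sizes it tracks n log n, so the ratio between neighbors grows
# without bound -- no single Theta class fits both subsequences.
print(weird_steps(1000), weird_steps(1001))
```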
