The answer to your specific question:

"Is the author trying to say that for some value of n (say, 300), the upper bound of the algorithm written to solve P is of order O(n^2), while for another value of n (say, 3000), the worst case of the same algorithm is Ω(n log n)?"
No. :) This is not how complexity functions work. We are not talking about different complexity classes for different values of n. Complexity refers to the algorithm as a whole, not to the algorithm at particular input sizes. The algorithm has a single time-complexity function T(n), which counts how many steps the computation takes on an input of size n.
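To make "a single function T(n)" concrete, here is a small sketch (the function name `count_steps` and the choice of selection sort are my own illustration, not from the book): counting comparisons in selection sort gives exactly T(n) = n(n-1)/2 steps, regardless of which particular input of size n is given.

```python
def count_steps(xs):
    """Selection sort that counts comparisons; T(n) = n(n-1)/2 exactly."""
    xs = list(xs)
    steps = 0
    n = len(xs)
    for i in range(n):
        m = i
        for j in range(i + 1, n):
            steps += 1  # one comparison = one "step"
            if xs[j] < xs[m]:
                m = j
        xs[i], xs[m] = xs[m], xs[i]
    return steps

# The step count depends only on the size n, not on the particular input:
# count_steps(range(10)) == count_steps(reversed(range(10))) == 45
```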
In this problem you are given two pieces of information:

1. T(n) is O(n^2) (an upper bound), and
2. T(n) is Ω(n log n) (a lower bound).
All this means is that we can choose constants c1, c2, N1 and N2 such that, for our algorithm's function T(n), we have

    c1 * n log n <= T(n)  for all n > N1, and
    T(n) <= c2 * n^2      for all n > N2.
In other words, our T(n) is "asymptotically bounded below" by some constant times n log n and "asymptotically bounded above" by some constant times n^2. It can be any function "between" an n log n style function and an n^2 style function. It can even be n log n itself (since that is bounded above by n^2), or it can be n^2 (since that is bounded below by n log n). It can also be something in between, for example n (log n)(log log n).
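A quick numeric sanity check of these definitions (a sketch; the sample function `T` and the witness constants `c1`, `c2`, `N` are assumptions chosen for illustration, and are not unique):

```python
import math

def T(n):
    # a hypothetical running time strictly between n log n and n^2
    return n * math.log(n) * math.log(math.log(n))

c1, c2, N = 1.0, 1.0, 16  # witness constants; any larger N also works

# Verify c1 * n log n <= T(n) <= c2 * n^2 for all n in a sample range past N.
for n in range(N, 10_000):
    assert c1 * n * math.log(n) <= T(n) <= c2 * n * n
```

This does not prove the bounds, of course; it only illustrates what "we can choose constants" means operationally.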
It is not that the algorithm has "multiple worst-case complexities" in the sense of having different behaviors. What you are seeing is an upper bound and a lower bound, and those can, of course, be different.
Now, it is possible that you have some kind of "weird" function:

```python
import math

def p(n):
    if n % 2 == 0:
        print("*" * round(n * math.log(n)))  # ~ n log n steps
    else:
        print("*" * (n * n))                 # ~ n^2 steps
```
This crazy algorithm has the bounds stated in the exercise from Skiena's book, and it does not have a Θ complexity. It may be what you were thinking of, but note that the complexity function does not need to be this strange for us to say only that the upper and lower bounds are different. Remember that upper and lower bounds are not tight unless explicitly stated.
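To see why no Θ bound exists, we can count the steps of that parity-dependent function directly (a sketch; the helper name `steps_of_p` is my own): the cost oscillates between roughly n log n on even inputs and n^2 on odd inputs, so T(n) is Ω(n log n) and O(n^2), but no single Θ class fits.

```python
import math

def steps_of_p(n):
    """Step count of the 'weird' function p(n): number of stars printed."""
    if n % 2 == 0:
        return round(n * math.log(n))  # even n: ~ n log n steps
    return n * n                       # odd n:  ~ n^2 steps

# Along even n the cost grows like n log n; along odd n like n^2.
# Since the two subsequences grow at different rates, T(n) is not Θ of anything.
```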