In response to this question, a debate began in the comments about the complexity of QuickSort. What I remember from my university days is that QuickSort is O(n^2) in the worst case, O(n log(n)) in the average case, and O(n log(n)) (but with a tighter bound) in the best case.
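To make the three cases concrete, here is a minimal sketch of my own (not from the original debate): a naive first-element-pivot QuickSort that counts pivot comparisons, run on a shuffled input versus an already-sorted one. The function name `quicksort` and the counting mechanism are just illustrative choices.

```python
import random
import sys

sys.setrecursionlimit(10000)  # the sorted-input run recurses n levels deep

def quicksort(a, count):
    """Naive QuickSort, first element as pivot.
    count[0] accumulates the (conceptual) one-per-element pivot comparisons."""
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    count[0] += len(rest)                        # one pivot comparison per element
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, count) + [pivot] + quicksort(right, count)

n = 300
c_avg, c_worst = [0], [0]
quicksort(random.sample(range(n), n), c_avg)     # typical (random) input
quicksort(list(range(n)), c_worst)               # sorted input: the degenerate case
print(c_avg[0], c_worst[0])                      # roughly 2n ln n  vs.  n(n-1)/2
```

On a random permutation the count comes out near 2n ln n, while the sorted input drives it toward n(n-1)/2, which is where the O(n^2) worst case comes from.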
What I need is a correct mathematical explanation of the meaning of average complexity, so that I can clearly explain it to those who believe big-O notation can only be used for the worst case.
As far as I remember, to determine average complexity you consider the complexity of the algorithm over all possible inputs, and count how many cases are degenerate and how many are normal. If the number of degenerate cases divided by n tends to 0 as n gets large, then you can speak of the average complexity of the overall function as being that of the normal cases. Something like the sketch below is what I have in mind.
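Stated slightly more formally, what I remember resembles the standard expected-cost definition (a sketch in my own notation; the symbols I_n and T, and the uniform-distribution assumption, are mine):

```latex
% Average cost over the set I_n of inputs of size n, each input
% assumed equally likely:
T_{\mathrm{avg}}(n) \;=\; \frac{1}{|I_n|} \sum_{x \in I_n} T(x)

% For QuickSort over the n! permutations of n distinct keys, the
% expected number of comparisons is the classical result
C(n) \;=\; 2(n+1)H_n - 4n \;\approx\; 2n\ln n \;=\; \Theta(n \log n)
```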
Is this definition correct, or is the definition of average complexity something different? And if it is correct, can someone state it more rigorously than I have?
algorithm complexity-theory big-o
kriss Oct 11 '10 at 10:30