I found many implementations of the quicksort algorithm, but in the end I decided to stick with this:
public static void quickSort(int[] array, int start, int end) {
    if (end <= start) {
        return; // a sub-array of size 0 or 1 is already sorted
    }
    int pivot = array[start];
    int i = start + 1;
    // Move every element smaller than the pivot to the left side
    for (int j = start + 1; j <= end; j++) {
        if (pivot > array[j]) {
            int temp = array[j];
            array[j] = array[i];
            array[i] = temp;
            i++;
        }
    }
    // Put the pivot into its final position
    array[start] = array[i - 1];
    array[i - 1] = pivot;
    quickSort(array, start, i - 2);
    quickSort(array, i, end);
}
There are a few things that I'm confused about.
Why do some people suggest choosing the first element as the pivot, others the middle element, and others the last element? Wouldn't the choice make a difference?
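As far as I can tell, any pivot-selection strategy can be expressed in terms of the first-element version by swapping the chosen pivot to the front before partitioning. This is just my own sketch of that idea (the method names are mine, not from any library), using the same partition scheme as the code above:

```java
public class PivotChoice {
    // Partition around array[start] (same scheme as the quickSort above),
    // returning the pivot's final index.
    static int partitionFirst(int[] array, int start, int end) {
        int pivot = array[start];
        int i = start + 1;
        for (int j = start + 1; j <= end; j++) {
            if (pivot > array[j]) {
                int temp = array[j];
                array[j] = array[i];
                array[i] = temp;
                i++;
            }
        }
        array[start] = array[i - 1];
        array[i - 1] = pivot;
        return i - 1;
    }

    // "Middle pivot" strategy: swap the middle element to the front,
    // then reuse exactly the same partition routine.
    static int partitionMiddle(int[] array, int start, int end) {
        int mid = start + (end - start) / 2;
        int temp = array[start];
        array[start] = array[mid];
        array[mid] = temp;
        return partitionFirst(array, start, end);
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4, 5, 6};
        // First-element pivot on sorted input: pivot 1 ends at index 0,
        // leaving an empty left side and a size-5 right side (unbalanced).
        System.out.println(partitionFirst(a.clone(), 0, a.length - 1)); // prints 0
        // Middle-element pivot: pivot 3 ends near the middle (balanced).
        System.out.println(partitionMiddle(a.clone(), 0, a.length - 1)); // prints 2
    }
}
```

So the strategies differ only in which element they pick, but on particular inputs (like a sorted array) that choice changes how balanced the two recursive calls are.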
Let's say I'm trying to show why quicksort has a worst-case running time of O(n^2) when the array is already sorted.
I have the following array:
{1, 2, 3, 4, 5, 6}.
If I select the first element as my pivot, won't it just compare the pivot with all the other elements, swap it with itself, and take O(n)? Then it continues with these two recursive calls, which I thought contribute O(log n) levels:
quickSort(array, start, i-2);
quickSort(array, i, end);
So in the end, even on an already-sorted list of integers, wouldn't it still be O(n log n)?
If I instead picked the last element as my pivot, would the behavior be completely different? It would swap 6 and 1, and therefore perform entirely different operations than when the pivot is the first element of the array.
I just don't understand why the worst case is O(n^2).
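To check my O(n log n) intuition, I tried counting comparisons directly with an instrumented copy of the sort (the counter variable is my own addition). On an already-sorted array with a first-element pivot, each call recurses on a sub-array only one element smaller, so the count comes out to n(n-1)/2 rather than roughly n log n:

```java
public class CountCompares {
    static long comparisons = 0;

    static void quickSort(int[] array, int start, int end) {
        if (end <= start) return;
        int pivot = array[start];
        int i = start + 1;
        for (int j = start + 1; j <= end; j++) {
            comparisons++; // one comparison per loop iteration
            if (pivot > array[j]) {
                int temp = array[j];
                array[j] = array[i];
                array[i] = temp;
                i++;
            }
        }
        array[start] = array[i - 1];
        array[i - 1] = pivot;
        quickSort(array, start, i - 2);
        quickSort(array, i, end);
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000}) {
            int[] sorted = new int[n];
            for (int k = 0; k < n; k++) sorted[k] = k; // already sorted input
            comparisons = 0;
            quickSort(sorted, 0, n - 1);
            // Prints n(n-1)/2: 45, 4950, 499500 — quadratic growth,
            // because every partition leaves one side empty.
            System.out.println(n + ": " + comparisons);
        }
    }
}
```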
Any help would be greatly appreciated!
Nicky