Sorting Flashcards
What are Quicksort's best, worst and average runtimes? Is it stable and in-place?
Best: n log n
Worst: n²
Average: n log n
Stable: No (stable variants exist, but they give up the in-place property)
In-place: Yes
note:
- stable means the relative order of two equal keys is never changed, i.e. if 5D and 5H appear in this order in the input array, the output array must keep 5D before 5H (not 5H, 5D)
- in-place means sorting within the input array itself, using only constant (or logarithmic) extra memory rather than allocating a second array of the same size
If the list is implemented as an array, partitioning can be done in-place and with at most n comparisons
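The in-place array partition can be sketched as follows (a Lomuto-style partition; the helper name and pivot choice are illustrative):

```python
def partition(a, lo, hi):
    """Partition a[lo..hi] in place around the pivot a[hi].

    Uses at most n - 1 comparisons for a range of n elements
    and returns the pivot's final index.
    """
    pivot = a[hi]
    i = lo  # boundary: a[lo..i-1] holds elements < pivot
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]  # place the pivot between the two groups
    return i

nums = [7, 2, 9, 4, 5]
p = partition(nums, 0, len(nums) - 1)
# everything left of index p is < nums[p]; everything right is >= nums[p]
```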
What are Merge Sort's best, worst and average runtimes? Is it stable and in-place?
Best, worst and avg are all
n log n
Stable: Yes, provided the merge takes from the left half on ties (typical implementations do)
In-place: No
In any case, the merge step uses Θ(n) comparisons
What are Heap Sort's best, worst and average runtimes? Is it stable and in-place?
Best, worst and avg are all
n log n
Stable: No
In-place: Yes
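A minimal in-place heapsort sketch (names are illustrative) showing why it needs no extra array: the max-heap and the sorted suffix share the same storage.

```python
def heapsort(a):
    """Sort list a in place: build a max-heap, then repeatedly
    swap the max to the end of the shrinking heap region."""
    def sift_down(start, end):
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            # pick the larger child
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    n = len(a)
    for start in range(n // 2 - 1, -1, -1):  # heapify, bottom-up
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]  # move current max into place
        sift_down(0, end - 1)

data = [5, 1, 4, 2, 3]
heapsort(data)
```

The long-range swaps during extraction are also why heapsort is not stable.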
What are Bubble Sort's best, worst and average runtimes? Is it stable and in-place?
Best: n
Worst: n²
Avg: n²
Stable: Yes
In-place: Yes
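A bubble sort sketch (the pass counter is illustrative) showing where the O(n) best case comes from: an early exit after one swap-free pass.

```python
def bubble_sort(a):
    """Stable, in-place bubble sort; returns the number of passes made.
    The early exit gives the O(n) best case on already-sorted input."""
    n = len(a)
    passes = 0
    for i in range(n - 1):
        passes += 1
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:   # strict > leaves equal keys in order: stable
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:           # no swaps: already sorted, stop early
            break
    return passes

already_sorted = [1, 2, 3, 4, 5]
passes = bubble_sort(already_sorted)  # a single O(n) pass suffices
```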
What are Insertion Sort's best, worst and average runtimes? Is it stable and in-place?
Best: n
Worst: n²
Avg: n²
Stable: Yes
In-place: Yes
- It works well on small lists, or lists that are nearly sorted
- the number of inversions is n(n − 1)/2 in the worst case (input in reverse sorted order)
- 0 in the best case (input already sorted)
- about n(n − 1)/4 in the average case
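The inversion counts above can be checked directly: each element shift in insertion sort removes exactly one inversion (a sketch; the shift counter is illustrative).

```python
def insertion_sort(a):
    """Stable, in-place insertion sort; returns the number of element
    shifts, which equals the number of inversions in the input."""
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # strict > preserves stability
            a[j + 1] = a[j]           # shift one element right
            j -= 1
            shifts += 1
        a[j + 1] = key
    return shifts

n = 5
assert insertion_sort(list(range(n))) == 0                        # best case
assert insertion_sort(list(range(n, 0, -1))) == n * (n - 1) // 2  # worst case
```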
What are Selection Sort's best, worst and average runtimes? Is it stable and in-place?
Best, worst and avg are all
n²
Stable: No
In-place: Yes
In practice, it is not as fast as insertion sort
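A sketch of why selection sort is not stable: the long-range swap can jump one equal key over another. The (key, suit) tuples reuse the 5D/5H example from the stability note above.

```python
def selection_sort(a):
    """In-place selection sort on (key, tag) pairs: repeatedly swap
    the minimum of the unsorted suffix into position i."""
    n = len(a)
    for i in range(n - 1):
        m = min(range(i, n), key=lambda k: a[k][0])  # index of smallest key
        a[i], a[m] = a[m], a[i]  # long-range swap can reorder equal keys
    return a

# (5, 'D') precedes (5, 'H') in the input, but the first swap
# jumps (5, 'D') past (5, 'H'): the output is not stable.
cards = [(5, 'D'), (5, 'H'), (2, 'C')]
selection_sort(cards)
```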
Shell Sort
Almost nothing is known about the average-case running time of Shellsort; an O(n^(7/6)) average case has been conjectured for some gap sequences.
Values of h chosen from all relevant numbers of the form 2^i·3^j have been proved to give O(n(log n)²) running time
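A shellsort sketch using the 2^i·3^j (Pratt) gap sequence mentioned above; the gap-generation helper is illustrative.

```python
def pratt_gaps(n):
    """All numbers of the form 2^i * 3^j below n, largest first."""
    gaps = []
    p2 = 1
    while p2 < n:
        p3 = p2
        while p3 < n:
            gaps.append(p3)
            p3 *= 3
        p2 *= 2
    return sorted(gaps, reverse=True)

def shellsort(a):
    """Gapped insertion sort; Pratt gaps give O(n (log n)^2) worst case."""
    for gap in pratt_gaps(len(a)):
        for i in range(gap, len(a)):
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
    return a

data = [9, 1, 8, 2, 7, 3]
shellsort(data)
```

The final gap is always 1, so the last pass is a plain insertion sort on an almost-sorted list.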
Mergesort recurrence
C(n) = 2C(n/2) + n
= 2(2C(n/4) + n/2) + n
= 4C(n/4) + 2n = . . .
= 2^k C(1) + kn (where n = 2^k, so k = lg n)
= n lg n (taking C(1) = 0)
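The closed form can be checked numerically for powers of two, assuming the base case C(1) = 0:

```python
def C(n):
    """Mergesort comparison-count recurrence C(n) = 2C(n/2) + n, C(1) = 0."""
    if n == 1:
        return 0
    return 2 * C(n // 2) + n

# C(n) = n lg n exactly when n = 2^k
for k in range(1, 11):
    n = 2 ** k
    assert C(n) == n * k  # k = lg n
```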
Quicksort recurrence
Θ(n log n)
if the coefficient is 1/n instead of 2/n, the recurrence solves to Θ(n)
What are some mergesort properties?
Based on a recursive divide-and-conquer approach:
- If the list has 0 or 1 elements, done.
- Divide list into two lists of nearly equal size and recursively sort each half.
- Finally, merge the two sorted halves into one sorted list.
Worst-case running time of Θ(n log n)
One of the best external (disk) sorting algorithms.
Works well for linked lists in addition to arrays/vectors
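The three steps above can be sketched as follows (a minimal non-in-place version; taking from the left half on ties keeps it stable):

```python
def merge_sort(a):
    """Return a new sorted list; Θ(n log n) in all cases, stable,
    but not in-place (the merge needs Θ(n) auxiliary space)."""
    if len(a) <= 1:              # 0 or 1 elements: done
        return a
    mid = len(a) // 2            # divide into two nearly equal halves
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0      # merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= takes from the left on ties: stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

result = merge_sort([5, 2, 8, 1, 9, 3])
```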
What are some quicksort properties?
Quicksort is based on a recursive partitioning about a ‘pivot’:
- Pick any element as the pivot (many options here).
- Partition elements into three groups: less than (possibly equal to) the pivot, equal to the pivot, and greater than (possibly equal to) the pivot.
- Recursively quicksort the first and third groups until they contain at most one element.
In practice, a very fast sorting algorithm (almost twice as fast as mergesort)
On average good O(n lg n), but worst case input data leads to Ω(n²) performance
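The three-way partitioning described above can be sketched as follows (a simple non-in-place version for clarity; a production quicksort would partition in place, and the pivot choice is illustrative):

```python
def quicksort(a):
    """Three-way quicksort: partition around a pivot into
    less / equal / greater, recurse on the two outer groups."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]                   # middle element as pivot
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]     # duplicates never recursed on
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

result = quicksort([3, 6, 1, 3, 9, 2])
```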
What are 3 different ways to choose a quicksort pivot?
- Use passive pivot strategy (e.g. fixed position in range)
- Use active pivot strategy (e.g. median-of-three)
- Use random pivot strategy (more likely O(n lg n) performance)
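The median-of-three strategy can be sketched as follows (the choice of first/middle/last candidates is conventional but illustrative):

```python
def median_of_three(a, lo, hi):
    """Return the index of the median of a[lo], a[mid], a[hi],
    guarding against sorted or reverse-sorted worst-case inputs."""
    mid = (lo + hi) // 2
    # order the three candidate indices by their values, take the middle one
    candidates = sorted([lo, mid, hi], key=lambda i: a[i])
    return candidates[1]

a = [9, 5, 7, 3, 1]
p = median_of_three(a, 0, len(a) - 1)  # compares a[0]=9, a[2]=7, a[4]=1
```

On reverse-sorted input like this, a fixed first-position pivot would be the maximum, while median-of-three picks a middle value.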
What is beadsort and what is its runtime complexity range?
Sorting algorithm which sorts positive integers by representing them in unary as beads on horizontal rows, then uses gravity to slide the beads down vertical rods, sorting them
The algorithm’s run-time complexity ranges from O(1) to O(S), where S is the sum of the input integers
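A beadsort sketch that simulates the gravity step with per-rod bead counts (the counting approach is one common software rendering of the physical model; names are illustrative):

```python
def bead_sort(nums):
    """Sort non-negative integers by simulating beads on vertical rods.
    Rod c receives one bead per input number greater than c; after
    'gravity' stacks them, row r holds a bead on rod c iff rod c has
    more than r beads, so row sizes read off the values in descending order."""
    if not nums:
        return []
    max_val = max(nums)
    # beads stacked on each rod after falling
    rods = [sum(1 for n in nums if n > c) for c in range(max_val)]
    # size of each row of beads, from the bottom up
    rows = [sum(1 for beads in rods if beads > r) for r in range(len(nums))]
    return rows[::-1]  # rows are descending; reverse for ascending order

result = bead_sort([3, 1, 4, 1, 5])
```

The two comprehensions make this O(S) work overall, matching the O(S) end of the complexity range above.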
Quicksort recurrence relations and runtime
Best case: O(n lg n) is when partitioning gives partitions of the same size. Cost is:
T(n) = 2T((n − 1)/2) + n for unique elements
OR
T(n) = 2T(n/3) + n when duplicates allowed
Worst case: O(n²) is when partitioning gives unbalanced partitions.
T(n) = T(n − 1) + n
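The worst-case recurrence can be checked against its Θ(n²) closed form, assuming the base case T(1) = 1:

```python
def T(n):
    """Quicksort worst-case recurrence T(n) = T(n-1) + n, with T(1) = 1.
    This is the cost when every pivot is an extreme element."""
    return 1 if n == 1 else T(n - 1) + n

# closed form: T(n) = n(n + 1)/2, i.e. Θ(n²)
for n in range(1, 50):
    assert T(n) == n * (n + 1) // 2
```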