Sorting Flashcards
how does bubble sort work
- each pass bubbles the largest remaining value to the end of the unsorted region
- an optimization can be made by tracking the index where the last swap was performed and ending the next pass there
- in place
- stable (only adjacent elements are swapped, and only on strict inequality)
- adaptive (the next pass's end index shrinks; a pass with no swaps finishes the sort)
- time complexity: O(n^2) worst and average case, O(n) best case
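The pass-and-shrink idea above can be sketched in Python (function name and details are mine, not from the card):

```python
def bubble_sort(arr):
    """Bubble sort with the last-swap optimization: each pass only
    needs to run up to where the previous pass made its final swap."""
    end = len(arr) - 1
    while end > 0:
        last_swap = 0
        for i in range(end):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                last_swap = i  # everything past this is already in place
        end = last_swap  # next pass stops here; 0 means we're done
    return arr
```

On already-sorted input the first pass makes no swaps, `end` drops to 0, and the loop exits after one O(n) pass.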
how does insertion sort work
- iterate the array from left to right
- for every iteration, start an inner loop that walks from the current index back toward the start
- if the current element is smaller than its left neighbor, swap them; stop once no swap is needed
- this way the left part of the array stays sorted while the unsorted right part gradually shrinks
- in place
- stable
- adaptive (inner loop stops when no swap is needed)
- time complexity: O(n^2) worst and average case and O(n) best case
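The swap-based variant described above could look like this in Python (a minimal sketch; the name is mine):

```python
def insertion_sort(arr):
    """Insertion sort: grow a sorted prefix by walking each new
    element left until it meets a smaller-or-equal neighbor."""
    for i in range(1, len(arr)):
        j = i
        # inner loop stops early when no swap is needed (adaptive)
        while j > 0 and arr[j] < arr[j - 1]:
            arr[j], arr[j - 1] = arr[j - 1], arr[j]
            j -= 1
    return arr
```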
how does selection sort work
- loop from the end to the start of the array
- for every iteration, start an inner loop scanning from the current index down to the start of the array
- find the maximal value and swap it into the current index (the inner loop's starting position)
- in place
- unstable (long-range swap)
- not adaptive
- time complexity: O(n^2) in all cases
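A Python sketch of this end-to-start, find-the-max variant (assumed details are mine):

```python
def selection_sort(arr):
    """Selection sort: repeatedly move the maximum of the unsorted
    prefix to the end of that prefix, shrinking it from the right."""
    for end in range(len(arr) - 1, 0, -1):
        max_idx = 0
        for i in range(1, end + 1):  # scan the unsorted prefix arr[0..end]
            if arr[i] > arr[max_idx]:
                max_idx = i
        # long-range swap: this is what makes selection sort unstable
        arr[max_idx], arr[end] = arr[end], arr[max_idx]
    return arr
```

The scan always covers the whole unsorted prefix regardless of input order, which is why the algorithm is not adaptive.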
how does merge sort work
- make use of recursion: split the array into halves until each piece has at most 1 element, then merge the halves back together until one final sorted array remains
pseudocode:
mergeSort(arr):
    mergeSortHelper(arr)

void mergeSortHelper(arr):
    if (arr.length <= 1) return
    leftArr is left half of arr
    rightArr is right half of arr
    mergeSortHelper(leftArr)
    mergeSortHelper(rightArr)
    initialize i, j, k to be 0
    loop through leftArr and rightArr concurrently until one of them finishes:
        if leftArr[i] <= rightArr[j]:
            arr[k] = leftArr[i]
            i++
            k++
        else:
            arr[k] = rightArr[j]
            j++
            k++
    loop through leftArr from i to size:
        arr[k] = leftArr[i]
        i++
        k++
    loop through rightArr from j to size:
        arr[k] = rightArr[j]
        j++
        k++
- time complexity: O(n log n) in all cases
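The pseudocode translates to runnable Python like this (sorts in place, as the pseudocode does; names are mine):

```python
def merge_sort(arr):
    """Sort arr in place by recursive halving and merging."""
    if len(arr) <= 1:
        return
    mid = len(arr) // 2
    left, right = arr[:mid], arr[mid:]  # copy the two halves
    merge_sort(left)
    merge_sort(right)
    i = j = k = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stable)
            arr[k] = left[i]
            i += 1
        else:
            arr[k] = right[j]
            j += 1
        k += 1
    while i < len(left):   # drain whichever half still has elements
        arr[k] = left[i]
        i += 1
        k += 1
    while j < len(right):
        arr[k] = right[j]
        j += 1
        k += 1
```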
how does quick sort work
- implementation
1. pick a random pivot index
2. swap the first element with the pivot
3. initialize head and tail pointers
4. have the pointers move toward each other while checking inversion (while not crossed, head points to a value greater than pivot and tail points to a value smaller than pivot)
5. if inversion occurs, swap head and tail values and continue the loop
6. when head and tail have crossed, the point of division occurs
7. swap the point of division with the pivot, then run the subroutine on the sub-array to the left of the pivot and on the sub-array to the right of the pivot
8. the recursion terminates when the array passed in has fewer than 2 elements
- time complexity: O(n log n) in the best case, where the pivot is always the median of the subarray, and on average; O(n^2) in the worst case, where the pivot is always the min/max of the subarray
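The numbered steps above can be sketched in Python with Hoare-style pointers (a minimal illustration; the function name and exact pointer conditions are my choices):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """In-place quicksort following the steps on this card."""
    if hi is None:
        hi = len(arr) - 1
    if hi - lo < 1:  # step 8: fewer than 2 elements, nothing to do
        return
    p = random.randint(lo, hi)         # step 1: random pivot index
    arr[lo], arr[p] = arr[p], arr[lo]  # step 2: move pivot to the front
    pivot = arr[lo]
    head, tail = lo + 1, hi            # step 3: head and tail pointers
    while head <= tail:                # step 4: move toward each other
        while head <= tail and arr[head] <= pivot:
            head += 1
        while head <= tail and arr[tail] >= pivot:
            tail -= 1
        if head < tail:                # step 5: inversion, so swap
            arr[head], arr[tail] = arr[tail], arr[head]
    # step 6/7: pointers have crossed; tail is the point of division
    arr[lo], arr[tail] = arr[tail], arr[lo]
    quick_sort(arr, lo, tail - 1)
    quick_sort(arr, tail + 1, hi)
```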
how does radix sort work
- initialize an array of 19 linked lists named buckets, where index i holds signed digit i - 9: index 0 denotes -9, index 9 denotes 0, and index 18 denotes 9
- begin a loop that runs once per digit position, terminating when the largest data (by absolute value) runs out of digits
  - loop through the input array and put each item into a bucket based on its current digit (ones place first, then tens, and so on)
  - loop through buckets from index 0 to index 18 and put each item back into the array, preserving the order within each bucket (this keeps each pass stable)
- time complexity: O(kn) in all cases, where k is the number of digits in the largest value
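A least-significant-digit sketch in Python, using the 19-bucket signed-digit scheme above (function name and digit extraction are my assumptions):

```python
def radix_sort(arr):
    """LSD radix sort with 19 buckets for signed digits -9..9
    (bucket index = digit + 9), so negatives need no separate pass."""
    if not arr:
        return arr
    # number of passes = digit count of the largest magnitude
    passes = len(str(max(abs(x) for x in arr)))
    place = 1
    for _ in range(passes):
        buckets = [[] for _ in range(19)]
        for x in arr:
            digit = (abs(x) // place) % 10
            if x < 0:
                digit = -digit  # keep the digit's sign for negatives
            buckets[digit + 9].append(x)  # appending keeps each pass stable
        arr[:] = [x for bucket in buckets for x in bucket]
        place *= 10
    return arr
```

Stability of each pass is what lets later (more significant) digit passes finalize the order.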
how does quick select work
- similar to quicksort, except this algorithm's purpose is to locate the kth smallest element: partition as in quicksort, but recurse only into the side containing index k, terminating when the point of division lands at index k
- time complexity: O(n) in the best case, where the pivot is always the median of the subarray, and on average; O(n^2) in the worst case, where the pivot is always the min/max of the subarray
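A Python sketch reusing the quicksort partition but descending into only one side (iterative form; names and 0-based k are my choices, and note it reorders the list):

```python
import random

def quick_select(arr, k):
    """Return the k-th smallest element of arr (k = 0 gives the minimum).
    Partitions like quicksort but only keeps the side holding index k."""
    lo, hi = 0, len(arr) - 1
    while True:
        if lo == hi:
            return arr[lo]
        p = random.randint(lo, hi)
        arr[lo], arr[p] = arr[p], arr[lo]  # random pivot to the front
        pivot = arr[lo]
        head, tail = lo + 1, hi
        while head <= tail:
            while head <= tail and arr[head] <= pivot:
                head += 1
            while head <= tail and arr[tail] >= pivot:
                tail -= 1
            if head < tail:
                arr[head], arr[tail] = arr[tail], arr[head]
        arr[lo], arr[tail] = arr[tail], arr[lo]  # tail = point of division
        if tail == k:
            return arr[tail]
        if k < tail:
            hi = tail - 1  # answer is in the left part
        else:
            lo = tail + 1  # answer is in the right part
```

Discarding one side each round gives the n + n/2 + n/4 + ... = O(n) expected cost, versus quicksort's recursion into both sides.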