Sorting Algorithms Flashcards
Quick Sort Time & Space Complexity
Quick Sort (Divide & Conquer)
- Time Complexity
- Best: O(n log n)
- AVG: O(n log n)
- Worst: O(n^2) - when the pivot is always the smallest or largest element
- Space Complexity
- O(log n) - due to the recursion stack
Merge Sort Time & Space Complexity
Merge Sort (Divide & Conquer)
- Time Complexity
- Best: O(n log n)
- AVG: O(n log n)
- Worst: O(n log n) - same for all cases
- Space Complexity
- O(n) - for temporary arrays used during the merge process
Heap Sort Time & Space Complexity
- Time Complexity
- Best: O(n log n)
- AVG: O(n log n)
- Worst: O(n log n)
- Space Complexity
- O(n) - when using heapq (heapify a copy of the array, then pop repeatedly)
- O(1) - when sorting in place with a manual max heap
When to use Merge Sort
large datasets
when stability is needed (equal elements must keep their relative order after sorting)
linked lists
When not to use merge sort
small datasets (the overhead isn't worth it; overhead = extra processing and memory)
memory constrained environments
Strengths of Merge Sort
Stable sorting (equal elements keep their relative order)
Guaranteed O(n log n) time complexity
good for linked lists
weaknesses of merge sort
O(n) space complexity (requires extra memory)
slower than quicksort on average
when to use quicksort
small to medium datasets
when average case performance matters
in-place sorting desired (modifies the input data directly, space-efficient)
when not to use quicksort
worst-case O(n^2) performance is possible (e.g., already-sorted data with a naive first- or last-element pivot)
when stability is needed (it may reorder elements of equal value)
strengths of quick sort
average case O(n log n) time
in place sorting: O(log n) space
weaknesses of quick sort
worst case O(n^2) time complexity
not stable
deep recursion can cause a stack overflow on large datasets (recursion depth is O(n) in the worst case)
when to use heap sort
when guaranteed O(n log n) performance is needed
memory constrained environments
streaming data (continuous data flow)
when not to use heap sort
for small datasets (often slower than other algorithms)
when stability is required (it may reorder elements of equal value)
strengths of heapsort
guaranteed O(n log n) time complexity
useful for priority queues
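The priority-queue connection can be shown with Python's heapq module, which maintains the heap property on a plain list (the task names and priorities here are made-up examples):

```python
import heapq

# heapq keeps a min-heap invariant on an ordinary list, which is the
# standard way to build a priority queue in Python.
tasks = []
heapq.heappush(tasks, (2, "write tests"))
heapq.heappush(tasks, (1, "fix bug"))
heapq.heappush(tasks, (3, "deploy"))

# heappop always returns the smallest item, i.e. the highest priority.
priority, task = heapq.heappop(tasks)  # → (1, "fix bug")
```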
weaknesses of heap sort
not stable
slow than quicksort on average
explain merge sort
Merge sort recursively splits the data into halves until every element stands alone, like the leaves of a tree.
It then merges the sorted halves back together level by level until the whole array is sorted.
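A minimal sketch of this split-then-merge process (this readable version returns a new list rather than sorting in place):

```python
def merge_sort(arr):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(arr) <= 1:
        return arr

    # Split into halves and sort each half recursively.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # Merge the two sorted halves; taking from the left on ties keeps
    # the sort stable.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The O(n) extra space shows up in the `merged` lists built during each merge step.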
explain quick sort
Quick sort picks a pivot and partitions the data into elements smaller and larger than the pivot, then recursively sorts each partition; once every partition is sorted, the whole array is sorted. The partitions are not necessarily halves - a bad pivot can make them very lopsided, which is where the O(n^2) worst case comes from.
explain heap sort
Heap sort first turns the array into a max heap.
It then repeatedly swaps the root (the maximum) with the last element of the heap, shrinks the heap by one, and sifts the new root down to restore the heap property; the sorted portion grows from the back of the array.
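Both variants mentioned above can be sketched in Python - a heapq-based version that uses O(n) extra space, and a manual in-place max-heap version that uses O(1):

```python
import heapq

def heap_sort_heapq(arr):
    # heapq version: copies the array, so O(n) extra space.
    heap = list(arr)
    heapq.heapify(heap)  # build a min-heap in O(n)
    # n pops at O(log n) each → O(n log n) total.
    return [heapq.heappop(heap) for _ in range(len(heap))]

def heap_sort_inplace(arr):
    # Manual max-heap version: sorts arr in place, O(1) extra space.
    n = len(arr)

    def sift_down(root, end):
        # Push arr[root] down until the heap property holds in arr[:end].
        while (child := 2 * root + 1) < end:
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1  # pick the larger child
            if arr[root] >= arr[child]:
                return
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    # Build the max heap bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly move the max to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr
```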
Timsort
used in Python’s built-in sort() and sorted() functions. It’s a hybrid sorting algorithm that combines merge sort and insertion sort. Handles partially sorted and unsorted data well.
Is Timsort stable?
Timsort is a stable sort, meaning it preserves the relative order of equal elements
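Stability is easy to demonstrate with sorted() (the record data here is an arbitrary example):

```python
records = [("apple", 3), ("banana", 1), ("cherry", 3), ("date", 1)]

# Sort by the number only; records with equal keys keep their
# original relative order because Timsort is stable.
by_count = sorted(records, key=lambda r: r[1])
# → [("banana", 1), ("date", 1), ("apple", 3), ("cherry", 3)]
```

Note that "apple" still comes before "cherry", and "banana" before "date", just as in the input.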
Difference between sort() and sorted()
sort() is a method on lists; it modifies the original list in place (and returns None).
sorted() is a built-in function that accepts any iterable (tuples, lists, strings, etc.). It returns a new sorted list without modifying the original iterable.
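The difference in a few lines:

```python
nums = [3, 1, 2]
new_list = sorted(nums)    # new list; nums is unchanged
# nums is still [3, 1, 2]; new_list is [1, 2, 3]

nums_copy = [3, 1, 2]
result = nums_copy.sort()  # sorts in place and returns None
# nums_copy is now [1, 2, 3]; result is None

# sorted() also works on any iterable, not just lists:
letters = sorted("cab")    # → ['a', 'b', 'c']
```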
Time complexity of Timsort
Best: O(n) - when already sorted (timsort detects this)
AVG Case: O(n log n) - timsort behaves like merge sort
Worst Case: O(n log n)
Space complexity of Timsort
O(n) - requires space for temporary storage during merging phase
- sorted() allocates a new n-element list alongside the input, so its peak memory is roughly 2n, which still simplifies to O(n) in big-O notation. It uses more space than sort(), which reorders the original list without creating a new one.