Sorts Flashcards

1
Q

Quicksort Time Complexity

A

best, avg O(n log n) (orange)

worst O(n^2) (red)

2
Q

Quicksort Space Complexity

A

log n (green)

Quicksort recurses to a depth of about log(n) (with good pivot choices), and each recursive call creates a stack frame of constant size.

3
Q

Quicksort - How does it work?

A

How Does It Work?

Choose a pivot. Reorder the array so that all the values less than the pivot value are before the pivot and all the values greater than the pivot value are after the pivot. Now the pivot is in its final position. Apply the same pattern recursively.

Choice of pivot matters greatly. If the last element is always chosen as the pivot, performance degrades to the O(n^2) worst case for already sorted arrays or arrays of identical elements.

Computer architecture favors quicksort because it is a cache-efficient sort. It linearly scans the input and linearly partitions the input, which means you make the most of each cache load before reloading.
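
The steps above can be sketched in a few lines of Python. This is an illustrative sketch (the function name is my own), using the simple Lomuto-style partition with the last element as pivot, which is precisely the pivot choice the previous card warns about:

```python
# Illustrative quicksort sketch with a Lomuto-style partition.
# Last-element pivot: simple, but worst-case-prone on sorted input.
def quicksort(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):
        if arr[j] <= pivot:            # values <= pivot move before it
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # pivot lands in its final position
    quicksort(arr, lo, i - 1)          # recurse on each side of the pivot
    quicksort(arr, i + 1, hi)

data = [5, 2, 9, 1, 5, 6]
quicksort(data)                        # data is now sorted in place
```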

4
Q

Quicksort Features

A

Features

In practice, 2-3x faster than mergesort and heapsort despite having the same best case and avg case Big O

In place

Not stable

5
Q

Why is quicksort 2-3x faster than mergesort and heap sort despite having the same Big O time complexity?

A

Computer architecture favors quicksort because it is a cache-efficient sort. Quicksort processes segments of the array in order. When it loads the first element, that location's cache line is loaded into the cache, so when it needs the next element, that element is likely already cached. Other algorithms, like heapsort, jump around the array a lot, making them less cache-efficient.

6
Q

Mergesort Natural Variant

A

Can get a best case of O(n) using the "natural" variant, which exploits naturally occurring sorted sequences. Rather than starting with sublists of size 1 or 2, split the data into naturally occurring runs.

3,4,2,1,7

becomes

3,4 2 1,7
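
The run-splitting step can be sketched as follows (an illustrative sketch; the function name is my own):

```python
# Sketch of the run-splitting step of natural mergesort: split the
# input into maximal non-decreasing runs.
def natural_runs(arr):
    runs, run = [], [arr[0]]
    for x in arr[1:]:
        if x >= run[-1]:
            run.append(x)      # extend the current non-decreasing run
        else:
            runs.append(run)   # run ended; start a new one
            run = [x]
    runs.append(run)
    return runs

runs = natural_runs([3, 4, 2, 1, 7])   # [[3, 4], [2], [1, 7]]
```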

7
Q

Mergesort Time Complexity

A

Time Complexity

Best, avg, and worst case: n log n

8
Q

Mergesort - How Does It Work?

A

Divides the entire dataset into groups of at most two.

Compares the numbers in each pair, moving the smaller number to the left of the pair.

Once all pairs are sorted, it merges the two leftmost pairs by comparing their leftmost elements, creating a sorted group of four with the smallest numbers on the left and the largest on the right.

This process is repeated until there is only one sorted set.
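
A top-down sketch in Python (the card describes the equivalent bottom-up view, but the merge step is identical; the function name is my own):

```python
# Top-down mergesort sketch: split in half, sort each half, merge.
def mergesort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = mergesort(arr[:mid]), mergesort(arr[mid:])
    # Merge: repeatedly take the smaller front element of the two halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:        # <= keeps equal keys in order (stable)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```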

9
Q

Mergesort Space Complexity

A

Space Complexity

O(n) worst case

10
Q

Mergesort Features

A

Features

Stable (!)

Divide and conquer

11
Q

What’s the best algorithm for sorting? Questions to ask…

A

What can you tell me about the data?

Is it already sorted, mostly sorted, or unsorted?

How large are the data sets likely to be?

Are there duplicate key values?

How expensive are key comparisons?

What are the requirements?

Are we optimizing for best-case, worst-case, or average-case performance?

Does the sort need to be stable?

What about the system?

Is the largest data set to be sorted smaller than the available memory?

12
Q

What is a stable sort?

A

Preserves the input ordering of elements with equal keys
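
A quick illustration with Python's built-in sort, which is stable (the records here are hypothetical data for illustration):

```python
# Python's built-in sort is stable, so records with equal keys keep
# their input order.
records = [("apple", 2), ("banana", 1), ("cherry", 2)]
by_count = sorted(records, key=lambda r: r[1])
# "apple" still comes before "cherry" among the key-2 records:
# [("banana", 1), ("apple", 2), ("cherry", 2)]
```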

13
Q

Selection Sort Time Complexity

A

Time Complexity

Worst, avg, best case: n^2 (red)

14
Q

Selection Sort Space Complexity

A

Space complexity

O(1)

15
Q

Selection Sort - How Does It Work?

A

How does it work?

Loop through array. Find the smallest element then swap it with the element in position 0. Repeat the same process to find the next smallest element and swap it with the item in position 1.

At each iteration, you are selecting the next smallest element and moving it into the sorted portion of the array.
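
The process above can be sketched as (an illustrative sketch; the function name is my own):

```python
# Minimal selection sort sketch.
def selection_sort(arr):
    for i in range(len(arr)):
        # Find the index of the smallest element in the unsorted portion.
        smallest = min(range(i, len(arr)), key=arr.__getitem__)
        arr[i], arr[smallest] = arr[smallest], arr[i]  # swap into place
    return arr
```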

16
Q

Heap Sort - Time Complexity

A

Time Complexity

Best, avg, worst: n log n (orange)

17
Q

Heap Sort Space Complexity

A

Space Complexity

Worst: O(1) (green)

18
Q

Heap Sort Features

A

In place

Not stable

Same time complexity as mergesort but much better space complexity - O(1) vs O(n).

Same space complexity as a selection sort but much better time complexity - O(n log n) vs O(n^2)

19
Q

Heap Sort - How Does It Work?

A

Very similar to selection sort. It divides the data into a sorted and an unsorted region. Instead of looping through an array to find the next smallest element (O(n)), it uses a heap data structure (O(log n)).
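
The idea can be sketched with the standard-library heap (a textbook in-place heapsort sifts within the array itself; this sketch, with a function name of my own, just shows the heap-driven selection):

```python
import heapq

# Heapsort sketch: heapify is O(n), and each of the n pops is O(log n).
def heapsort(arr):
    heapq.heapify(arr)                 # build a min-heap in place
    return [heapq.heappop(arr) for _ in range(len(arr))]
```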

20
Q

Counting Sort Time Complexity

A

Time Complexity

Best, avg, and worst case O(n+k) (green), where k is the maximum value of the key

21
Q

Counting Sort Space Complexity

A

Space Complexity

O(k) (yellow) where k is the maximum value of the key

22
Q

Counting Sort - How Does It Work?

A

How It Works

Not comparison based

Allocate an array, numCounts, of size k, where the indices represent the range of the input and the values represent the number of times each number appears.

Make one pass through the array to fill in numCounts.

Create sorted array and fill in the sorted numbers.
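
The three steps above can be sketched as (an illustrative sketch; the names are my own):

```python
# Counting sort sketch for integers in the range 0..k.
def counting_sort(arr, k):
    num_counts = [0] * (k + 1)
    for x in arr:                      # one pass to fill in the counts
        num_counts[x] += 1
    out = []
    for value, count in enumerate(num_counts):
        out.extend([value] * count)    # emit each value count times
    return out
```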

23
Q

Sorts with O(n^2) (red) Time Complexity

A

Quicksort: worst case

Selection sort: best, avg, and worst case

Insertion sort: avg and worst case

Bubble sort: avg and worst case

24
Q

Sorts with O(log n) (light green) Space Complexity

A

Quicksort

25
Q

Sorts with O(n) (yellow) Space Complexity

A

Mergesort

26
Q

Stable Sorts

A

Mergesort

Counting sort

Radix sort

Insertion sort

27
Q

Sorts with O(1) (dark green) Space Complexity (in-place sorts)

A

Heapsort

Selection sort

Insertion sort

Bubble sort

28
Q

Sorts with O(n log n) (orange) Time Complexity

A

Quicksort: best, avg

Mergesort: best, avg, worst

Heapsort: best, avg, worst
29
Q

Radix Sort Time Complexity

A

Time Complexity

O(nk), where k is the key length (number of digits)
30
Q

Radix Sort Space Complexity

A

Space Complexity

O(n+k), where k is the range of keys
31
Q

Radix Sort - How Does It Work?

A

How Does It Work

Not comparison based

Groups keys by significant position and value

Least significant digit (LSD) radix sort: short keys come before long keys; keys of the same length sort lexicographically

Most significant digit (MSD) radix sort: uses lexicographic order. b, ba, c...
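
The LSD variant can be sketched for non-negative integers, one decimal digit per pass (an illustrative sketch; the function name is my own, and each pass must bucket stably for the result to be correct):

```python
# LSD radix sort sketch: bucket by each decimal digit, least
# significant first; stable bucketing preserves earlier passes.
def radix_sort(arr):
    digits = len(str(max(arr)))
    for d in range(digits):
        buckets = [[] for _ in range(10)]
        for x in arr:
            buckets[(x // 10 ** d) % 10].append(x)
        arr = [x for bucket in buckets for x in bucket]
    return arr
```
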
32
Q

Insertion Sort Time Complexity

A

Time Complexity

Best case: O(n) (yellow)

Average case: O(n^2)

Worst case: O(n^2)
33
Q

Insertion Sort Space Complexity

A

Space Complexity

O(1)
34
Q

Insertion Sort Features

A

Features

Stable

In place

Efficient for small data sets

Much less efficient than quicksort, heapsort, and mergesort for large datasets
35
Q

Insertion Sort - How Does It Work?

A

Loop through the array beginning at item 1. (Item 0 becomes the sorted portion of the array.) At each iteration, *insert* the current element into sorted position at the front of the array.

(If asked to sort a deck of cards, most people would do it this way.)
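
The loop above can be sketched as (an illustrative sketch; the function name is my own):

```python
# Insertion sort sketch.
def insertion_sort(arr):
    for i in range(1, len(arr)):       # arr[:1] starts as the sorted portion
        current = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > current:
            arr[j + 1] = arr[j]        # shift larger elements right
            j -= 1
        arr[j + 1] = current           # insert into sorted position
    return arr
```
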
36
Q

Bubble Sort Time Complexity

A

Best case: O(n) (yellow)

Avg, worst case: O(n^2) (red)
37
Q

Bubble Sort - How Does It Work?

A

A comparison-based sorting algorithm.

It iterates left to right, comparing each adjacent pair and moving the smaller element to the left.

It repeats this process until a full pass moves no elements.
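
The passes above can be sketched as (an illustrative sketch; the function name is my own):

```python
# Bubble sort sketch: keep passing until a full pass makes no swaps.
def bubble_sort(arr):
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(arr) - 1):
            if arr[i] > arr[i + 1]:    # move the smaller element left
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
    return arr
```
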
38
Q

Bubble Sort Space Complexity

A

O(1) (dark green)
39
Q

Java Collections.sort()

A

This implementation is a stable, adaptive, iterative mergesort that requires far fewer than n lg(n) comparisons when the input array is partially sorted, while offering the performance of a traditional mergesort when the input array is randomly ordered. If the input array is nearly sorted, the implementation requires approximately n comparisons. Temporary storage requirements vary from a small constant for nearly sorted input arrays to n/2 object references for randomly ordered input arrays.
40
Q

Adaptive Sort

A

Takes advantage of presortedness in the input sequence
41
Q

Java Arrays.sort()

A

Quicksort implementation (dual-pivot quicksort, for primitive arrays)

Time complexity

Best, avg: n log n

Worst: n^2

Space complexity: log n
42
Q

When to use mergesort over quicksort

A

Mergesort is stable by design; equal elements keep their original order.

Mergesort is well suited to parallel (multithreaded) implementation.

Mergesort uses about 30% fewer comparisons than quicksort. This is an often overlooked advantage, because a comparison can be quite expensive (e.g. when comparing several fields of database rows).