Algorithms 1 Flashcards
Linear Search
Desc: Goes through the array element by element until it finds the desired value
Worst Case runtime: O(n)
Best Case runtime: O(1)
Average Case runtime: O(n)
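A minimal Python sketch of the idea (the function name and the -1-on-miss convention are illustrative choices, not from the card):

def linear_search(arr, target):
    # Scan element by element until the desired value is found.
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1  # not found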
Binary Search
Desc: Requires a sorted array. Compare the value in the middle of the array with the value being searched for. If the value being searched for is less than the middle value, discard the upper half and repeat the process on the lower half; if it is greater, discard the lower half and repeat on the upper half.
Worst Case runtime: O(log n)
Best Case runtime: O(1)
Average Case runtime: O(log n)
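A minimal iterative Python sketch (assumes arr is already sorted in ascending order; names are illustrative, not from the card):

def binary_search(arr, target):
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif target < arr[mid]:
            hi = mid - 1  # discard the upper half
        else:
            lo = mid + 1  # discard the lower half
    return -1  # not found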
Insertion Sort
Desc: Go through each element in the list and check whether the one before it is larger. If it is, swap them and repeat the process with that element and the ones before it until it is in place. Not dynamic, but it is stable and in place.
Worst Case runtime: O(n^2)
Best Case runtime: O(n)
Average Case runtime: O(n^2)
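A short in-place Python sketch (the function name is an illustrative choice, not from the card):

def insertion_sort(arr):
    # Stable and in place: swap each element backwards while the one before it is larger.
    for i in range(1, len(arr)):
        j = i
        while j > 0 and arr[j - 1] > arr[j]:
            arr[j - 1], arr[j] = arr[j], arr[j - 1]
            j -= 1
    return arr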
Mergesort
Desc: Recursively split the array into two halves, sort each half, and merge the sorted halves back together into one array in the correct order. Stable but not in place.
Worst Case runtime: O(n log n)
Best Case runtime: O(n log n)
Average Case runtime: O(n log n)
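A minimal Python sketch that returns a new sorted list (the structure is my own choice, not from the card):

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves back together in order.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stability)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged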
Divide and Conquer Algorithm
As long as it performs two recursive calls on input sizes of at most n/2 and the conquer (combine) step takes O(n) time, the runtime is:
O(n log n)
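A hedged worked recurrence for why this holds: T(n) <= 2·T(n/2) + c·n for some constant c. Each level of the recursion tree does about c·n total work, and there are roughly log2(n) levels before the subproblems reach constant size, so T(n) <= c·n·log2(n) + O(n) = O(n log n).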
Heapsort
Desc: Build a max-heap (a complete binary tree, usually stored in the array itself) using a build function: for each parent, if a child is larger than the parent, swap the parent with its largest child, then repeat the process at the position it was swapped into. Work upwards through the tree until the root holds the largest element. To sort completely, swap the root with the last element, decrease the heap size by one, run the heapify function again to bring the largest remaining element to the top, and repeat as before. In place but not stable.
Worst Case runtime: O(n log n)
Best Case runtime: O(n log n)
Average Case runtime: O(n log n)
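An in-place Python sketch of the two phases (the helper name sift_down is my own, not from the card):

def heapsort(arr):
    def sift_down(heap_size, i):
        # Swap position i with its largest child until the max-heap property holds below i.
        while True:
            largest, left, right = i, 2 * i + 1, 2 * i + 2
            if left < heap_size and arr[left] > arr[largest]:
                largest = left
            if right < heap_size and arr[right] > arr[largest]:
                largest = right
            if largest == i:
                return
            arr[i], arr[largest] = arr[largest], arr[i]
            i = largest
    n = len(arr)
    # Phase 1: build the max-heap bottom-up from the last internal node.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(n, i)
    # Phase 2: move the root (largest) to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(end, 0)
    return arr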
Quicksort
Desc: First choose a good pivot. Then rearrange the array so that every element less than the pivot is to the left of the pivot and every element greater than the pivot is to the right. Then sort each side of the pivot recursively. Can be seen as an in-place counterpart of mergesort: in place but not stable. The runtime depends on the pivot and the resulting split of the array. The ideal pivot is the median, but finding it is not efficient in practice, so we can select a random pivot instead.
With a random pivot:
Worst Case runtime: O(n^2)
Best Case runtime: O(n log n)
Average Case runtime: O(n log n)
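An in-place Python sketch with a random pivot (the Lomuto-style partition here is my own choice of scheme, not from the card):

import random

def quicksort(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    # Pick a random pivot and move it to the end before partitioning.
    p = random.randint(lo, hi)
    arr[p], arr[hi] = arr[hi], arr[p]
    pivot = arr[hi]
    store = lo
    for i in range(lo, hi):
        if arr[i] < pivot:  # elements less than the pivot go to the left side
            arr[i], arr[store] = arr[store], arr[i]
            store += 1
    arr[store], arr[hi] = arr[hi], arr[store]  # put the pivot in its final spot
    quicksort(arr, lo, store - 1)
    quicksort(arr, store + 1, hi)
    return arr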
Countingsort
Desc: First count how many times each value appears in the array and store the count at that value's index in a new count array (e.g. two 3s store the number 2 at index 3). Then turn the counts into running totals so each index holds how many values are less than or equal to it (e.g. with one 0 and two 1s, index 0 holds 1 and index 1 holds 3). Finally, go through the input and place each element in the output array at position (running total - 1), decrementing that total as you go (e.g. a 1 would be placed at output index 2 and its total reduced to 2). Stable but not in place.
k = Range of input values
Worst Case runtime: O(n + k)
Best Case runtime: O(n + k)
Average Case runtime: O(n + k)
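A Python sketch of the three passes (assumes non-negative integer keys; the parameter k is the largest value, matching the range of input values above):

def counting_sort(arr, k):
    counts = [0] * (k + 1)
    for value in arr:
        counts[value] += 1             # pass 1: count occurrences
    for v in range(1, k + 1):
        counts[v] += counts[v - 1]     # pass 2: running totals (how many are <= v)
    output = [0] * len(arr)
    for value in reversed(arr):        # walking the input in reverse keeps the sort stable
        counts[value] -= 1
        output[counts[value]] = value  # pass 3: place at (running total - 1)
    return output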
Radixsort
Desc: Uses a stable sorting algorithm to sort n d-digit numbers where each digit can take one of b possible values (e.g. b = 2, d = 5 gives numbers like 01101). Sorts one digit at a time, from the least significant digit to the most significant, using the stable sort on each pass.
Sorts the numbers in O(d(n + b)) time if the stable sort used on each digit takes O(n + b) time.
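A least-significant-digit Python sketch using counting sort on each digit (assumes non-negative integers; names and defaults are illustrative, not from the card):

def radix_sort(arr, d, b=10):
    for digit in range(d):
        divisor = b ** digit
        counts = [0] * b
        for value in arr:
            counts[(value // divisor) % b] += 1
        for v in range(1, b):
            counts[v] += counts[v - 1]
        output = [0] * len(arr)
        for value in reversed(arr):    # reversed input keeps each pass stable
            idx = (value // divisor) % b
            counts[idx] -= 1
            output[counts[idx]] = value
        arr = output                   # now sorted by the digits processed so far
    return arr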
Big Omega Notation
f(n) ∈ Ω(g(n)) if there exist constants c > 0 and n0 such that 0 <= c·g(n) <= f(n)
for all n >= n0
Big O notation
f(n) ∈ O(g(n)) if there exist constants c > 0 and n0 such that 0 <= f(n) <= c·g(n)
for all n >= n0
Big Theta Notation
f(n) ∈ Θ(g(n)) if there exist constants c1, c2 > 0 and n0 such that
0 <= c1·g(n) <= f(n) <= c2·g(n)
for all n >= n0
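A hedged worked example (the numbers here are my own, not from the cards): f(n) = 3n^2 + 5n is in Θ(n^2), since 3n^2 <= 3n^2 + 5n for all n, and 3n^2 + 5n <= 4n^2 whenever n >= 5, so the definition holds with c1 = 3, c2 = 4 and n0 = 5.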
f ∈ Θ(g) is equivalent to g ∈ Θ(f)
True
f ∈ Θ(g) is equivalent to f ∈ O(g) and g ∈ O(f)
True
f ∈ Ω(g) is equivalent to g ∈ O(f)
True
If f, g ∈ O(h), then f + g ∈ O(h)
True
If f ∈ O(h1) and g ∈ O(h2), then f + g ∈ O(h1 + h2)
True
If f ∈ O(h1) and g ∈ O(h2), then f · g ∈ O(h1 · h2)
True