Algorithms Flashcards
Define Big O notation
f(x) = O(g(x)) means there exist:
* some positive constant M
* some minimum x-value, x_0
such that for all x > x_0:
* f(x) <= M×g(x)
Basically: f(x) = O(g(x)) means
“f(x) grows no faster than g(x), up to a constant factor”
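A worked instance of the definition (the particular f, g, M, and x_0 below are illustrative choices, not from the card):

```latex
% Worked example: f(x) = 3x^2 + 5x is O(x^2), witnessed by M = 4 and x_0 = 5,
% since for all x > 5 we have 5x <= x*x, so:
\[
  3x^2 + 5x \;\le\; 3x^2 + x \cdot x \;=\; 4x^2 \;=\; M \cdot g(x).
\]
```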
Define the Limit Theorem
Let f(x) & g(x) be real-valued functions
Let L = lim (f(x)/g(x)) as x -> infinity
- If L = 0, f(x)=o(g(x)) (and thus Big O too)
- If L = infinity, g(x)=o(f(x)) (and thus Big O too)
- If 0 < L < infinity, f(x)=O(g(x)) and g(x)=O(f(x))
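A quick illustration of the first case (the functions are an example of mine, not from the card):

```latex
% Example: f(x) = x, g(x) = x^2.
\[
  L = \lim_{x \to \infty} \frac{x}{x^2} = \lim_{x \to \infty} \frac{1}{x} = 0,
\]
% so x = o(x^2) by the theorem, and therefore x = O(x^2) as well.
```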
Define L’Hôpital’s rule
Condition:
* lim f(x) = 0 and lim g(x) = 0, OR
* lim f(x) = infinity and lim g(x) = infinity
Then:
lim (f(x)/g(x)) = lim (f'(x)/g'(x))
(i.e., take the derivative of each)
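An illustrative application (functions chosen by me for the example), which also feeds the Limit Theorem above:

```latex
% Both ln(x) and x tend to infinity, so the rule applies:
\[
  \lim_{x \to \infty} \frac{\ln x}{x}
  = \lim_{x \to \infty} \frac{(\ln x)'}{(x)'}
  = \lim_{x \to \infty} \frac{1/x}{1}
  = 0,
\]
% hence ln(x) = o(x) by the Limit Theorem.
```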
Define Little O notation
f(x) = o(g(x)) means
* for EVERY positive ε
There exists a constant x_ε
Such that: f(x) <= ε*g(x) for all x>= x_ε
Basically, f(x) = o(g(x)) means:
g(x) grows at a much faster rate than f(x)
If f(x) = o(g(x)), then f(x) = O(g(x)) too
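To see the definition in action with concrete functions (an illustrative choice, not from the card):

```latex
% Claim: x = o(x^2).
% For every epsilon > 0, take x_epsilon = 1/epsilon; then for all x >= x_epsilon
% we have epsilon*x >= 1, and so:
\[
  x \;\le\; (\varepsilon x) \cdot x \;=\; \varepsilon \cdot x^2 .
\]
```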
What is Big Omega (Ω)
Opposite of Big O
f(x) = Ω(g(x)) if g(x) = O(f(x))
What is Big Theta (Θ)
If two functions are Big O of each other
f(x) = Θ(g(x)) if f(x) = O(g(x)) and g(x) = O(f(x))
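A small example covering both symbols (functions chosen here purely for illustration):

```latex
% With f(n) = 3n^2 + n and g(n) = n^2, for all n >= 1:
\[
  n^2 \;\le\; 3n^2 + n \;\le\; 4n^2 ,
\]
% so f(n) = O(n^2) and n^2 = O(f(n)); hence f(n) = \Theta(n^2) (and also \Omega(n^2)).
```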
What is a recurrence Relation
Is a mathematical equation
Is Not a code/algorithm.
Recursively defines a function’s values in terms of earlier values
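For example (a standard divide-and-conquer recurrence, used here only as an illustration):

```latex
% Merge-sort-style recurrence:
\[
  T(1) = 1, \qquad T(n) = 2\,T(n/2) + n \quad (n > 1).
\]
% It defines T(n) in terms of the earlier value T(n/2); it says nothing about code.
```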
How do you prove the runtime of a function?
- Write out its recurrence relation
- Turn it into closed form (by expanding the recurrences & finding a pattern)
- Do a formal proof to show the recurrence relation is equal to the closed form
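A sketch of the “expand & find a pattern” step for the example recurrence T(n) = 2T(n/2) + n, T(1) = 1 (assuming n is a power of 2):

```latex
% Requires amsmath for align*.
\begin{align*}
  T(n) &= 2T(n/2) + n \\
       &= 4T(n/4) + 2n \\
       &= 8T(n/8) + 3n \\
       &= 2^k\,T(n/2^k) + kn   % pattern after k expansions
\end{align*}
% Stopping at k = log_2(n), where T(n/2^k) = T(1) = 1:
\[
  T(n) = n + n \log_2 n = O(n \log n).
\]
% The formal proof (e.g., by induction) then verifies this closed form.
```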
What is the Master Method
T(n) = aT(n/b) + O(n^d)
a, b, d must be constants
- if log_b(a) < d, then T(n) = O(n^d)
- if log_b(a) = d, then T(n) = O(n^d * log(n))
- if log_b(a) > d, then T(n) = O(n^(log_b(a)))
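A small helper that applies the three cases (a sketch; the function name and output strings are my own, with a, b, d the constants from the card):

```python
import math

def master_method(a: float, b: float, d: float) -> str:
    """Classify T(n) = a*T(n/b) + O(n^d) using the three Master Method cases."""
    c = math.log(a, b)                 # c = log_b(a)
    if math.isclose(c, d):             # work is balanced across recursion levels
        return f"T(n) = O(n^{d} * log(n))"
    elif c < d:                        # work at the root dominates
        return f"T(n) = O(n^{d})"
    else:                              # work at the leaves dominates
        return f"T(n) = O(n^log_{b}({a})) = O(n^{c:.3f})"

# Examples:
print(master_method(2, 2, 1))   # merge sort:    O(n^1 * log(n))
print(master_method(1, 2, 0))   # binary search: O(n^0 * log(n)) = O(log n)
```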
Binary Search runtime?
O(log(n))
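A minimal iterative sketch (assuming a sorted Python list; the function name is mine):

```python
def binary_search(arr, target):
    """Return the index of target in sorted list arr, or -1 if absent. O(log n)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the search range each iteration
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1              # target can only be in the right half
        else:
            hi = mid - 1              # target can only be in the left half
    return -1

# e.g. binary_search([1, 3, 5, 7, 9], 7) -> 3
```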
BubbleSort runtime?
Worst-case runtime: O(n^2)
* if array is in reverse order
Best-case runtime: O(n)
* if array is already sorted
How does BubbleSort work?
- Iterate through the list
- At each step, compare element i to i+1
- If they’re out of order, swap them
- Continue down the list & start over until no swaps are needed
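A sketch of those steps in Python, including the early-exit check that gives the O(n) best case:

```python
def bubble_sort(arr):
    """Sort arr in place. Worst case O(n^2); O(n) if already sorted."""
    n = len(arr)
    for _ in range(n - 1):
        swapped = False
        for i in range(n - 1):
            if arr[i] > arr[i + 1]:                     # compare element i to i+1
                arr[i], arr[i + 1] = arr[i + 1], arr[i] # swap if out of order
                swapped = True
        if not swapped:        # no swaps on this pass -> already sorted, stop
            break
    return arr
```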
InsertionSort runtime?
Worst-case: O(n^2)
Best-case: if array is already sorted
* O(n) with a backwards linear search
* O(n log(n)) with a binary search
How does InsertionSort work?
- Iterate through array
- At each element, figure out where to place it in the elements before it (already sorted)
- Shift appropriate elements
- Move to next element
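A sketch of those steps using the backwards-linear-search variant (the O(n) best case from the runtime card):

```python
def insertion_sort(arr):
    """Sort arr in place. Worst case O(n^2); O(n) if already sorted."""
    for i in range(1, len(arr)):
        key = arr[i]              # element to place among the sorted prefix arr[:i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]   # shift larger elements one slot to the right
            j -= 1
        arr[j + 1] = key          # drop the element into its position
    return arr
```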
SelectionSort runtime?
O(n^2)
For ALL situations (best, average & worst case)
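The card gives only the runtime; here is a minimal sketch of the usual algorithm, which always scans the whole unsorted suffix and is therefore O(n^2) in every case:

```python
def selection_sort(arr):
    """Sort arr in place. Always O(n^2): each pass scans the entire unsorted suffix."""
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):     # find the smallest remaining element
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]   # move it to position i
    return arr
```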