Big O Flashcards
State the master method and define each of the variables.
T(n) <= aT(n/b) + O(n^d)
a = Number of recursive calls
b = Factor by which the input size shrinks in each recursive call
d = Exponent on the work done outside the recursive calls: 0 is constant time, 1 is linear, 2 is quadratic, etc.
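The three cases below can be collected into one small helper. This is a minimal sketch (the function name and output strings are my own, not a standard API) that classifies a recurrence T(n) = aT(n/b) + O(n^d) by comparing a with b^d:

```python
# Hypothetical helper: classify T(n) = a*T(n/b) + O(n^d) via the master method.
import math

def master_method(a, b, d):
    """Return the asymptotic running time for T(n) = a*T(n/b) + O(n^d)."""
    if a == b ** d:
        return f"O(n^{d} * log n)"            # same work at every level
    elif a < b ** d:
        return f"O(n^{d})"                    # root level dominates
    else:
        return f"O(n^{math.log(a, b):.2f})"   # leaves dominate: O(n^log_b(a))

# Merge sort: 2 recursive calls on half the input, linear merge work.
print(master_method(2, 2, 1))  # O(n^1 * log n)
```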
What does the master method form a = b^d represent? What is its running time?
Total amount of work is the same at each level of the recursion tree (but spread out among more sub-problems).
O(n^d * log(n))
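Merge sort is the canonical a = b^d example: two recursive calls on half the input (a = 2, b = 2) plus a linear merge (d = 1), so a = b^d = 2 and T(n) = O(n log n). A minimal sketch:

```python
# Merge sort: T(n) = 2T(n/2) + O(n) -> a = 2, b = 2, d = 1, so a = b^d.
def merge_sort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])  # 2 calls on n/2
    # O(n) merge work at each level of the recursion tree
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```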
What does the master method form a < b^d represent? What is its running time?
Total amount of work decreases at each level of the recursion tree, so most of the work is done at the root level.
O(n^d), which is the amount of work done at the root level
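An idealized selection routine illustrates a < b^d: one recursive call on half the input (a = 1, b = 2) plus linear partition work (d = 1), so a = 1 < b^d = 2 and T(n) = O(n). This is a toy sketch; it forces the perfect n/2 split with a median pivot, which real quickselect only achieves in expectation:

```python
# Toy selection: T(n) = T(n/2) + O(n) -> a = 1 < b^d = 2, so T(n) = O(n).
def select(xs, k):
    """Return the k-th smallest element (0-indexed) of xs."""
    if len(xs) == 1:
        return xs[0]
    pivot = sorted(xs)[len(xs) // 2]   # median pivot, only to force the n/2 split (toy!)
    lo = [x for x in xs if x < pivot]  # O(n) partition work at this level
    hi = [x for x in xs if x > pivot]
    eq = len(xs) - len(lo) - len(hi)
    if k < len(lo):
        return select(lo, k)           # a single recursive call on ~half the input
    elif k < len(lo) + eq:
        return pivot
    else:
        return select(hi, k - len(lo) - eq)

print(select([7, 1, 5, 3, 9], 2))  # 5 (the median)
```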
What does the master method form a > b^d represent? What is its running time?
Total amount of work increases at each level of the recursion tree, so most of the work is done where there are the most leaves, i.e. at the bottom of the tree.
O(n^(log_b(a))), which is proportional to the number of leaves
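Karatsuba multiplication is a classic a > b^d example: three recursive calls on half-size numbers (a = 3, b = 2) plus linear combining work (d = 1), so a = 3 > b^d = 2 and T(n) = O(n^(log_2 3)) ≈ O(n^1.585). A minimal sketch for non-negative integers:

```python
# Karatsuba: T(n) = 3T(n/2) + O(n) -> a = 3 > b^d = 2, so O(n^log2(3)).
def karatsuba(x, y):
    if x < 10 or y < 10:                   # base case: single-digit factor
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    hi_x, lo_x = divmod(x, 10 ** m)        # split each number in half
    hi_y, lo_y = divmod(y, 10 ** m)
    a = karatsuba(hi_x, hi_y)              # 3 recursive calls instead of 4...
    c = karatsuba(lo_x, lo_y)
    b = karatsuba(hi_x + lo_x, hi_y + lo_y) - a - c
    return a * 10 ** (2 * m) + b * 10 ** m + c  # ...plus O(n) combining work

print(karatsuba(1234, 5678))  # 7006652
```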
What is a starting point (usually ok estimate) for recursive run times?
O(branches^depth)
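Naive recursive Fibonacci is the standard illustration of this estimate: each call makes 2 recursive calls (branches = 2) and the tree is roughly n deep, giving approximately O(2^n):

```python
# Naive Fibonacci: 2 branches per call, depth ~ n -> roughly O(branches^depth) = O(2^n).
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)  # two recursive branches at every call

print(fib(10))  # 55
```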
What are the two main classifications of run time?
Polynomial and super-polynomial.
What is a polynomial runtime? How feasible is it? Give a few examples.
Anything where the variable isn’t in the exponent. Usually very feasible: lg n, n, 2n; a bit slow but doable if you really need it: n^2, n^3.
What is a super-polynomial runtime? How feasible is it? Give a few examples.
Anything where the variable is in the exponent. Basically infeasible on a large problem set, e.g. brute-forcing the traveling salesman problem. Examples: 2^n, 3^n.
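A brute-force TSP sketch makes the infeasibility concrete: it checks all (n-1)! tours, which is fine for the tiny hypothetical distance matrix below but hopeless once n reaches even a few dozen cities:

```python
# Brute-force TSP: tries all (n-1)! tours -- super-polynomial growth.
from itertools import permutations

def tsp_brute_force(dist):
    """dist[i][j] = distance from city i to city j; returns the shortest round-trip length."""
    n = len(dist)
    best = float("inf")
    for perm in permutations(range(1, n)):  # (n-1)! orderings of the other cities
        tour = (0,) + perm + (0,)           # start and end at city 0
        length = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        best = min(best, length)
    return best

# Tiny hypothetical 4-city distance matrix: 3! = 6 tours here, but n = 30 means 29! tours.
dist = [[0, 1, 4, 6],
        [1, 0, 2, 5],
        [4, 2, 0, 3],
        [6, 5, 3, 0]]
print(tsp_brute_force(dist))  # 12
```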
How do Big O and tilde notation differ?
Big O drops constant factors entirely; tilde notation keeps the coefficient of the dominant term. For example, n^2/2 + n is O(n^2) but ~n^2/2.
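Selection sort makes this concrete: it performs exactly n(n-1)/2 comparisons, which is ~n^2/2 in tilde notation but simply O(n^2) in Big O. A sketch that counts the comparisons:

```python
# Selection sort comparison count: exactly n(n-1)/2, i.e. ~ n^2/2 but O(n^2).
def selection_sort_comparisons(n):
    xs = list(range(n, 0, -1))       # input contents don't affect the count
    comparisons = 0
    for i in range(n):
        for j in range(i + 1, n):
            comparisons += 1         # one comparison per inner-loop iteration
            if xs[j] < xs[i]:
                xs[i], xs[j] = xs[j], xs[i]
    return comparisons

print(selection_sort_comparisons(100))  # 4950, which is 100*99//2
```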