Divide and Conquer Flashcards
Q: If a recursive algorithm divides its input size by 2 at each step, follows one path, and does constant work, what’s the runtime?
A: O(log n) - Example: Binary Search
Master Theorem:
T(n) = aT(n/b) + O(n^d)
T(n) = 1T(n/2) + O(n^0)  # O(1) work per call
a = 1, b = 2, d = 0; b^d = 1 = a, therefore O(n^d log n) using MT
Simplifies to O(log n)
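The binary search example on this card can be sketched as follows (a minimal illustrative version, assuming a sorted list): each iteration halves the range and follows only one half, doing constant work, for O(log n) total.

```python
def binary_search(arr, target):
    """Return the index of target in sorted list arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid          # found it
        elif arr[mid] < target:
            lo = mid + 1        # discard the left half
        else:
            hi = mid - 1        # discard the right half
    return -1                   # not present
```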
Q: If a recursive algorithm looks at BOTH halves of the input and does constant work per call, what’s the runtime?
A: O(n) - Example: Binary Tree Traversal
Master Theorem:
T(n) = aT(n/b) + O(n^d)
T(n) = 2T(n/2) + O(n^0)
- subproblems: 2
- scaling: 2
- non-recursive work per call: O(1)
b^d (1) < a (2), therefore O(n^{log_b a}) using MT
Plug in: O(n^{log_2 2})
Simplifies to O(n^1) = O(n)
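The binary tree traversal example on this card can be sketched as follows (a minimal illustrative version, using nested tuples as a hypothetical tree representation): recursing into BOTH halves with O(1) work per call touches every node once, so O(n) total.

```python
def count_nodes(tree):
    """Count nodes in a tree given as (left, value, right) tuples or None."""
    if tree is None:
        return 0                # empty subtree: the O(1) base case
    left, _value, right = tree
    # One O(1) unit of work here, plus two recursive calls: T(n) = 2T(n/2) + O(1)
    return count_nodes(left) + 1 + count_nodes(right)
```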
Q: If a recursive algorithm divides input by 4, looks at all quarters, and does O(n^2) work per call, what’s the runtime?
A: O(n^2)
Q: If a recursive algorithm divides input by 2, processes both halves, and does O(n*log(n)) work per call, what’s the runtime?
A: O(n*log^2(n)) (here the work term n log n is not a plain power of n, so this uses the extended case 2 of the Master Theorem)
Q: If a recursive algorithm divides input by 3, processes all thirds, and does O(sqrt(n)) work per call, what’s the runtime?
A: O(n^{log_3(3)}) = O(n)
Q: If a recursive algorithm divides input by 2, processes both halves, and does O(n^2) work per call, what’s the runtime?
A: O(n^2)
Why? b^d = 4 > a = 2, so the O(n^2) work at the top level dominates
Q: If a recursive algorithm divides input into 3 equal parts, processes all parts, and does O(1) work per call, what’s the runtime?
A: O(n^{log_3(3)}) = O(n)
Q: If a recursive algorithm divides input by 2, looks at BOTH halves, and does O(n) work per call, what’s the runtime?
A: O(n log n) - Example: Mergesort
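The mergesort example on this card can be sketched as follows (a minimal illustrative version): two recursive calls on halves plus linear merge work gives T(n) = 2T(n/2) + O(n) = O(n log n).

```python
def merge_sort(arr):
    """Return a sorted copy of arr via mergesort."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])        # recurse on the left half
    right = merge_sort(arr[mid:])       # recurse on the right half
    # Merge the two sorted halves in O(n) time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```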
Q: When you see T(n) = 1T(n/2) + O(n^0), what runtime should you expect?
A: O(log n)
Why? Dividing by 2 each time with constant work per level
Master theorem:
b^d = 2^0 = 1 = a
if b^d = a, T(n) = O(n^d log n)
(simplifies to O(log n) since d is zero)
Q: When you see T(n) = 2T(n/2) + O(n), what runtime should you expect?
A: O(n log n)
Why? Linear work at each level, with log n levels
Master theorem:
b^d = 2 = a, so
T(n) = O(n log n)
Q: When you see T(n) = 2T(n/2) + O(1), what runtime should you expect?
A: O(n)
Why? Even though each call divides the input by 2, visiting both halves means every element is visited once
Master Theorem:
b^d = 2^0 = 1
a = 2 > b^d = 1
if a > b^d: O(n^{log_b a})
T(n) = O(n^{log_2 2})
= O(n^1)
= O(n)
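The three Master Theorem cases used across these cards can be collected into a small helper (an illustrative sketch, not a library function) that compares a against b^d for T(n) = aT(n/b) + O(n^d):

```python
import math

def master_theorem(a, b, d):
    """Return the asymptotic class of T(n) = a*T(n/b) + O(n^d) as a string."""
    crit = b ** d                          # compare a against b^d
    if math.isclose(crit, a):
        return f"O(n^{d} log n)"           # a == b^d: all levels contribute equally
    if a < crit:
        return f"O(n^{d})"                 # a < b^d: top-level work dominates
    return f"O(n^{math.log(a, b):g})"      # a > b^d: the leaves dominate
```

For example, binary search is `master_theorem(1, 2, 0)`, mergesort is `master_theorem(2, 2, 1)`, and tree traversal is `master_theorem(2, 2, 0)`.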
Q: If a recursive algorithm divides input into 3 parts, follows one path, and does constant work, what’s the runtime?
A: O(log_3 n) = O(log n)
Why? Number of times you can divide n by 3 until reaching 1
Q: Rewrite O(3^(log_2 n)) with n as the base
(Hint: just need to swap 2 things)
A: O(n^(log_2 3))
Idea: switch 3 and n; taking log_2 of both sides gives log_2(3) * log_2(n) either way
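The identity 3^(log_2 n) = n^(log_2 3) can be checked numerically (a small sketch; 1024 is just an arbitrary test value):

```python
import math

# Both sides have base-2 logarithm log_2(n) * log_2(3), so they must be equal.
n = 1024.0
lhs = 3 ** math.log2(n)      # 3^(log_2 n)
rhs = n ** math.log2(3)      # n^(log_2 3)
assert math.isclose(lhs, rhs)
```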
[FastSelect] Explain why selecting a pivot v in list A between the 25th and 75th percentiles ensures the sublists A_L and A_R each have size at most 3/4 of A
- A number at the 75th percentile has 75% of elements less than or equal to it and 25% greater, so A_L has at most 75% of the elements
- A number at the 25th percentile has 25% of elements less than or equal to it and 75% greater, so A_R has at most 75% of the elements
- Therefore, no matter where v falls in this range, neither A_L nor A_R can be larger than 75% of the original array
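The partitioning this card describes can be seen in a plain quickselect sketch (illustrative only: it uses a simple middle-element pivot, whereas FastSelect chooses v via median-of-medians precisely to guarantee it lands between the 25th and 75th percentiles, so each recursive call sees at most 3/4 of the elements):

```python
def quickselect(A, k):
    """Return the k-th smallest element of A (k is 0-indexed)."""
    v = A[len(A) // 2]                   # simple pivot choice, for illustration
    A_L = [x for x in A if x < v]        # elements smaller than the pivot
    A_V = [x for x in A if x == v]       # copies of the pivot
    A_R = [x for x in A if x > v]        # elements larger than the pivot
    if k < len(A_L):
        return quickselect(A_L, k)       # answer lies in the left sublist
    if k < len(A_L) + len(A_V):
        return v                         # answer is the pivot itself
    return quickselect(A_R, k - len(A_L) - len(A_V))
```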
[Master Theorem]
Q: What does n^{log_b a} represent in the recursion tree for T(n) = aT(n/b) + O(n^d)?
A: The width of the last level: the number of size-1 base cases (leaves), since a^{log_b n} = n^{log_b a}
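The width-of-the-last-level claim can be checked numerically (a small sketch; the values of a, b, n are arbitrary test choices):

```python
import math

# With a subproblems per call and log_b(n) levels, the last level has
# a^(log_b n) leaves, which equals n^(log_b a).
a, b, n = 2, 2, 1024
leaves = a ** math.log(n, b)             # a^(log_b n)
assert math.isclose(leaves, n ** math.log(a, b))   # equals n^(log_b a)
assert math.isclose(leaves, n)           # here log_2(2) = 1, so n leaves
```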