Ch 8: Recursion and Dynamic Programming Flashcards
Recursion and Dynamic Programming:
Important Concepts
- When Recursion applies
- Common Recursive Structures and Runtimes
- Call Trees
- Approaches
- Bottom-Up
- Top-Down
- Half and Half
- Recursive vs Iterative tradeoffs
- Dynamic Programming
- Memoization
- Fibonacci Number Generation
Signs that a problem has a recursive solution
- It can be built off of subproblems
- Split data set multiple times
- Use (n-1)th solution to get nth solution, etc
- Problem Statements:
- “Compute the nth ….”
- “List the first n…”
- “Compute all…”
Common Recursive Problem Structures
- Solve f(n) by using f(n-1)
- Solve problem for first half of data set, then the second half, then merge those results
Three Common Approaches to Developing a Recursive Algorithm
- Bottom-Up Approach
- Top-Down Approach
- Half and Half Approach
Recursive Algorithms:
Bottom-Up Approach
- Often the most intuitive
- Start with knowing how to solve in a simple case
- eg: sort a list with one element
- Figure out how to solve for 2 elements, then 3, etc
- Build the solution for one case off of previous cases
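As an illustrative Java sketch of the bottom-up pattern (a hypothetical example, not a listing from the book): build all subsets of a set by starting with the one subset of the empty set, then extending the answer one element at a time.

```java
import java.util.ArrayList;
import java.util.List;

class Subsets {
    // Bottom-up: the subsets of the first k elements are the subsets of the
    // first k-1 elements, plus a copy of each with the kth element appended.
    static List<List<Integer>> subsets(int[] items) {
        List<List<Integer>> result = new ArrayList<>();
        result.add(new ArrayList<>()); // simplest case: subsets of the empty set
        for (int item : items) {
            int size = result.size(); // only extend subsets built so far
            for (int i = 0; i < size; i++) {
                List<Integer> withItem = new ArrayList<>(result.get(i));
                withItem.add(item);
                result.add(withItem);
            }
        }
        return result;
    }
}
```

Each pass doubles the number of subsets, so n elements yield 2^n subsets.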
Recursive Algorithms:
Top-Down Approach
- Determine how to divide the problem for case N into subproblems
- Be careful of overlap between the cases
Recursive Algorithms:
Half and Half Approach
- Divide the data set in half
Examples:
- Binary Search
- Merge Sort
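A minimal Java sketch of the half-and-half pattern via binary search (the class and parameter names here are illustrative):

```java
class Search {
    // Returns the index of target in sorted[lo..hi], or -1 if absent.
    static int binarySearch(int[] sorted, int target, int lo, int hi) {
        if (lo > hi) return -1;           // empty range: not found
        int mid = lo + (hi - lo) / 2;     // midpoint, written to avoid int overflow
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target) {
            return binarySearch(sorted, target, mid + 1, hi); // recurse on right half
        }
        return binarySearch(sorted, target, lo, mid - 1);     // recurse on left half
    }
}
```

Each call discards half the remaining data, giving O(log n) calls.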
Recursive vs Iterative Solutions
- Recursive Algorithms can be very space inefficient
- Each recursive call adds a layer to the stack
- recursive depth n -> uses at least O(n) memory
- All recursive algorithms CAN be implemented iteratively
- This is often more complex, though
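As a small illustration of the tradeoff (a hypothetical example): a recursive array sum builds O(n) stack frames, while the equivalent iterative version uses O(1) extra space.

```java
class Sum {
    // Recursive: each element adds a stack frame, so memory is O(n).
    static int sumRecursive(int[] a, int i) {
        if (i == a.length) return 0;
        return a[i] + sumRecursive(a, i + 1);
    }

    // Iterative: same O(n) time, but only O(1) extra memory.
    static int sumIterative(int[] a) {
        int total = 0;
        for (int x : a) total += x;
        return total;
    }
}
```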
Dynamic Programming:
Basic Idea
- Take a recursive algorithm
- Find the overlapping subproblems
- Subproblems that would be solved over and over with normal recursion
- Cache the results of subproblems for use in future recursive calls
- Can also study the pattern of recursive calls and implement something iterative, still caching previous work
Dynamic Programming
vs
Memoization
- Memoization is “Top-Down” Dynamic Programming
- Some people only use “Dynamic Programming” to refer to “Bottom-Up” work, with Memoization as a distinct concept
- This book refers to both as Dynamic Programming
Dynamic Programming:
Simplest Example
Compute the nth Fibonacci number
Dynamic Programming:
Basic Strategy
- Implement a normal recursive solution
- Identify if overlapping problems exist
- Modify solution to add caching
Fibonacci Numbers(Sequence)
A series of numbers, starting with 0 and 1.
Each Fibonacci Number is the sum of the two previous Fibonacci Numbers
F(0) = 0, F(1) = 1
F(n) = F(n-1) + F(n-2) for n > 1
This is one of the most common and simplest examples of Dynamic Programming
Fibonacci Sequence:
Basic Recursive Solution
and
Runtime
int fibonacci(int i) {
    if (i == 0) return 0;
    if (i == 1) return 1;
    return fibonacci(i - 1) + fibonacci(i - 2);
}
The runtime may appear at first glance to be O(n) or O(n^2).
Look at the call tree.
* The total number of nodes in the tree represents the runtime, since each node does only O(1) work outside of its recursive calls.
* All of the leaves on the tree are fib(1) and fib(0) (the base cases).
* Every other node has 2 children.
* The tree has depth n, so there are roughly O(2^n) nodes, giving an O(2^n) runtime.
- Actually closer to O(1.6^n), because each right subtree is one level shallower than its sibling left subtree. Still an exponential runtime, though.
Fibonacci Sequence:
Memoization Solution and Runtime
Start at n, recurse down to base cases and cache solutions.
int fibonacci(int n) {
    return fibonacci(n, new int[n + 1]);
}

int fibonacci(int i, int[] memo) {
    if (i == 0 || i == 1) return i;
    if (memo[i] == 0) {
        memo[i] = fibonacci(i - 1, memo) + fibonacci(i - 2, memo);
    }
    return memo[i];
}
Call Tree:
* There is only one branch that goes all the way down to fib(1) and fib(0)
* Other branches reference the cached earlier calls for O(1) call time
* There are roughly 2n nodes, for an O(n) runtime
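The memoized version can also be rewritten iteratively, caching from the bottom up (a sketch, not the book's listing): since fib(i) depends only on the previous two values, the full memo table can shrink to two variables.

```java
class Fib {
    // Bottom-up DP: compute fib(2), fib(3), ..., fib(n) in order,
    // keeping only the last two cached values; O(n) time, O(1) space.
    static int fibonacci(int n) {
        if (n == 0) return 0;
        int prev = 0, curr = 1; // fib(0) and fib(1)
        for (int i = 2; i <= n; i++) {
            int next = prev + curr;
            prev = curr;
            curr = next;
        }
        return curr;
    }
}
```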