Recursion & DP Flashcards

1
Q

Guidelines for writing a recursive function

A
  1. Find the base case
  2. Define the recurrence relation, i.e., the step that keeps repeating (a minimal sketch follows below)

Useful tips:
  • Solve just the base case and the recurrence, and let the recursion do the magic; stepping through the full call flow usually just causes confusion
  • Always try to draw a tree diagram of the recursive calls being made to determine the complexity (mostly space, since the calls are stored on a stack). Each node of the tree can represent the parameters passed to the recursive function
  • There is always an iterative solution for a recursive function; it is just usually a little more difficult to write
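
As a concrete illustration of the two steps above, here is a minimal sketch in Python (the sum_to function is my own example, not from the card):

# Illustrative example: sum of 1..n, written recursively.
def sum_to(n):
    # Base case: the smallest input we can answer directly.
    if n == 0:
        return 0
    # Recurrence relation: sum_to(n) = n + sum_to(n - 1).
    return n + sum_to(n - 1)

print(sum_to(5))  # 15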
2
Q

What is memoization?

A

By caching and reusing intermediate results, memoization can greatly reduce the number of recursive calls; in practice, this means creating a hash map or an array to store frequently used values.

  • Memoization is often referred to as top-down dynamic programming (the bottom-up variant is usually just called dynamic programming)
  • As with plain recursion, it is almost always a good idea to draw the call tree, marking each cached state in a box, to determine the runtime complexity
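
A minimal sketch of this caching idea, using Fibonacci (the code is my own illustration):

# Illustrative example: dict-based memoization of Fibonacci.
def fib(n, memo=None):
    # memo maps n -> fib(n); each value is computed only once.
    if memo is None:
        memo = {}
    if n <= 1:          # base cases
        return n
    if n not in memo:
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]

print(fib(40))  # 102334155, in O(n) recursive calls instead of O(2^n)

Python's standard library offers the same idea as a decorator, functools.lru_cache.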
3
Q

Master Theorem for time complexity

A

It cannot be used when the subproblem sizes differ, as in Fibonacci (T(n) = T(n-1) + T(n-2)); it is useful for recurrences that follow the divide-and-conquer pattern, where every subproblem has the same size.

It applies to recurrences of the form:

T(n) = a * T(n/b) + f(n), where

a = the number of subproblems (recursive calls made at each level)
b = the factor by which the input shrinks, i.e., each subproblem has size n/b
f(n) = the time taken to split the problem and merge the results back, assumed to be O(n^d) for some d >= 0

The Master Theorem then provides three formulas based on the relationship between a, b and d (writing log_b(a) for the log of a to the base b):

Case 1: if d < log_b(a),
then T(n) = O(n^(log_b(a)))

This indicates the time taken to merge results is less than the time spent on the recursion itself, so the recursive work dominates. Example: DFS on a balanced binary tree, where T(n) = 2T(n/2) + O(1)

Case 2: if d = log_b(a),
then T(n) = O(n^d log n) = O(n^(log_b(a)) log n)

Examples: Binary search, Merge sort

Case 3: if d > log_b(a),
then T(n) = O(n^d)

Here the split/merge work dominates. Example: Quickselect (average case)
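
As a worked check (my own derivation, consistent with the definitions above): merge sort makes a = 2 recursive calls on halves of the array (b = 2) and merges in f(n) = O(n) time (d = 1). Since log_2(2) = 1 = d, Case 2 applies: T(n) = O(n^1 log n) = O(n log n). For binary search, a = 1, b = 2, d = 0, and log_2(1) = 0 = d, so again Case 2: T(n) = O(n^0 log n) = O(log n).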

4
Q

Backtracking algorithm

A
def backtrack(candidate):
    if find_solution(candidate):
        output(candidate)
        return
    # iterate over all possible candidates.
    for next_candidate in list_of_candidates:
        if is_valid(next_candidate):
            # try this partial candidate solution
            place(next_candidate)
            # given the candidate, explore further.
            backtrack(next_candidate)
            # backtrack: undo the choice
            remove(next_candidate)
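
The helpers above (find_solution, is_valid, place, remove, list_of_candidates) are placeholders. A minimal, runnable instantiation of the same template, generating permutations (the problem choice is my own), might look like:

# Illustrative example: permutations via the backtracking template.
def permutations(nums):
    results = []
    candidate = []

    def backtrack():
        # find_solution: a full-length candidate is a complete permutation.
        if len(candidate) == len(nums):
            results.append(candidate[:])   # output a copy
            return
        for n in nums:
            # is_valid: each number may appear only once.
            if n not in candidate:
                candidate.append(n)        # place
                backtrack()                # explore further
                candidate.pop()            # backtrack: undo the choice

    backtrack()
    return results

print(permutations([1, 2, 3]))  # all 6 orderings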
5
Q

Problems that could be solved using Recursion

A
  • Fibonacci (could also be done using dynamic programming)
  • String permutations
  • Tree traversals (see the sketch below)
  • DFS/BFS
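
As one illustration from this list, a minimal recursive inorder traversal (the Node class and example tree are my own, hypothetical):

# Illustrative example: recursive inorder traversal of a binary tree.
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def inorder(node):
    # Base case: an empty subtree contributes nothing.
    if node is None:
        return []
    # Recurrence: left subtree, then root, then right subtree.
    return inorder(node.left) + [node.val] + inorder(node.right)

root = Node(2, Node(1), Node(3))
print(inorder(root))  # [1, 2, 3]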
6
Q

Dynamic Programming

A

Dynamic programming is mostly just a matter of taking a recursive algorithm, finding the overlapping subproblems (the repeated calls), and then caching those results for future recursive calls.
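
A minimal bottom-up sketch for Fibonacci (my own illustration), which computes each overlapping subproblem exactly once instead of re-recursing into it:

# Illustrative example: bottom-up (tabulated) DP for Fibonacci.
def fib_bottom_up(n):
    if n <= 1:
        return n
    # Fill a table from the base cases upward.
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_bottom_up(40))  # 102334155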

7
Q

What’s the time complexity of a recursive algorithm?

A

Given a recursive algorithm, its time complexity O(T) is typically the product of the number of recursive invocations (denoted R) and the time complexity of the computation performed in each call (denoted O(s)):

O(T) = R ∗ O(s)
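
For example: binary search makes R = O(log n) recursive calls, each doing O(1) work, so O(T) = O(log n). Naive Fibonacci without memoization makes R = O(2^n) calls of O(1) work each, so O(T) = O(2^n).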

8
Q

What are the two parts of the space complexity of a recursive algorithm?

A
  • Recursion-related: the space taken by the call stack (one frame per active call, holding parameters, local variables, and the return address); this space is freed as soon as the function call returns
  • Non-recursion-related: space (normally on the heap) allocated for global state, such as a memoization table
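
For example, the memoized Fibonacci shown earlier uses O(n) recursion-related space (the call stack can grow n frames deep) plus O(n) non-recursion-related space for the memo table.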
9
Q

Is the following snippet tail-recursive?

private static int helper_non_tail_recursion(int start, int[] ls) {
    if (start >= ls.length) {
        return 0;
    }
    return ls[start] + helper_non_tail_recursion(start + 1, ls);
}

A

It is not tail recursion, because it performs a computation (the addition of ls[start]) after the recursive call returns.

// We can make it tail-recursive by threading an accumulator through the calls:
private static int helper_tail_recursion(int start, int[] ls, int acc) {
    if (start >= ls.length) {
        return acc;
    }
    // This is tail recursion because the final instruction is the recursive call.
    return helper_tail_recursion(start + 1, ls, acc + ls[start]);
}
10
Q

What is the benefit of tail recursion? [note: C#, Java, and Python do not support tail-call optimization]

A

The benefit of tail recursion is that it can avoid the accumulation of stack frames across recursive calls, which at times helps us avoid stack-overflow errors.

In tail recursion, as soon as the recursive call returns, the current call immediately returns as well, so the runtime can skip the entire chain of intermediate returns and go straight back to the original caller. Instead of allocating new space on the stack for each call, the system can simply reuse the current call's frame.
