Recursion Flashcards
Recursion :
Solve a large problem by reducing it to smaller problems of the same form.
Recursion as Lazy manager :
As a manager you need to raise $100,000. Being lazy, you find 10 people and ask each to raise $10,000. They are also lazy, so each of them delegates to 10 people to raise $1,000 each, who in turn delegate to 10 people to raise $100 each.
function raiseMoney(n):
    if n <= 100:
        collect the money yourself and return it
    else:
        find 10 volunteers
        get each of them to collect n/10 dollars
        combine and return the money raised by all volunteers
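As a rough illustration (a sketch added to these notes, not part of the original flashcards), here is a runnable Python version of the lazy-manager idea, assuming anyone asked for $100 or less simply hands the money over:

def raise_money(n):
    # Base case: small enough amounts are collected directly
    if n <= 100:
        return n
    # Delegate: 10 volunteers each raise n/10, then the results are combined
    return sum(raise_money(n / 10) for _ in range(10))

print(raise_money(100000))  # 100000.0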
Mathematical functions in Recursion :
The factorial of a number is the most popular example of a recursion use case.
Basic Combinatorics :
Rule of Sum : If an action can be performed by choosing one of A different options or one of B different options, it can be performed in A + B ways.
eg : a shirt out of 4 short-sleeved shirts and 6 long-sleeved shirts can be chosen in 4 + 6 = 10 ways
Rule of Product : If an action is performed by choosing one of A options followed by one of B options, it can be performed in A * B ways.
eg : an outfit from 10 shirts and 8 pants in the closet can be chosen in 10 * 8 = 80 ways
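A quick check of both rules in Python (an illustrative addition with made-up item names):

from itertools import product

short_sleeved = ["s1", "s2", "s3", "s4"]
long_sleeved = ["l1", "l2", "l3", "l4", "l5", "l6"]
# Rule of sum: pick any one shirt from either pile -> 4 + 6 = 10 choices
print(len(short_sleeved + long_sleeved))   # 10

shirts = ["shirt" + str(i) for i in range(10)]
pants = ["pant" + str(i) for i in range(8)]
# Rule of product: pick a shirt, then a pair of pants -> 10 * 8 = 80 outfits
print(len(list(product(shirts, pants))))   # 80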
Arrange a, b, c & d in a straight line :
Without repetition (permutation) : 4 * 3 * 2 * 1 = 24 ways
With repetition : 4 * 4 * 4 * 4 = 4^4 = 256 ways
Binary strings are a classic example of arrangement with repetition: each of the n blanks can be filled with 0 or 1.
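As a recursive sketch (added here for illustration; the function name binary_strings is my own), each blank is filled with 0 or 1 and the remaining n-1 blanks are delegated to a smaller call:

def binary_strings(n):
    # Base case: exactly one string of length 0, the empty string
    if n == 0:
        return [""]
    shorter = binary_strings(n - 1)
    # Each blank takes '0' or '1', so the count doubles at every step: 2^n total
    return ["0" + s for s in shorter] + ["1" + s for s in shorter]

print(binary_strings(3))        # ['000', '001', ..., '111'], 8 strings
print(len(binary_strings(4)))   # 2^4 = 16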
N-factorial :
The top-down approach is a decrease-and-conquer recursive approach:
n! = n * (n-1) * (n-2) * (n-3) * … * 1
def fact(n):
    # Base case: 0! = 1
    if n == 0:
        return 1
    # Delegate the rest of the product to fact(n-1)
    return n * fact(n - 1)
The bottom-up approach is an iterative approach :
def fact(n):
    f = 1
    for i in range(1, n + 1):
        f = f * i
    return f
Time Complexity of recursive vs Iterative :
The bottom-up iterative approach for calculating the factorial of n takes O(n) time.
The time complexity of the top-down recursive solution is determined by the number of activation records that get added to the call stack.
Example: calculating fact(n) for n = 6:
main -> fact(6) -> fact(5) -> fact(4) -> fact(3) -> fact(2) -> fact(1) -> fact(0)
fact(0) returns 1
fact(1) returns 1
fact(2) returns 2
fact(3) returns 6
fact(4) returns 24
fact(5) returns 120
fact(6) returns 720 to main
Since roughly n activation records are added on top of main, the time complexity is O(n).
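To see the activation records directly, here is a small sketch of the same factorial with a depth counter (the depth parameter is an illustration aid added here, not part of the original fact):

def fact(n, depth=0):
    # depth = number of frames sitting on the stack above main
    print("  " * depth + "fact(" + str(n) + ")")
    if n == 0:
        return 1
    return n * fact(n - 1, depth + 1)

print(fact(6))  # prints fact(6) down to fact(0), i.e. n + 1 frames, then 720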
Another example: n^k
def raiseNtoK(n, k):
    # Base case: n^0 = 1
    if k == 0:
        return 1
    return n * raiseNtoK(n, k - 1)
Here the number of activation records added on top of main is k, therefore the time complexity is O(k).
Number of Subsets of a set :
{a,b,c}
For the above set the number of subsets will be :
{} , {a}, {b}, {c}, {a,b}, {b,c}, {a,c}, {a,b,c}
The total number of subsets is 8, which is 2^3, i.e. 2^n with n = 3.
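As an aside, here is a sketch that generates the subsets themselves rather than just counting them (the helper subsets is my own addition to these notes):

def subsets(items):
    # Base case: the empty set has a single subset, the empty set itself
    if not items:
        return [[]]
    rest = subsets(items[1:])
    # Every subset either excludes or includes the first element, so the count doubles
    return rest + [[items[0]] + s for s in rest]

print(subsets(["a", "b", "c"]))       # all 8 subsets
print(len(subsets(["a", "b", "c"])))  # 2^3 = 8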
To solve this problem recursively we can use a decrease-and-conquer approach :
The manager gets a set of n elements -> delegates a set of n-1 elements to his delegate -> who delegates a set of n-2 elements to his delegate -> then n-3 -> then n-4 -> and so on
Finally the last delegate gets 0 elements, so he returns 1. The level before that returns 2, the level before that returns 4, and so on.
So we can infer that each level returns 2 * (number of subsets at the previous level), i.e. 2 * 2^(n-1) = 2^n.
def number_of_subsets(n):
    # Base case: the empty set has exactly 1 subset (itself)
    if n == 0:
        return 1
    # Each additional element doubles the count
    return 2 * number_of_subsets(n - 1)
So here, total work done = number of workers * work done by each worker. Since each worker does constant work, the total work is proportional to the number of workers. Therefore, the time complexity is O(n).
But what if we write the code like the following?
def number_of_subsets(n):
    if n == 0:
        return 1
    return number_of_subsets(n - 1) + number_of_subsets(n - 1)
The above code returns the same answer and is correct but what is the time complexity here?
To visualize: here a manager gets a set of n elements to count the subsets of. He delegates to 2 delegates, each with a set of n-1 elements,
who each further delegate to 2 delegates, so there are 4 delegates at the next level, and so on…
levels : level 0 -> level 1 -> level 2 -> level 3 -> … -> level n
no of workers : 1 -> 2 -> 4 -> 8 -> … -> 2^n
So at each level i there are 2^i workers. Total work done = number of workers * work done by each worker. Since each worker does constant work, the total work is proportional to the total number of workers.
Total number of workers =
2^0 + 2^1 + 2^2 + 2^3 + … + 2^n
This is a geometric series with a = 1 and r = 2; the sum of its n+1 terms is a(r^(n+1) - 1)/(r - 1)
Therefore the total number of workers = 2^(n+1) - 1, which is roughly 2^n
Total time complexity is O(2^n) in this case
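A quick sanity check (an illustrative sketch; the call counter is an addition to these notes) is to count how many times the doubled version actually gets invoked:

calls = 0

def number_of_subsets(n):
    global calls
    calls += 1  # one unit of constant work per "worker"
    if n == 0:
        return 1
    return number_of_subsets(n - 1) + number_of_subsets(n - 1)

number_of_subsets(10)
print(calls)  # 2^(10+1) - 1 = 2047 calls, i.e. O(2^n)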
Fibonacci :
def fib(n):
    # Base cases: fib(0) = 0, fib(1) = 1
    if n == 0 or n == 1:
        return n
    return fib(n - 1) + fib(n - 2)
Call tree for fib(4):
fib(4)
  fib(3)
    fib(2)
      fib(1)
      fib(0)
    fib(1)
  fib(2)
    fib(1)
    fib(0)
As you can see, starting from fib(4) the right side bottoms out after 2 levels, which is n/2 levels where n = 4; the tree structure is not symmetric.
Time complexity = ?
The time complexity is proportional to the number of nodes, since each node does constant work.
The tree is completely filled down to depth n/2 (the shortest branch), so the number of nodes in those levels alone is:
2^0 + 2^1 + 2^2 + 2^3 + … + 2^(n/2)
= 2^(n/2 + 1) - 1, which is roughly 2^(n/2) = sqrt(2)^n
Therefore, as a lower bound, the time complexity is at least on the order of sqrt(2)^n.
If the entire tree were completely filled (down to depth n), the time complexity would be O(2^n).
For Fibonacci it lies somewhere in between:
sqrt(2)^n <= T(n) <= 2^n (up to constant factors)
(In fact the growth rate is phi^n, where phi ≈ 1.618 is the golden ratio, which indeed falls in this range.)
But if you look at the code, a number of calls get repeated, like fib(2), fib(1), etc. In such cases we can cache the results and retrieve them instead of calculating them all over again. This is called memoization.
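A minimal memoized sketch, assuming we cache by n (functools.lru_cache is used here for convenience; a plain dict cache works just as well):

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Each distinct n is computed once; repeats like fib(2), fib(1) become cache hits
    if n == 0 or n == 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025, computed with O(n) distinct calls instead of exponentially many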
Another approach:
Building the answer bottom-up with two accumulators, similar in spirit to DP (note the code below is tail-recursive rather than a loop):
def fib(n, a=0, b=1):
    # a and b carry two consecutive Fibonacci numbers, starting from fib(0) = 0 and fib(1) = 1
    if n == 0:
        return a
    # Shift the pair one step forward and count down n
    return fib(n - 1, b, a + b)

print(fib(6))  # 8
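For completeness, here is a genuinely iterative loop version of the same bottom-up idea (a sketch added here; fib_iterative is not part of the original notes):

def fib_iterative(n):
    a, b = 0, 1          # fib(0), fib(1)
    for _ in range(n):
        a, b = b, a + b  # slide the window one step forward
    return a

print(fib_iterative(6))  # 8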