Complexity and Big O Flashcards
The execution tree of a recursive function forms an n-ary tree, where n is the number of branches, i.e. the number of times the recursive call appears in the recurrence relation.
When you have a recursive function that makes multiple calls, the runtime will often look like:
O(branches ^ depth)
As mentioned above, branches is the number of times the recursion appears in the recurrence relation. The depth varies from problem to problem. For instance, the Fibonacci recurrence [f(n) = f(n-1) + f(n-2)] yields a binary execution tree that goes all the way down to the 0th element, so the depth is n.
This gives O(2^n) time complexity, where n is the input and the 2 branches are f(n-1) and f(n-2).
Strictly speaking, a purely mathematical derivation gives the tighter bound O(φ^n), where φ ≈ 1.618 is the golden ratio, which is roughly O(1.6^n).
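A minimal sketch of the naive recursion (the function name fib is just illustrative, not from these cards): each call branches into 2 more calls and the tree is about n levels deep, which is where O(branches ^ depth) = O(2^n) comes from.

```python
# Naive Fibonacci: 2 recursive branches per call, depth ~ n,
# so the call tree has O(2^n) nodes in the worst case.
def fib(n: int) -> int:
    if n <= 1:               # base cases: fib(0) = 0, fib(1) = 1
        return n
    return fib(n - 1) + fib(n - 2)   # two branches

print(fib(10))  # 55
```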
Sum of 1 through (N-1):
N(N-1)/2
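A hedged sketch of where this sum shows up (the function name is my own): a nested loop whose inner loop runs i times does 0 + 1 + ... + (N-1) = N(N-1)/2 units of work, which is O(N^2).

```python
# Counts the iterations of a classic "compare every pair" nested loop.
# Total = 0 + 1 + ... + (N - 1) = N(N - 1) / 2, i.e. O(N^2).
def count_pairs(n: int) -> int:
    count = 0
    for i in range(n):
        for j in range(i):   # inner loop runs i times
            count += 1
    return count

assert count_pairs(10) == 10 * 9 // 2  # 45
```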
Sum of a sequence of powers of two is roughly equal to ___ ?:
The next value in the sequence.
So, 2^0 + 2^1 + 2^2 + … + 2^N = 2^(N+1) - 1, i.e. roughly the next value in the sequence, 2^(N+1).
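A tiny illustrative check of the identity (this sum is also the node count of a perfect binary tree of height N):

```python
# 2^0 + 2^1 + ... + 2^N equals 2^(N+1) - 1, just under the next power of two.
for n in range(10):
    assert sum(2 ** k for k in range(n + 1)) == 2 ** (n + 1) - 1
```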
Number of times we can halve n until we get 1 is ___ ?:
log n
When you see a problem where the number of elements in the problem space gets halved each time, that will likely be a ___ runtime:
O(log N)
e.g.
1. binary search on an N-element sorted array (see the sketch below)
2. finding an element in a balanced binary search tree
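A minimal binary search sketch, assuming a sorted list of ints (naming is my own): the search space halves on every iteration, so at most about log2(N) iterations run.

```python
from typing import List, Optional

def binary_search(arr: List[int], target: int) -> Optional[int]:
    """Return the index of target in sorted arr, or None. Runs in O(log N)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1      # discard the left half
        else:
            hi = mid - 1      # discard the right half
    return None

assert binary_search([1, 3, 5, 7, 9, 11], 7) == 3
```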
The term linearithmic is used to describe programs whose running time for a problem of size N has order of growth of ___:
N log N
Binary Search Running Time:
O(log N)
Merge sort runtime complexity:
O(N log N)
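A merge sort sketch (illustrative, not a particular library's implementation): the recursion has about log N levels, each doing O(N) merge work, hence O(N log N).

```python
from typing import List

def merge_sort(arr: List[int]) -> List[int]:
    """O(N log N): about log N levels of recursion, O(N) merging per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

assert merge_sort([5, 2, 9, 1, 5]) == [1, 2, 5, 5, 9]
```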
Quick Sort runtime complexity:
O(N log N)
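A quicksort sketch (illustrative; it builds new lists instead of partitioning in place, for clarity): average case O(N log N), worst case O(N^2) when pivots are consistently bad.

```python
from typing import List

def quick_sort(arr: List[int]) -> List[int]:
    """Average O(N log N); worst case O(N^2) with consistently bad pivots."""
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

assert quick_sort([5, 2, 9, 1, 5]) == [1, 2, 5, 5, 9]
```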
Quadratic runtime complexity examples:
Selection sort and Insertion sort
Selection sort runtime complexity:
O(N^2)
Insertion sort runtime complexity:
O(N^2)
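An insertion sort sketch (illustrative): for position i the inner loop may shift up to i elements, so the total work is 1 + 2 + ... + (N-1) = O(N^2).

```python
from typing import List

def insertion_sort(arr: List[int]) -> List[int]:
    """O(N^2): the inner loop shifts up to i elements for position i."""
    arr = list(arr)  # copy so the input is left untouched
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]   # shift larger elements right
            j -= 1
        arr[j + 1] = key
    return arr

assert insertion_sort([4, 3, 2, 1]) == [1, 2, 3, 4]
```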
How would you find the maximum number of digits a number Z can have? For example, how many digits can 999 have? This is good to know when you have to iterate through the digits of a given number:
digits ~ log Z (log base 10)
Say Z is a 2-digit number; the largest 2-digit number is 99 = 10^2 - 1 ~ 10^2. Similarly, if Z has d digits, its maximum value looks like:
Z ~ 10^d
Taking log (base 10) on both sides: log Z = d log 10 = d
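A small illustrative check of digits ≈ log10(Z) (the function name is my own; exact powers of 10 can be sensitive to floating-point rounding):

```python
import math

# For a positive integer Z, digits(Z) = floor(log10(Z)) + 1.
def digit_count(z: int) -> int:
    return math.floor(math.log10(z)) + 1

print(digit_count(999))    # 3
print(digit_count(12345))  # 5
```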
What is the order of dominance for O(X log X), O(X^2), O(2^X), O(X), O(X!), O(log X):
O(X!) > O(2^X) > O(X^2) > O(X log X) > O(X) > O(log X)
Which is dominant, 2^N or N^2, when deciding the big O for an algorithm with complexity O(2^N + N^2):
2^N is dominant, hence the complexity boils down to O(2^N)
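A quick illustrative comparison of the two terms, showing why the N^2 term becomes negligible:

```python
# 2^N quickly dwarfs N^2, so O(2^N + N^2) simplifies to O(2^N).
for n in (5, 10, 20, 30):
    print(n, 2 ** n, n ** 2)
# 5 32 25
# 10 1024 100
# 20 1048576 400
# 30 1073741824 900
```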