Time Complexity Flashcards
Understanding the running time of algorithms and their order of growth
Time Complexity in increasing order
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!)
Time complexity of NP complete problems
No polynomial-time algorithm is known; the best known algorithms take exponential (or factorial) time.
Exponential time vs Factorial time
Exponential time < Factorial time
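A quick numeric check of this ordering (an illustrative sketch, not part of the flashcards): n! overtakes 2^n at n = 4 and then pulls away fast.

```python
import math

# Compare 2**n against n! side by side; n! wins from n = 4 onward.
for n in range(1, 8):
    print(n, 2 ** n, math.factorial(n))
```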
Examples of exponential time
2^n, or more generally x^n for any constant x > 1. As n increases, the time taken grows exponentially.
Examples of quadratic time
n^2, n^2 + x*n + c etc
Quasilinear time
n * log n
Polynomial time
n^c, e.g. n^2, n^10, etc.
Log logarithmic time
log log n
Which is greater 0.001 * 1.5^n or 1000 * n^3?
0.001 * 1.5^n >> 1000 * n^3 for large enough n. An exponential eventually dominates any polynomial, regardless of the constant factors.
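This can be checked numerically. A small sketch (the helper `crossover` is made up for illustration) finds the smallest n at which the exponential term overtakes the polynomial:

```python
# Find the smallest n where 0.001 * 1.5**n exceeds 1000 * n**3.
def crossover(exp_coeff=0.001, base=1.5, poly_coeff=1000, degree=3):
    n = 1
    while exp_coeff * base ** n <= poly_coeff * n ** degree:
        n += 1
    return n

n = crossover()
print(n)  # beyond this n, the exponential stays ahead forever
```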
What is amortized time?
Amortized time expresses the average cost per operation over a sequence of operations, when an occasional operation is expensive but most operations are cheap.
Example: inserting into an ArrayList is usually O(1), but when the backing array is full it doubles in size, an O(n) copy whose cost is spread across the many cheap inserts that follow.
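The doubling strategy can be sketched in a few lines. This is a minimal illustrative class (the name `DynamicArray` and the `copies` counter are mine, added to make the amortized cost visible), not how any particular library implements it:

```python
# Minimal dynamic array with capacity doubling; counts element copies
# so the total resize work can be compared against the number of appends.
class DynamicArray:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.copies = 0  # total elements moved during resizes

    def append(self, value):
        if self.size == self.capacity:
            # The occasional expensive step: double capacity, copy all.
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
                self.copies += 1
            self.data = new_data
        self.data[self.size] = value
        self.size += 1

arr = DynamicArray()
for i in range(1000):
    arr.append(i)
# Total copy work stays below 2 * n, so each append is amortized O(1).
print(arr.copies)
```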
What is asymptotic analysis?
In asymptotic analysis, we evaluate the performance of an algorithm in terms of input size rather than measuring its actual running time. We ask how the time (or space) taken by the algorithm increases with the input size, i.e. the order of growth of the algorithm with respect to input size.
In asymptotic analysis, 1000 * n log n vs 2 * n log n?
Both are the same for asymptotic analysis: O(n log n). Constant factors are ignored.
Let’s say Selection sort takes T(n) = n^2 to run. If sorting a deck of cards took 2 hours, how long will it take to sort 10 decks of unsorted cards?
T(10n) / T(n) = (10n)^2 / n^2 = 100, so 100 times more time will be required: 2 * 100 = 200 hours.
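A sanity check of the arithmetic (a throwaway sketch; the helper name is mine): with T(n) = n^2, scaling the input by 10 multiplies the running time by 10^2 = 100, independent of n.

```python
# Ratio T(factor * n) / T(n) for a quadratic T(n) = n**2.
def quadratic_ratio(n, factor=10):
    return ((factor * n) ** 2) / (n ** 2)

print(quadratic_ratio(52))  # 100.0 -> 2 hours becomes 200 hours
```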
Time complexity of Power Set?
Exponential: O(2^n). Consider the fragrances that can be created from a set of flowers: with every new flower, the number of possible fragrances doubles, so the count grows exponentially.
What is power set?
A power set is the set of all possible subsets of a set. For A = {1, 2, 3}, PowerSet(A) = {{}, {1}, {2}, {1,2}, {3}, {1,3}, {2,3}, {1,2,3}}.
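One way to see why the power set has 2^n elements: each of the n elements is independently in or out of a subset, which maps directly to the bits of a counter. A small sketch (the function name is illustrative):

```python
# Build the power set with bit masks; bit i of `mask` decides whether
# items[i] belongs to the subset, giving exactly 2**n subsets.
def power_set(items):
    n = len(items)
    return [[items[i] for i in range(n) if mask & (1 << i)]
            for mask in range(2 ** n)]

ps = power_set([1, 2, 3])
print(len(ps))  # 8 == 2**3
```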
What is rate of growth?
The rate at which the running time increases as a function of input is called rate of growth.
Which are the three types of analysis of algorithms?
1) Best case (Lower bound)
Defines the input for which the algorithm takes the least time
2) Average case (Average time)
Provides a prediction about the running time of the algorithm; assumes the input is random
3) Worst case (Upper bound)
Defines the input for which the algorithm takes the longest time
Big Oh notation? Which type of bound is it?
Big O gives an asymptotic upper bound on the growth of a function.
For example, if f(n) = n^4 + 100n^2 + 10n + 20, then f(n) = O(n^4), since the n^4 term dominates the growth of f(n) for large values of n.
Omega notation? Which type of bound is it?
Omega gives an asymptotic lower bound and is written Omega(g(n)): g(n) is a lower bound on the growth of f(n).
Formal definition of Big O?
O(g(n)) = {f(n): there exists positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0}
Our objective is to give the smallest rate of growth g(n) that is greater than or equal to the given algorithm's rate of growth f(n).
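The definition can be checked numerically for the f(n) example above. The witnesses c = 2 and n0 = 11 below are hand-picked for illustration; many other pairs also work:

```python
# Numeric check of the Big O definition for f(n) = n^4 + 100n^2 + 10n + 20,
# g(n) = n^4: verify 0 <= f(n) <= c * g(n) for all sampled n >= n0.
def f(n):
    return n ** 4 + 100 * n ** 2 + 10 * n + 20

def g(n):
    return n ** 4

c, n0 = 2, 11  # hand-picked witnesses, not unique
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 5000))
print("f(n) = O(n^4) with c=2, n0=11")
```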
Formal definition of Omega?
Omega(g(n)) = {f(n): There exists positive constants c and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0}
What is Theta notation?
This notation describes a tight bound: it applies when the upper and lower bounds of a given function have the same rate of growth. The average running time of an algorithm always lies between the lower bound and the upper bound; if the lower and upper bounds give the same result, Theta has that same rate of growth.
For example, f(n) = 10*n + n = 11n.
The tight upper bound is O(n) and the best-case lower bound is Omega(n), so in this case Theta is also n.
How do we calculate Theta when lower and upper bounds differ?
If the rates of growth of the lower and upper bounds are not the same, then no single Theta bound applies to the algorithm as a whole. In that case we must consider the running times over all possible inputs and average them, as in the average-case analysis of quicksort.
Formal definition of Theta?
Theta(g(n)) = {f(n): there exist positive constants c1, c2, n0 such that 0<= c1g(n) <= f(n) <= c2g(n) for all n >= n0}
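The Theta definition can likewise be checked numerically for the earlier example f(n) = 10n + n = 11n. The constants c1 = 1, c2 = 12, n0 = 1 are hand-picked witnesses for illustration:

```python
# Numeric check of the Theta definition: sandwich f(n) = 11n between
# c1 * g(n) and c2 * g(n) with g(n) = n for all sampled n >= n0.
def f(n):
    return 10 * n + n

def g(n):
    return n

c1, c2, n0 = 1, 12, 1  # hand-picked witnesses, not unique
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 5000))
print("f(n) = Theta(n)")
```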