Time Complexity Flashcards

Understanding the running time of algorithms and their order of growth

1
Q

Time Complexity in increasing order

A

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!)

2
Q

Time complexity of NP-complete problems

A

Exponential or factorial. No polynomial-time algorithm is known for any NP-complete problem.

3
Q

Exponential time vs Factorial time

A

Exponential time < factorial time: n! eventually outgrows c^n for any constant c.

4
Q

Examples of exponential time

A

O(2^n), or more generally O(x^n) for any constant x > 1. As n increases, the time taken grows exponentially.

5
Q

Examples of quadratic time

A

n^2, n^2 + x*n + c, etc. Lower-order terms and constants are dropped, so all of these are O(n^2).

6
Q

Quasilinear time

A

n * log n

7
Q

Polynomial time

A

n^c for some constant c, e.g. n^2, n^10, etc.

8
Q

Log logarithmic time

A

log log n

9
Q

Which is greater 0.001 * 1.5^n or 1000 * n^3?

A

0.001 * 1.5^n >> 1000 * n^3 for large n. An exponential cannot be beaten by a polynomial, no matter the constants.
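This can be checked numerically (a sketch; the constants 0.001, 1.5, and 1000 come from the card):

```python
def f(n):
    """Exponential term from the card: 0.001 * 1.5^n."""
    return 0.001 * 1.5 ** n

def g(n):
    """Polynomial term from the card: 1000 * n^3."""
    return 1000 * n ** 3

# The polynomial dominates for small n, but the exponential
# overtakes it permanently somewhere below n = 100.
crossover = next(n for n in range(1, 1000) if f(n) > g(n))
```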

10
Q

What is amortized time?

A

Amortized time expresses the cost per operation averaged over a sequence of operations, for an algorithm where an occasionally expensive operation is paid for by many cheap ones.
Example: ArrayList insert. When the backing array is full it doubles in size, an O(n) copy, but that cost is spread over the following O(1) inserts, so insertion is O(1) amortized.
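A minimal dynamic-array sketch (a hypothetical `DynamicArray` class modeled on ArrayList's doubling strategy) makes the amortized O(1) append concrete:

```python
class DynamicArray:
    """Doubles capacity when full, like ArrayList's growth strategy."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0                  # total elements moved during resizes
        self.data = [None] * self.capacity

    def append(self, value):
        if self.size == self.capacity:   # rare O(n) resize
            self.capacity *= 2
            new_data = [None] * self.capacity
            new_data[:self.size] = self.data[:self.size]
            self.data = new_data
            self.copies += self.size
        self.data[self.size] = value     # common O(1) write
        self.size += 1

arr = DynamicArray()
for i in range(1000):
    arr.append(i)
# Total copy work across all resizes stays below 2n,
# so the amortized cost per append is O(1).
```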

11
Q

What is asymptotic analysis?

A

In asymptotic analysis, we evaluate the performance of an algorithm in terms of input size; we don't measure the actual running time. We ask how the time (or space) taken by an algorithm increases with the input size: the order of growth of the algorithm with respect to input size.

12
Q

In asymptotic analysis, 1000 * n log n vs 2 * n log n?

A

Both are the same under asymptotic analysis: O(n log n). We ignore constant factors.

13
Q

Let’s say Selection sort takes T(n) = n^2 to run. If sorting a deck of cards took 2 hours, how long will it take to sort 10 decks of unsorted cards?

A
First, T(10*n) / T(n) = (10*n)^2 / n^2 = 100.
100 times more time will be required, so 2 * 100 = 200 hours.
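The arithmetic can be checked directly (a sketch; T(n) = n^2 as stated on the card, with n = 52 cards per deck):

```python
def T(n):
    """Selection sort cost model from the card."""
    return n ** 2

n = 52                        # one deck of cards
ratio = T(10 * n) / T(n)      # (10n)^2 / n^2 = 100, independent of n
hours = 2 * ratio             # 2 hours for one deck, scaled up
```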
14
Q

Time complexity of Power Set?

A

Exponential: O(2^n). Consider the fragrances that can be created from a set of flowers. Every new flower doubles the number of possible fragrances, since each existing combination can either include it or not. So it is exponential.

15
Q

What is power set?

A

A power set is the set of all possible subsets of a set. So for a set A = {1, 2, 3}, the power set is PowerSet(A) = {{}, {1}, {2}, {1,2}, {3}, {1,3}, {2,3}, {1,2,3}}
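One way to generate a power set (a sketch using itertools.combinations), which also shows the 2^n count from the previous card:

```python
from itertools import chain, combinations

def power_set(s):
    """Return all subsets of s, from the empty set up to s itself."""
    items = list(s)
    return [set(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

subsets = power_set({1, 2, 3})
# len(subsets) == 2^3 == 8
```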

16
Q

What is rate of growth?

A

The rate at which the running time increases as a function of input is called rate of growth.

17
Q

What are the three types of analysis of algorithms?

A

1) Best case (lower bound)
Defines the input for which the algorithm takes the least time
2) Average case (average time)
Predicts the running time of the algorithm, assuming random input
3) Worst case (upper bound)
Defines the input for which the algorithm takes the longest time

18
Q

Big Oh notation? Which type of bound is it?

A

Big O is an asymptotic upper bound on the given function.
For example, if f(n) = n^4 + 100n^2 + 10n + 20, then f(n) = O(n^4), since the n^4 term dominates the growth of f(n) at larger values of n.

19
Q

Omega notation? Which type of bound is it?

A

Omega, written Omega(g(n)), is an asymptotic lower bound for f(n): the tightest lower bound we can give on the growth of f(n).

20
Q

Formal definition of Big O?

A

O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0}
Our objective is to give the smallest rate of growth g(n) that is greater than or equal to the given algorithm's rate of growth f(n).
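The definition can be checked numerically for a concrete f and g (a sketch; f, c, and n0 below are illustrative choices, not from the card):

```python
def f(n):
    return 3 * n + 10    # the function being bounded

def g(n):
    return n             # claim: f(n) is O(n)

# Witness constants: 3n + 10 <= 4n whenever n >= 10.
c, n0 = 4, 10
holds = all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
```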

21
Q

Formal definition of Omega?

A

Omega(g(n)) = {f(n): there exist positive constants c and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0}

22
Q

What is Theta notation?

A

This notation says whether the upper and lower bounds of a given function are the same. The running time of the algorithm always lies between the lower bound and the upper bound; if both give the same rate of growth, Theta has that rate of growth too.
For example, f(n) = 10*n + n = 11n.
The tight upper bound is O(n) and the lower bound is Omega(n), so Theta is also n.

23
Q

How do we calculate Theta when lower and upper bounds differ?

A

If the rates of growth of the lower and upper bounds are not the same, no single Theta bound applies to all inputs. In that case we analyze the cases separately; quicksort, for example, is O(n^2) in the worst case but Theta(n log n) on average.

24
Q

Formal definition of Theta?

A

Theta(g(n)) = {f(n): there exist positive constants c1, c2, n0 such that 0<= c1g(n) <= f(n) <= c2g(n) for all n >= n0}

25
Q

Is there any importance of knowing Omega or best case time of algorithm?

A

Generally no: the lower bound alone has little practical importance. So we find the upper bound O, and use Theta when the lower and upper bounds are the same.

26
Q

Why is rate of growth called Asymptotic analysis?

A

We are trying to find a function g(n) that approximates f(n) at higher values of n; g(n) is a curve that approaches the curve of f(n) as n grows. In mathematics such a curve is called an asymptotic curve, which is why we call this kind of algorithm analysis asymptotic analysis.

27
Q

Explain logarithm complexity

A

An algorithm has logarithmic complexity when it takes constant time to cut the problem size by a constant fraction (e.g. 1/2). The classic example is a loop whose counter doubles each step, so it runs O(log n) times:
for (int i = 1; i < n; i = i * 2)

29
Q

Common log formulas

A
log x*y = log x + log y
log x/y = log x - log y
log x^y = y log x
log^k n = (log n)^k
log base b (x) = log base a (x) / log base a (b)
a^log base b (x) = x^log base b (a)
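These identities can be verified numerically with math.log (a sketch; floating point, so each pair is compared with a tolerance rather than exact equality):

```python
import math

x, y, a, b = 8.0, 32.0, 2.0, 10.0

# Each tuple holds (left-hand side, right-hand side) of one identity.
product_rule  = (math.log(x * y), math.log(x) + math.log(y))
quotient_rule = (math.log(x / y), math.log(x) - math.log(y))
power_rule    = (math.log(x ** y), y * math.log(x))
change_base   = (math.log(x, b), math.log(x, a) / math.log(b, a))
exponent_swap = (a ** math.log(x, b), x ** math.log(a, b))
```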
30
Q

Arithmetic series

A

1 + 2 + … + n = n(n + 1) / 2
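A quick check of the closed form (Gauss's formula) for one value of n:

```python
n = 100
total = sum(range(1, n + 1))      # 1 + 2 + ... + n, term by term
closed_form = n * (n + 1) // 2    # the closed form from the card
# Both equal 5050 for n = 100.
```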

31
Q

Geometric series

A

1 + x + x^2 + x^3 + … + x^n = (x^(n+1) - 1) / (x - 1)
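Checking the closed form with integers so the comparison is exact:

```python
x, n = 3, 10
series = sum(x ** i for i in range(n + 1))     # 1 + x + x^2 + ... + x^n
closed_form = (x ** (n + 1) - 1) // (x - 1)    # (x^(n+1) - 1) / (x - 1)
```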

32
Q

Harmonic series

A

1 + 1/2 + 1/3 + … + 1/n ≈ log n (the harmonic number H(n) is Theta(log n))
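Numerically, the harmonic sum tracks the natural log; the gap between them converges to the Euler-Mascheroni constant, roughly 0.577 (a sketch):

```python
import math

n = 100_000
harmonic = sum(1 / i for i in range(1, n + 1))  # 1 + 1/2 + ... + 1/n
gap = harmonic - math.log(n)                    # approaches ~0.5772
```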

33
Q

Master theorem for Divide and Conquer

A

The general divide-and-conquer recurrence is T(n) = a*T(n/b) + f(n). For example, merge sort gives T(n) = 2*T(n/2) + O(n), which solves to O(n log n).
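The merge-sort recurrence can be unrolled numerically to see the n log n growth (a sketch, assuming T(1) = 1 and n a power of two):

```python
def T(n):
    """Unroll T(n) = 2*T(n/2) + n with base case T(1) = 1."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For powers of two this gives exactly n*log2(n) + n:
# each of the log2(n) levels of recursion costs n in total.
n = 1 << 10   # 1024
```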

37
Q

Solve time complexity for T(n) = 3*T(n - 1)

A
T(n) = 3 * T(n - 1)
 = 3^2 * T(n - 2)
 = 3^3 * T(n - 3)
 = ....
 = 3^n * T(0) = 3^n (taking T(0) = 1)

So the time complexity is O(3^n)
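The unrolling can be checked by evaluating the recurrence directly (assuming the base case T(0) = 1):

```python
def T(n):
    """T(n) = 3*T(n-1) with T(0) = 1; should equal 3^n."""
    return 1 if n == 0 else 3 * T(n - 1)
```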

38
Q

What is upper bound for n!?

A

n^n is an upper bound for n!, since each of the n factors in n! is at most n.

39
Q

What is upper bound for log (n!)?

A

We know that the upper bound for n! is n^n, so the upper bound for log(n!) is log(n^n) = n log n.
So the answer is O(n log n)
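Checking log(n!) against the n log n upper bound (a sketch using natural logs; log(n!) is computed as a sum of logs and cross-checked with math.lgamma):

```python
import math

n = 50
log_factorial = sum(math.log(k) for k in range(1, n + 1))  # log(n!)
upper_bound = n * math.log(n)                              # n log n
# log(n!) <= n log n because each factor k in n! satisfies k <= n.
```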