Full Course Flashcards
Explain informally, why f(n) = Θ(n) implies f(n) = Ω(n) and f(n) =
O(n).
f(n) = Θ(n) means that f(n) has growth rate exactly n. But then f(n) also grows at least as fast as n (i.e., f(n) = Ω(n)) and at most as fast as n (i.e., f(n) = O(n)).
Give definitions of the big-O, big-Ω, and big-Θ notations
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0}
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0}
Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0}
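The definitions above can be spot-checked numerically. Below is a small sketch (the example f, g and the witnesses c = 4, n0 = 5 are my own, not from the cards) that verifies 0 ≤ f(n) ≤ c·g(n) over a finite range — an illustration of the big-O definition, not a proof:

```python
# Illustrative check of the big-O definition (finite spot-check, not a proof):
# f(n) = 3n + 5 is O(n), witnessed by c = 4 and n0 = 5,
# since 0 <= 3n + 5 <= 4n for all n >= 5.

def is_bounded(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c*g(n) for all n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n + 5
g = lambda n: n

print(is_bounded(f, g, c=4, n0=5))  # True: these witnesses satisfy the definition
print(is_bounded(f, g, c=3, n0=5))  # False: c = 3 is too small for any n0
```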
Explain using the definitions why f(n) = Θ(n) when f(n) = Ω(n) and f(n) = O(n).
Let f(n) = O(g(n)) and f(n) = Ω(g(n)). Then, by definition of the big-O notation, there exist positive constants c1 and n1 such that 0 ≤ f(n) ≤ c1g(n) for all n ≥ n1. On the other hand, by definition of the big-Ω notation, there exist positive constants c2 and n2 such that 0 ≤ c2g(n) ≤ f(n) for all n ≥ n2.
Thus, there exist positive constants c1, c2, and n0 such that 0 ≤ c2g(n) ≤ f(n) ≤ c1g(n) for all n ≥ n0, where we choose n0 = max(n1, n2). Then, by definition of the big-Θ notation, f(n) = Θ(g(n)).
If x occurs in A several times, which index is returned?
A: Both return the index of the first occurrence of x
B: Linear-Search returns the index of the last occurrence of x, Better-Linear-Search the first
C: Linear-Search returns all indices of x, Better-Linear-Search only the first
D: Both return all indices
Linear-Search returns the index of the last occurrence of x, Better-Linear-Search the first
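The behaviour on duplicates can be seen in these hedged Python sketches of the two procedures (function names and 0-based indexing are my assumptions; the originals are pseudocode):

```python
# Hedged sketches of the two search procedures; on duplicates they behave
# as the answer above states.

def linear_search(A, x):
    """Scan the whole array, overwriting answer on each match: the LAST match wins."""
    answer = None
    for i in range(len(A)):
        if A[i] == x:
            answer = i
    return answer

def better_linear_search(A, x):
    """Return as soon as x is found: the FIRST match wins."""
    for i in range(len(A)):
        if A[i] == x:
            return i
    return None

A = [7, 3, 7, 9]
print(linear_search(A, 7))         # 2  (last occurrence)
print(better_linear_search(A, 7))  # 0  (first occurrence)
```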
Big-O notation – Upper bounding function
Theta-Θ notation – Order function
Omega-Ω notation – Lower bounding function
The running time is never worse than the given function; asymptotically smaller or equal
The rates of growth in the best case and worst case are the same; asymptotically equal
The running time is never better than the given function; asymptotically greater or equal
Claim: 19n^3 + 17n^2 - 3n = Θ(n^3)
If we choose n0 = 1, what is a good choice for c2?
A: c2 = 1
B: c2 = 17
C: c2 = 19
D: c2 = 36
D: c2 = 36, since 19n^3 + 17n^2 - 3n ≤ 19n^3 + 17n^3 = 36n^3 for all n ≥ 1
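A quick finite spot-check of the Θ witnesses for this claim (c1 = 19 for the lower bound is my addition; c2 = 36 and n0 = 1 are from the card):

```python
# Spot-check 19n^3 <= 19n^3 + 17n^2 - 3n <= 36n^3 for n >= 1
# (a finite check over a range, not a proof).
f = lambda n: 19 * n**3 + 17 * n**2 - 3 * n
ok = all(19 * n**3 <= f(n) <= 36 * n**3 for n in range(1, 10_000))
print(ok)  # True
```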
500 n + 150 = O(n^2)
True
4 n log^2 (n) + 20 n = Θ(n log n)
False (it is Θ(n log^2 n), not Θ(n log n))
n log n = Ω(n)
True
n log^2(n) = O(n^(4/3))
True
57 n^2 + 17 log n = Θ(n^2)
True
O(n) ⊆ O(n^4)
True
An algorithm with worst case running time O(n log n)
is always slower than an algorithm with worst case
running time O(n) if n is sufficiently large.
False
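A tiny illustration of why the statement is false: big-O gives only an upper bound, so an algorithm whose worst case is O(n log n) may in fact take far fewer steps. The step counts below are invented for illustration:

```python
# An algorithm taking exactly 2n steps is also O(n log n) (the bound need
# not be tight), yet it beats a 5n-step O(n) algorithm at every n.
t_a = lambda n: 2 * n  # worst case is O(n log n), but also O(n)
t_b = lambda n: 5 * n  # worst case is O(n)
print(all(t_a(n) < t_b(n) for n in range(1, 10_000)))  # True
```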
Loop invariant
A property that remains true before and after each iteration of the loop
Properties of a Loop Invariant
- Initialisation: It is true before the first iteration of the loop.
- Maintenance: If it is true before an iteration, it remains true before the next iteration.
- Termination: When the loop finishes, the invariant provides a useful property that helps show the algorithm is correct.
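The three properties can be demonstrated on a concrete loop (the summation example is my own, not from the cards), with the invariant asserted before each iteration and at termination:

```python
# Loop invariant for summing a list: before iteration i,
# total == sum of A[0..i-1].

def array_sum(A):
    total = 0  # Initialisation: invariant holds trivially (empty prefix sums to 0)
    for i in range(len(A)):
        # Maintenance: invariant holds at the top of each iteration
        assert total == sum(A[:i])
        total += A[i]
    # Termination: the loop has covered every index, so total == sum(A)
    assert total == sum(A)
    return total

print(array_sum([4, 1, 3]))  # 8
```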