Course 1 week 1 Flashcards
3-way Merge Sort: Suppose that instead of dividing in half at each step of Merge Sort, you divide into thirds, sort each third, and finally combine all of them using a three-way merge subroutine. What is the overall asymptotic running time of this algorithm? (Hint: Note that the merge step can still be implemented in O(n) time.)
O(n*log(n)). The recursion depth becomes log_3(n) instead of log_2(n), which changes only a constant factor, and each level still does O(n) merge work.
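A minimal Python sketch of the 3-way variant (my own illustration, not course code; it uses heapq.merge as the three-way merge subroutine):

import heapq

def merge_sort_3way(a):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(a) <= 1:
        return a
    third = len(a) // 3 or 1  # avoid an empty first split on tiny inputs
    parts = [a[:third], a[third:2*third], a[2*third:]]
    sorted_parts = [merge_sort_3way(p) for p in parts if p]
    # Three-way merge of the sorted thirds; still linear in the total size.
    return list(heapq.merge(*sorted_parts))

# merge_sort_3way([5, 2, 9, 1, 7]) -> [1, 2, 5, 7, 9]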
You are given functions f and g such that f(n) = O(g(n)). Is f(n)*log_2(f(n)^c) = O(g(n)*log_2(g(n)))? (Here c is some positive constant.) You should assume that f and g are nondecreasing and always bigger than 1.
Yes. Roughly, because the constant c in the exponent sits inside a logarithm, it becomes part of the leading constant and is suppressed by the big-Oh notation.
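A short derivation sketch of why this holds (assuming witnesses C >= 1 and n0 with f(n) <= C*g(n) for all n >= n0):

\begin{align*}
f(n)\,\log_2\!\big(f(n)^c\big) &= c\,f(n)\,\log_2 f(n) \\
&\le c\,C\,g(n)\,\log_2\!\big(C\,g(n)\big) && \text{for } n \ge n_0 \\
&= c\,C\,g(n)\,\log_2 g(n) + c\,C\,\log_2(C)\,g(n) \\
&= O\big(g(n)\,\log_2 g(n)\big),
\end{align*}

where the last step uses that log_2(g(n)) >= log_2(g(1)) > 0, so the plain g(n) term is absorbed into O(g(n)*log_2(g(n))).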
Assume again two (positive) nondecreasing functions f and g such that f(n) = O(g(n)). Is 2^f(n) = O(2^g(n))? (Multiple answers may be correct; check all that apply.)
Yes, if f(n) <= g(n) for all sufficiently large n.
Sometimes yes, sometimes no, depending on f and g. For example, f(n) = 2n and g(n) = n satisfy f(n) = O(g(n)), but 2^(2n) = 4^n is not O(2^n).
What does the statement f(n) = O(g(n)) mean? What is an example where the inequality only holds after choosing a suitable constant c and threshold n0?
f(n) is eventually bounded above by a constant multiple of g(n).
Formally: there exist constants c > 0 and n0 such that f(n) <= c*g(n) for all n >= n0.
example: f(n) = 2n + 100, g(n) = n**2
Need more examples. Here c = 1 and n0 = 12 work, since n**2 >= 2n + 100 for all n >= 12.
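A tiny Python spot-check (my own helper, names hypothetical) that numerically verifies witness constants over a finite range:

# Check f(n) <= c * g(n) for every n in [n0, limit] (a spot check, not a proof).
def check_big_oh(f, g, c, n0, limit=10**5):
    return all(f(n) <= c * g(n) for n in range(n0, limit + 1))

# Card's example: f(n) = 2n + 100, g(n) = n**2 with c = 1, n0 = 12.
print(check_big_oh(lambda n: 2*n + 100, lambda n: n**2, c=1, n0=12))  # True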
k-way Merge Sort: Suppose you are given k sorted arrays, each with n elements, and you want to combine them into a single array of kn elements. Consider the following approach. Using the merge subroutine taught in lecture, you merge the first 2 arrays, then merge the 3rd given array with this merged version of the first two arrays, then merge the 4th given array with the merged version of the first three arrays, and so on until you merge in the final (kth) input array. What is the running time taken by this successive merging algorithm, as a function of k and n? (Optional: can you think of a faster way to do the k-way merge procedure?)
O(n*k**2)
For the upper bound, the merged list size is always O(kn), merging is linear in the size of the larger array, and there are k iterations. For the lower bound, each of the last k/2 merges takes Omega(kn) time.
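For the optional part, one faster approach is a heap-based k-way merge, which costs O(n*k*log(k)) instead of O(n*k**2); Python's standard heapq.merge implements this pattern. A minimal sketch:

import heapq

# Repeatedly take the smallest remaining head element among the k arrays
# via a min-heap, so each of the nk output elements costs O(log k).
def k_way_merge(sorted_arrays):
    return list(heapq.merge(*sorted_arrays))

# k_way_merge([[1, 4], [2, 5], [3, 6]]) -> [1, 2, 3, 4, 5, 6]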
Arrange the following functions in increasing order of growth rate (with g(n) following f(n) in your list if and only if f(n) = O(g(n))).
a) 2^log(n)
b) 2^(2^log(n))
c) n^(5/2)
d) 2^(n^2)
e) n^2*log(n)
a, e, c, b, d. Since 2^log(n) = n and 2^(2^log(n)) = 2^n (with log base 2), the order is n, n^2*log(n), n^(5/2), 2^n, 2^(n^2).
Exponents and Logarithms - TBD
http://www.wou.edu/mathcenter/files/2015/09/Exponents-and-Logarithms.pdf
columbia - asymptotic notation - TBD
http://www.columbia.edu/~cs2035/courses/ieor6614.S11/algal.pdf
karatsuba - implementation notes (good code)
http://www.keithschwarz.com/interesting/code/karatsuba/Karatsuba.python.html
Asymptotic Notation in Seven Words
suppress constant factors (too system-dependent) and lower-order terms (irrelevant for large inputs)
Add typical algorithms and the running times of their Python implementations
TBD
Input: array A of n integers, and an integer t.
Output: Whether or not A contains t.
for i := 1 to n do
    if A[i] = t then return TRUE
return FALSE
What is the asymptotic running time?
O(n)
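A direct Python version of this scan (illustration of the pseudocode above):

# Linear search: one pass over A, so O(n) comparisons in the worst case.
def contains(A, t):
    for x in A:
        if x == t:
            return True
    return False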
Input: arrays A and B of n integers each, and an integer t.
Output: Whether or not A or B contains t.
What is the asymptotic running time?
O(n), since we can check each array one at a time with two linear scans.
Checking for a Common Element
Input: arrays A and B of n integers each.
Output: Whether or not there is an integer t contained in both A and B.
for i := 1 to n do
    for j := 1 to n do
        if A[i] = B[j] then return TRUE
return FALSE
O(n**2), since the nested loops make up to n*n comparisons in the worst case.
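A direct Python version of the nested-loop check (illustration of the pseudocode above):

# Compare every pair of elements: up to n * n comparisons, hence O(n**2).
def have_common_element(A, B):
    for a in A:
        for b in B:
            if a == b:
                return True
    return False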