Order of Growth and Asymptotic Notation Flashcards
what are the most commonly encountered orders of growth (in best to worst order)
constant, logarithmic, linear, linear logarithmic, quadratic, cubic, exponential
what does a linear logarithmic function look like
N log N
what are the best orders of growth for large input sizes
constant
logarithmic
linear
linear logarithmic
what does a constant run time mean
doesn’t change no matter what the input size is
example of a logarithmic run time algorithm
binary search
example of linear run time algorithm
a for loop whose counter increases by a constant amount each iteration
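A minimal sketch of such a linear-time loop (the function name is illustrative):

```python
def sum_list(items):
    """Theta(N): the loop body does constant work once per element."""
    total = 0
    for x in items:  # one constant-cost step per element
        total += x
    return total
```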
example of a linear logarithmic algorithm
mergesort
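A sketch of mergesort, which achieves N log N by splitting the array log N times and doing linear merge work at each level:

```python
def merge_sort(a):
    """Theta(N log N): ~log N levels of splitting, Theta(N) merge work per level."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # merge the two sorted halves back together
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```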
example of a quadratic or cubic order of growth run time algorithm
nested loops
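A sketch of a quadratic nested loop (counting duplicate pairs is just an illustrative task):

```python
def count_equal_pairs(items):
    """Theta(N^2): the inner loop runs up to N times for each of the N outer iterations."""
    count = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):  # every pair (i, j) with i < j
            if items[i] == items[j]:
                count += 1
    return count
```

Adding a third nested loop over the same input would make it cubic.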
example of an exponential order of growth run time algorithm
combinatorial search (e.g. trying every subset)
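A sketch of a combinatorial search: a brute-force subset-sum check that branches on every element, giving 2^N subsets in the worst case (the function name is illustrative):

```python
def subset_sum(nums, target):
    """O(2^N) worst case: explores include/skip branches for every element."""
    def search(i, remaining):
        if remaining == 0:
            return True       # found a subset that sums to target
        if i == len(nums):
            return False      # ran out of elements
        # branch: either include nums[i] or skip it
        return search(i + 1, remaining - nums[i]) or search(i + 1, remaining)
    return search(0, target)
```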
example of an algorithm with constant run time
statements and assignments
how does binary search work
the array must be sorted
keep low, high, and mid indices
repeatedly halve the search range: if the middle element is greater than the target, search the lower half; if it is less, search the upper half; stop when the middle element equals the target (or the range is empty)
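The steps above can be sketched as:

```python
def binary_search(sorted_items, target):
    """O(log N): each comparison halves the remaining search range."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid                 # found: return its index
        elif sorted_items[mid] < target:
            low = mid + 1              # search the upper half
        else:
            high = mid - 1             # search the lower half
    return -1  # target not present
```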
can algorithms have more than one order of growth
yes, may change depending on input size
which order of growth of an algorithm is the most important
the asymptotic one
i.e. the one as the input size tends to infinity
what are the three symbols used for describing asymptotic order of growth
Θ
O
Ω
Big theta shows what order of growth
the asymptotic order of growth
used to classify algorithms
when is big o used
to show the upper bound of an algorithm's running time
when is big omega used
to show the lower bound of an algorithm's running time
Every piece of code should have a big omega of
1
i.e. a constant running time should be the lower bound
worst case analysis (NOT BIG O) can be found by using what type of input
the most difficult, largest input
the average case analysis (NOT BIG THETA) can be found using what input
random
what is the upper bound of an algorithm
the performance guarantee of the algorithm
i.e. it cannot do any worse than this
what is the lower bound of an algorithm
proof that the algorithm cannot perform any better than that
when is an algorithm optimal
when the lower bound = upper bound
i.e. you can prove that no other algorithm can have a lower worst-case running time than the one you have made
its running time will grow the slowest as N tends to infinity
what is the runtime for copying an array into a new space
Θ(N)
N = size of the array
putting the easiest possible input into an algorithm will give
the lower bound (the best case)
is it more common to count swaps or comparisons when coming up with the time complexity of a sorting or searching algorithm
comparisons
This is because comparisons have similar cost to swaps and usually there are more comparisons than swaps
O(f + g) =
O(max(f ,g))
What does it mean when we say that an algorithm X is asymptotically more efficient than Y?
X will be the better choice for all sufficiently large inputs (beyond some threshold input size)
Algorithms A and B have worst-case running times of O(n) and O(log n), respectively. Therefore, algorithm B always runs faster than algorithm A. True or false?
False
The Big-O notation provides an asymptotic comparison in the running time of algorithms. For n < n0, algorithm A might run faster than algorithm B, for instance.