Algorithm Analysis Flashcards
what does correctness mean in algorithm analysis
an algorithm outputs the expected correct value
what is time efficiency
how fast an algorithm executes to get the result. the running time depends on the number of instructions executed
what is space (memory) efficiency
how much memory the algorithm needs during execution
what are the 3 things algorithm analysis looks at
- correctness
- time efficiency
- space (memory) efficiency
what are some things that affect run time
- hardware
- software
- input size (running time typically grows with the input size)
what are the analysis types
- worst case
- best case
- average case
what are the limitations of running experiments for run time
- it is necessary to implement the algorithm, which could be hard
- in order to compare two algorithms, the same hardware and environments must be used
- results may not be indicative of the running time on other inputs not included in the experiment
what is theoretical analysis
- uses pseudocode instead of implementation
- characterizes running time as a function of the input size n
- takes into account all possible inputs
- allows us to evaluate the speed of an algorithm independent of the hardware/software environment
how do you determine the theoretical analysis of running time
determine the number of primitive operations (basic operations) as a function of the input size
what is a primitive operation
- basic computations performed by the algorithm
- assumed to take a constant execution time
- largely independent of the programming language
what are some examples of primitive operations
ex
- assigning a value to a variable
- evaluating an arithmetic expression or boolean expression
- accessing a single element of an array by index
- calling a method
- returning from a method
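The operations above can be tallied for a concrete loop. A minimal sketch in Python (the function and the per-line counts are my own illustration, not from the cards):

```python
def find_max(data):
    """Return the largest element; comments tally primitive operations."""
    biggest = data[0]          # 1 index access + 1 assignment
    i = 1                      # 1 assignment
    while i < len(data):       # per iteration: 1 method call + 1 comparison
        if data[i] > biggest:  # 1 index access + 1 comparison
            biggest = data[i]  # worst case: 1 index access + 1 assignment
        i = i + 1              # 1 addition + 1 assignment
    return biggest             # 1 return

print(find_max([3, 9, 4, 7]))  # 9
```

Summing these counts over all iterations gives the operation count as a function of the input size n.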
if the worst case takes (3n-1) primitive operations and the best case takes 2n, how do we bound the running time T(n)?
a(2n) <= T(n) <= b(3n-1)
a - the time taken by the fastest primitive operation
b - the time taken by the slowest primitive operation
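For intuition on where best- and worst-case operation counts come from, here is a hypothetical linear search (Python, my own example) instrumented to count loop iterations:

```python
def linear_search_ops(data, target):
    """Return (index, iterations); iterations stand in for primitive operations."""
    ops = 0
    for i, value in enumerate(data):
        ops += 1
        if value == target:
            return i, ops
    return -1, ops

data = list(range(100))
print(linear_search_ops(data, 0))   # best case: (0, 1), one iteration
print(linear_search_ops(data, 99))  # worst case: (99, 100), n iterations
```

The same algorithm's running time lands anywhere between the best-case and worst-case counts depending on the input.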
does changing the hardware/software environment affect the growth rate of running time?
- this affects T(n) by a constant factor
- does not alter the growth rate of T(n)
why does growth rate matter
- it can distinguish efficient vs inefficient algorithms
- large-scale instances can be solved more easily by algorithms with lower growth rates
- many real world problems have large input sizes
- differences between algorithms are small on small input sizes but large on large ones
what does not affect growth rate?
- constant factors
- lower order terms
What is asymptotic analysis
focuses on the growth rate of the running time as a function of the input size n
- how the running time increases with the input size in the limit, as the input size increases without bound
What notation is used in asymptotic analysis
- big oh
- big omega
- big theta
Big oh notation formula given f(n) and g(n)
we say f(n) is O(g(n)) if there are a real constant c > 0 and an integer constant n0 >= 1 such that
f(n) <= c*g(n) for all n >= n0
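The definition can be spot-checked numerically. A sketch (assumed Python helper of my own; a finite check is only evidence, not a proof):

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for every n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# f(n) = 3n - 1 is O(n): the constants c = 3, n0 = 1 witness the definition.
print(witnesses_big_oh(lambda n: 3 * n - 1, lambda n: n, c=3, n0=1))   # True
# n^2 is not O(n): any fixed c fails once n exceeds c.
print(witnesses_big_oh(lambda n: n * n, lambda n: n, c=100, n0=1))     # False
```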
big omega formula given f(n) and g(n)
we say f(n) is Ω(g(n)) if there are a real constant c > 0 and an integer constant n0 >= 1 such that
f(n) >= c*g(n) for all n >= n0
big theta formula given f(n) and g(n)
we say f(n) is Θ(g(n)) if both f(n) is O(g(n)) and f(n) is Ω(g(n))
what does f(n) is O(g(n)) mean
- f(n) is of order AT MOST g(n); f(n) grows NO FASTER than g(n) asymptotically
- g(n) is an asymptotic UPPER bound for f(n)
f(n) is Ω(g(n))
- f(n) is of order at LEAST g(n); f(n) grows NO SLOWER than g(n) asymptotically
- g(n) is an asymptotic LOWER bound for f(n)
f(n) is Θ(g(n))
- f(n) is of order g(n); f(n) grows at the SAME RATE as g(n) asymptotically
- g(n) is an asymptotic TIGHT bound for f(n)
what is transpose symmetry
f(n) is O(g(n)) iff g(n) is Ω(f(n))
symmetry
f(n) is Θ(g(n)) iff g(n) is Θ(f(n))
transitivity
- f(n) is O(g(n)) and g(n) is O(h(n)) -> f(n) is O(h(n))
- f(n) is Ω(g(n)) and g(n) is Ω(h(n)) -> f(n) is Ω(h(n))
- f(n) is Θ(g(n)) and g(n) is Θ(h(n)) -> f(n) is Θ(h(n))
what's the difference between polynomial run time and exponential run time?
- polynomial running time: O(n^c) for some constant c >= 1
- preferably the constant factors are not very large
- exponential running time: O(b^n) for some constant b > 1
- should almost never be seen as efficient
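A quick comparison (Python, illustrative values of my own choosing) of how fast exponential growth outruns polynomial growth:

```python
# Polynomial (n^2) vs exponential (2^n) growth at a few input sizes.
for n in (10, 20, 30):
    print(f"n={n:2d}  n^2={n ** 2:6d}  2^n={2 ** n:12d}")
# By n = 30 the exponential is over a million times larger than the polynomial.
```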
what are the big Oh properties
- drop the lower order term
- drop the constant factors
ex
n + 3n^2 + 2n + 9 would be O(n^2)
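The drop rules can be checked numerically: the ratio of the full expression to its dominant term settles to a constant. A small Python check using the example polynomial with like terms combined (3n^2 + 3n + 9):

```python
# f(n) = 3n^2 + 3n + 9; growth is dominated by the 3n^2 term.
f = lambda n: 3 * n ** 2 + 3 * n + 9
for n in (10, 1_000, 100_000):
    print(n, f(n) / n ** 2)  # ratio tends to the constant 3, so f(n) is O(n^2)
```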
what are the general rules for loops, nested loops, consecutive statements, and if-elseif-else statements
loops
the running time of the statements inside the loop times the number of iterations
- nested loops
the running time of the statements inside the innermost loop times the product of the sizes of all the loops
- consecutive statements
add the running times
if-elseif-else statements
- the running time is the largest of the running times of the conditional blocks (remember they are exclusive)
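The loop rules above can be sketched as (Python; the counters are my own illustration):

```python
def count_statements(n):
    """Count executions of the innermost statement under each rule."""
    single = 0
    for i in range(n):          # single loop: body runs n times -> O(n)
        single += 1

    nested = 0
    for i in range(n):          # nested loops: body runs n * n times -> O(n^2)
        for j in range(n):
            nested += 1

    return single, nested       # consecutive blocks: O(n) + O(n^2) = O(n^2)

print(count_statements(10))  # (10, 100)
```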
how to get the running time of recursive calls
it depends on
- how many recursive activations occur
- for each invocation of the method, we only account for the number of operations performed within the body of that activation
- take the sum over all activations
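The three steps above can be sketched with a recursive sum (Python, my own example; the activation counter is for illustration):

```python
def recursive_sum(data, i=0, calls=None):
    """Sum a list recursively while counting activations.

    Each activation does a constant number of operations, and there are
    n + 1 activations, so the total running time is O(n).
    """
    if calls is None:
        calls = [0]
    calls[0] += 1                 # one more recursive activation
    if i == len(data):
        return 0, calls[0]        # base case: no further calls
    rest, _ = recursive_sum(data, i + 1, calls)
    return data[i] + rest, calls[0]

print(recursive_sum([1, 2, 3, 4]))  # (10, 5): sum is 10, over 5 activations
```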