Computational Complexity Flashcards
Mostly about Big Oh Notation
Understand efficiency of programs
Separate time and space efficiency of a program
Challenges
- a program can be implemented in many different ways
- a problem can be solved using only a handful of different algorithms
- separate choices of implementation from abstract algorithms
Evaluate efficiency
- use a timer
- count operations
- abstract notion of order of growth
Timing a program
- start clock
- call function
- stop clock
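A minimal sketch of this timing pattern (`time.perf_counter` is my choice here; the course materials use Python's `time` module, possibly a different clock function):

```python
import time

def time_call(f, arg):
    # start clock
    start = time.perf_counter()
    # call function
    f(arg)
    # stop clock and report elapsed wall-clock time
    return time.perf_counter() - start
```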
Timing programs is inconsistent
GOAL: to evaluate different algorithms
Running time
- (time varies for different inputs, but we cannot really express a relationship between inputs and time!)
+ varies between algorithms
- varies between implementations
- varies between computers
- not predictable based on small inputs
Which operations to count
Assumes steps take constant time
- mathematical operations
- comparisons
- assignments
- accessing objects in memory
Counting operations, better but…
GOAL to evaluate different algorithms
Counting operations
+ (count varies for different inputs, and we can come up with a relationship between inputs and the count)
+ depends on algorithm
- depends on implementation
+ independent of computers
- no real definition of which operations to count
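An illustrative sketch of counting operations under these conventions (the function and the tally are illustrative; the exact count depends on which operations you decide to count):

```python
def my_sum(x):
    """Sum the integers 0..x, counting constant-time steps."""
    total = 0                  # 1 assignment
    for i in range(x + 1):     # loop body runs x + 1 times
        total += i             # ~3 ops per pass: set i, add, assign
    return total               # roughly 1 + 3(x + 1) operations overall
```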
We need a better way
what we HAVE
- timing and counting evaluate implementations
- timing also evaluates machines
what we WANT
- evaluate algorithm
- evaluate scalability
- evaluate in terms of input size
Need to decide which input to use
- efficiency in terms of input
- could be an integer
- could be length of list
- you decide
Different inputs (e.g., searching for an element in a list)
- element is the first in the list -> BEST CASE
- element not in the list -> WORST CASE
- look through about half the list -> AVG CASE
Best case
min. running time
- constant
- first element of any list
Average case
avg. running time
- practical measure
Worst case
max. running time
- linear in length of list
- must search entire list
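A sketch of the list search these cases refer to (the function name is illustrative):

```python
def linear_search(L, e):
    """Return True if e occurs in list L."""
    for item in L:        # scans elements in order
        if item == e:
            return True   # best case: e is the first element -> constant time
    return False          # worst case: e not in L -> entire list scanned, linear time
```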
Orders of Growth - goal
- input is very big
- growth of program’s run time
- upper bound
- do not need to be precise “order of” not “exact”
- largest factor
Input is very big
Want to evaluate programs efficiency when input is very big
Growth of program’s run time
Want to express program’s run time as input grows
Upper bound
Want to put an upper bound on growth
“order of” not “exact”
Do not need to be precise “order of” not “exact” growth
Largest factor
We will look at largest factor in run time
Types of orders of growth
- constant
- linear
- quadratic
- logarithmic
- n log n
- exponential
Measuring Order of Growth
Big Oh Notation
Big Oh Notation
Measures
- upper bound on the asymptotic growth
- is used to describe worst case
Worst Case
- Occurs often
- Is the bottleneck
- Expresses rate of growth relative to input size
- Evaluates the algorithm, not the machine or implementation
Worst case asymptotic complexity
- ignore additive constants
- ignore multiplicative constants
Simplification
- drop constants
- drop multiplicative factors
- focus on dominant terms
Examples
- n^2 + 2n + 2 -> O(n^2)
- n^2 + 100000n + 3^1000 -> O(n^2)
- log(n) + n + 4 -> O(n)
- 0.0001*n*log(n) + 300n -> O(n log n)
- 2n^30 + 3^n -> O(3^n)
Complexity classes
ordered low to high
O(1) -> O(log n) -> O(n) ->
O(n log n) -> O(n^C) -> O(C^n)
Analyse programs and their complexity
- Combine complexity classes
- Law of addition for O()
Combine complexity classes
- analyse statements inside functions
- apply some rules, focus on dominant term
Law of addition for O()
- Used with sequential statements
- O(f(n)) + O(g(n)) is O(f(n) + g(n))
- O(n) + O(n*n) = O(n+n^2) = O(n^2)
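A sketch of the law of addition applied to sequential statements (illustrative function):

```python
def f(n):
    # first loop: O(n)
    for i in range(n):
        print('loop 1')
    # second, nested pair of loops: O(n*n)
    for i in range(n):
        for j in range(n):
            print('loop 2')
    # total: O(n) + O(n^2) = O(n + n^2) = O(n^2); the quadratic term dominates
```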
O(1)
denotes constant running time
O(log n)
denotes logarithmic running time
O(n)
denotes linear running time
O(n^C)
denotes polynomial running time (C is a constant)
O(C^n)
denotes exponential running time (a constant C raised to a power that grows with the input size)
Constant Complexity
- independent of inputs
- very few interesting algorithms in this class
- can have loops or recursive calls, but only if the number of iterations or calls is independent of the input size (see the sketch below)
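A sketch of a constant-time function that still contains a loop (illustrative; the point is that the loop count is fixed):

```python
def sum_first_three(L):
    """O(1): the loop always runs exactly 3 times, whatever len(L) is.

    Assumes len(L) >= 3."""
    total = 0
    for i in range(3):   # fixed iteration count, independent of input size
        total += L[i]
    return total
```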
Logarithmic Complexity
- Grows as log of size of one of its inputs
- examples:
- bisection search
- binary search of a list
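A minimal iterative sketch of bisection/binary search on a sorted list (one of several possible variants):

```python
def bisect_search(L, e):
    """Return True if e is in sorted list L; O(log n)."""
    low, high = 0, len(L) - 1
    while low <= high:
        mid = (low + high) // 2   # each pass halves the remaining range
        if L[mid] == e:
            return True
        elif L[mid] < e:
            low = mid + 1
        else:
            high = mid - 1
    return False
```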
Linear Complexity
- searching a list in sequence to see if an element is present
- adding the characters of a string one at a time (see the sketch below)
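The "adding the characters of a string" item likely refers to summing a string of digit characters; a sketch under that assumption:

```python
def add_digits(s):
    """Sum the decimal digits in string s; O(len(s)).

    Assumes s contains only digit characters."""
    total = 0
    for c in s:          # one pass over the string: n iterations
        total += int(c)  # constant work per character
    return total
```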
Linear recursive complexity
- complexity can depend on number of recursive calls
- in the iterative version, the number of times around the loop is n
- the number of operations inside the loop is a constant
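A sketch contrasting the loop count with the recursive-call count (illustrative factorial implementations):

```python
def fact_iter(n):
    """Iterative: the loop runs n times, constant work per pass -> O(n)."""
    prod = 1
    for i in range(1, n + 1):
        prod *= i
    return prod

def fact_recur(n):
    """Recursive: n recursive calls, constant work per call -> O(n)."""
    if n <= 1:
        return 1
    return n * fact_recur(n - 1)
```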
Log-Linear complexity
- many practical algorithms are log-linear
- e.g., merge sort (sketched below)
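A sketch of merge sort showing where the n log n comes from (one possible implementation, not necessarily the course's):

```python
def merge_sort(L):
    """Sort L: about log n levels of halving, O(n) merge work per level."""
    if len(L) <= 1:
        return L[:]
    mid = len(L) // 2
    left = merge_sort(L[:mid])
    right = merge_sort(L[mid:])
    # merge the two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```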
Polynomial Complexity
- most are quadratic n^2
- commonly for
- nested loops
- recursive function calls
O() for nested loops
(see file “example-analysing-complexity.py” in week-6)
- computes n^2 very inefficiently
- look at the ranges
- nested loops, each iterating n times
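The referenced file is not reproduced here; a sketch of what computing n^2 inefficiently with nested loops might look like, based on the description above:

```python
def square(n):
    """Compute n**2 by counting: both loops iterate n times -> O(n^2)."""
    total = 0
    for i in range(n):       # outer loop: n iterations
        for j in range(n):   # inner loop: n iterations per outer pass
            total += 1       # executed n * n times
    return total
```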
When input is a list
Must define what size of input means
- previously it was the magnitude of a number
- here it is the length of list
Common Python functions
List
index, store, length, append
=> O(1)
==, remove, copy, reverse, iteration, in list
=> O(n)
Common Python functions
Dictionaries
Worst Case:
index, store, length, delete, iteration
=> O(n)
Average case, however, is often O(1)
Big Oh summary
compare efficiency of algorithms
- notation that describes growth
- lower order of growth is better
- independent of machine or specific implementation
Big Oh summary
use Big Oh
- describe order of growth
- asymptotic notation
- upper bound
- worst case analysis
Notes on Big Oh
https://prod-edxapp.edx-cdn.org/assets/courseware/v1/b874db7cc14e722ce7b921d11d944026/asset-v1:MITx+6.00.1x+2T2019+type@asset+block/handouts_Big_O.pdf