Computational Complexity Flashcards

Mostly about Big Oh Notation

1
Q

Understand efficiency of programs

A

Separate time and space efficiency of a program

2
Q

Challenges

A
  • a program can be implemented in many different ways
  • a problem can be solved with only a handful of different algorithms
  • want to separate the choice of implementation from the abstract algorithm
3
Q

Evaluate efficiency

A
  • use a timer
  • count operations
  • abstract notion of order of growth
4
Q

Timing a program

A
  1. start clock
  2. call function
  3. stop clock
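
A minimal sketch of the three timing steps above, using Python's standard time.perf_counter; the timed function sum_to_n is just an illustrative placeholder, not from the course.

```python
import time

def sum_to_n(n):
    """Illustrative function to time: sums 1..n with a loop."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

start = time.perf_counter()            # 1. start clock
sum_to_n(1_000_000)                    # 2. call function
elapsed = time.perf_counter() - start  # 3. stop clock
print(f"elapsed: {elapsed:.6f} s")
```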
5
Q

Timing programs is inconsistent

A

GOAL: to evaluate different algorithms, but timing results also depend on the machine and the implementation

6
Q

Running time

- time varies for different inputs, but timing alone cannot really express a relationship between inputs and time

A

+ varies between algorithms
- varies between implementations
- varies between computers
- not predictable based on small inputs
7
Q

Which operations to count

A

Assumes steps take constant time

  • mathematical operations
  • comparisons
  • assignments
  • accessing objects in memory
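
A rough sketch of counting operations by hand for a simple loop, assuming the convention above that each mathematical operation, comparison, and assignment is one constant-time step (the function is illustrative):

```python
def sum_to_n(n):
    total = 0                  # 1 assignment
    for i in range(1, n + 1):  # loop body executes n times
        total += i             # 1 addition + 1 assignment per iteration
    return total               # 1 operation for the return

# Under this convention: roughly 1 + 2n + 1 = 2n + 2 operations,
# so the count grows linearly with n.
```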
8
Q

Counting operations, better but…

A

GOAL: to evaluate different algorithms, but the count still depends on the implementation

9
Q

Counting operations

+ (count varies for different inputs and can express a relationship between the inputs and the count)

A

+ depends on the algorithm
- depends on the implementation
+ independent of which computer runs it
- no real definition of which operations to count

10
Q

Needs a better way

what we HAVE

A
  • timing and counting evaluate implementations
  • timing evaluates machines

11
Q

what we WANT

A
  • evaluate algorithm
  • evaluate scalability
  • evaluate in terms of input size
12
Q

Need to decide which input to use

A
  • efficiency in terms of input
  • could be an integer
  • could be length of list
  • you decide
13
Q

Different inputs

A
  • element is the first in the list -> BEST CASE
  • element not in the list -> WORST CASE
  • look through about half the list -> AVERAGE CASE
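
A sketch of the linear search these cases refer to, assuming a list L and a target e (the name linear_search is illustrative):

```python
def linear_search(L, e):
    """Return True if e is in L, scanning left to right."""
    for item in L:
        if item == e:
            return True    # BEST CASE: e is the first element -> constant work
    return False           # WORST CASE: e not in L -> the whole list is scanned

L = list(range(10))
linear_search(L, 0)    # best case
linear_search(L, 5)    # average-ish case: about half the list examined
linear_search(L, 99)   # worst case
```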
14
Q

Best case

A

min. running time
- constant
- first element of any list

15
Q

Average case

A

avg. running time

- practical measure

16
Q

Worst case

A

max. running time
- linear in length of list
- must search entire list

17
Q

Orders of Growth - goal

A
  • input is very big
  • growth of program’s run time
  • upper bound
  • do not need to be precise “order of” not “exact”
  • largest factor
18
Q

Input is very big

A

Want to evaluate a program's efficiency when the input is very big

19
Q

Growth of program’s run time

A

Want to express how a program's run time grows as the input grows

20
Q

Upper bound

A

Want to put an upper bound on growth

21
Q

“order of” not “exact”

A

Do not need to be precise: care about the “order of” growth, not the “exact” growth

22
Q

Largest factor

A

We will look at the largest factor in the run time

23
Q

Types of orders of growth

A
- constant
- linear
- quadratic
- logarithmic
- n log n (log-linear)
- exponential
24
Q

Measuring Order of Growth

A

Big Oh Notation

25
Q

Big Oh Notation

A

- measures an upper bound on the asymptotic growth
- is used to describe the worst case
26
Q

Worst Case

A

- occurs often
- is the bottleneck
- expresses the rate of growth relative to the input size
- evaluates the algorithm, not the machine or implementation
27
Q

Worst case asymptotic complexity

A

- ignore additive constants
- ignore multiplicative constants
28
Q

Simplification

A

- drop constants
- drop multiplicative factors
- focus on dominant terms
29
Q

Examples

A

```
n^2 + 2n + 2            is O(n^2)
n^2 + 100000n + 3^1000  is O(n^2)
log(n) + n + 4          is O(n)
0.0001*n*log(n) + 300n  is O(n log n)
2n^30 + 3^n             is O(3^n)
```
30
Q

Complexity classes, ordered low to high

A

O(1) -> O(log n) -> O(n) -> O(n log n) -> O(n^C) -> O(C^n)
31
Q

Analyse programs and their complexity

A

- combine complexity classes
- law of addition for O()
32
Q

Combine complexity classes

A

- analyse statements inside functions
- apply some rules, focus on the dominant term
33
Q

Law of addition for O()

A

- used with sequential statements
- O(f(n)) + O(g(n)) is O(f(n) + g(n))
- e.g. O(n) + O(n*n) = O(n + n^2) = O(n^2), as in the sketch below
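
A minimal sketch of the addition law: an O(n) block followed sequentially by an O(n*n) block, giving O(n + n^2) = O(n^2) overall (the print statements just stand in for constant-time work):

```python
def sequential_work(n):
    # First block: a single loop -> O(n)
    for i in range(n):
        print('a')
    # Second block: nested loops -> O(n * n)
    for i in range(n):
        for j in range(n):
            print('b')
    # Sequential statements add: O(n) + O(n^2) = O(n + n^2) = O(n^2)
```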
34
Q

O(1)

A

denotes constant running time
35
Q

O(log n)

A

denotes logarithmic running time
36
Q

O(n)

A

denotes linear running time
37
Q

O(n^C)

A

denotes polynomial running time (C is a constant)
38
Q

O(C^n)

A

denotes exponential running time (a constant C raised to a power based on the size of the input)
39
Q

Constant Complexity

A

- independent of the size of the inputs
- very few interesting algorithms fall entirely in this class
- can have loops or recursive calls, but only if the number of iterations or calls is independent of the input size (see the sketch below)
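
A small sketch of constant-time code (illustrative functions, not from the course): the amount of work does not depend on the size of the input.

```python
def first_element(L):
    """O(1): a single index operation, no matter how long L is."""
    return L[0]

def fixed_loop(L):
    """Still O(1): the loop always runs 10 times, independent of len(L)."""
    total = 0
    for i in range(10):
        total += i
    return total
```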
40
Q

Logarithmic Complexity

A

- grows as the log of the size of one of its inputs
- examples: bisection search, binary search of a sorted list (see the sketch below)
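
A sketch of bisection search on a sorted list, the standard O(log n) example; this is one common formulation, not necessarily the course's exact code.

```python
def bisection_search(L, e):
    """Return True if e is in the sorted list L; halves the search space each step."""
    low, high = 0, len(L) - 1
    while low <= high:
        mid = (low + high) // 2
        if L[mid] == e:
            return True
        elif L[mid] < e:
            low = mid + 1     # discard the lower half
        else:
            high = mid - 1    # discard the upper half
    return False              # at most about log2(len(L)) iterations

print(bisection_search([1, 3, 5, 7, 9, 11], 7))   # True
print(bisection_search([1, 3, 5, 7, 9, 11], 8))   # False
```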
41
Q

Linear Complexity

A

- searching a list in sequence to see if an element is present
- adding the characters of a string assumed to be composed of decimal digits (see the sketch below)
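
A sketch of the digit-adding example, assuming the card means summing the characters of a string of decimal digits; the running time is linear in the length of the string.

```python
def add_digits(s):
    """Sum the decimal digits in the string s, e.g. '1234' -> 10. O(len(s))."""
    total = 0
    for ch in s:           # one pass over the string
        total += int(ch)   # constant work per character
    return total

print(add_digits('1234'))  # 10
```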
42
Linear recursive complexity
- complexity can depend on number of recursive calls - number of times around the loop is n - number of operations inside loop is a constant
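
A sketch of linear recursion, with the iterative version alongside for comparison: factorial makes n recursive calls (or n loop iterations) with constant work each, so both are O(n).

```python
def fact_recursive(n):
    """n! computed recursively: n calls, constant work per call -> O(n)."""
    if n <= 1:
        return 1
    return n * fact_recursive(n - 1)

def fact_iterative(n):
    """Same complexity iteratively: the loop runs n times, constant work inside."""
    result = 1
    for i in range(1, n + 1):
        result *= i
    return result

print(fact_recursive(5), fact_iterative(5))  # 120 120
```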
43
Q

Log-Linear Complexity

A

- many practical algorithms are log-linear, O(n log n)
- merge sort is the classic example (see the sketch below)
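
A sketch of merge sort, the standard log-linear example (a common formulation, not necessarily the course's code): the list is halved roughly log n times and each level does O(n) merging work.

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list in O(len(left) + len(right))."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # at most one of these still has elements
    result.extend(right[j:])
    return result

def merge_sort(L):
    """O(n log n): about log n levels of splitting, O(n) merge work per level."""
    if len(L) <= 1:
        return L
    mid = len(L) // 2
    return merge(merge_sort(L[:mid]), merge_sort(L[mid:]))

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```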
44
Q

Polynomial Complexity

A

- most common case is quadratic, O(n^2)
- commonly arises from nested loops or nested recursive function calls
45
Q

O() for nested loops (see file "example-analysing-complexity.py" in week-6)

A

- computes n^2 very inefficiently
- look at the ranges
- nested loops, each iterating n times, give O(n^2) (see the sketch below)
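
A sketch of the kind of code the card describes (not necessarily the contents of example-analysing-complexity.py): computing n^2 very inefficiently with nested loops, each iterating n times.

```python
def square_inefficiently(n):
    """Compute n**2 with two nested loops over range(n): O(n^2) additions."""
    x = 0
    for i in range(n):        # outer loop: n iterations
        for j in range(n):    # inner loop: n iterations for each outer one
            x += 1            # executed n * n times in total
    return x

print(square_inefficiently(5))  # 25
```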
46
Q

When input is a list

A

- must define what "size of input" means
- previously it was the magnitude of a number
- here it is the length of the list
47
Q

Common Python functions | Lists

A

- index, store, length, append => O(1)
- ==, remove, copy, reverse, iteration, in list => O(n)
48
Q

Common Python functions | Dictionaries

A

- worst case: index, store, length, delete, iteration => O(n)
- average case: index, store, delete => O(1); iteration is still O(n)
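
A small sketch contrasting the two cards above: membership testing in a list is O(n), while a dictionary (or set) lookup hashes the key and is O(1) on average.

```python
n = 100_000
as_list = list(range(n))
as_dict = {i: True for i in range(n)}

# O(n): scans the list element by element until it finds a match (or hits the end).
print((n - 1) in as_list)   # True

# O(1) on average: hashes the key and jumps (almost) directly to it.
print((n - 1) in as_dict)   # True
```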
49
Q

Big Oh summary | compare efficiency of algorithms

A

- a notation that describes growth
- a lower order of growth is better
- independent of the machine or the specific implementation
50
Q

Big Oh summary | use Big Oh

A

- describe order of growth
- asymptotic notation
- upper bound
- worst case analysis
51
Q

Notes on Big Oh

A

https://prod-edxapp.edx-cdn.org/assets/courseware/v1/b874db7cc14e722ce7b921d11d944026/asset-v1:MITx+6.00.1x+2T2019+type@asset+block/handouts_Big_O.pdf