Computational Complexity Flashcards

Mostly about Big Oh Notation

1
Q

Understand efficiency of programs

A

Separate time and space efficiency of a program

2
Q

Challenges

A
  • a program can be implemented in many different ways
  • solve a problem using only a handful of different algorithms
  • separate choices of implementation from abstract algorithms
3
Q

Evaluate efficiency

A
  • use a timer
  • count operations
  • abstract notion of order of growth
4
Q

Timing a program

A
  1. start clock
  2. call function
  3. stop clock
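
The three steps can be sketched with Python's `time` module (a minimal sketch; `sum_to` is just a hypothetical stand-in for the function being timed):

```python
import time

def sum_to(n):
    # hypothetical stand-in for the function being timed
    total = 0
    for i in range(n):
        total += i
    return total

start = time.perf_counter()            # 1. start clock
result = sum_to(100_000)               # 2. call function
elapsed = time.perf_counter() - start  # 3. stop clock
print(f"sum_to took {elapsed:.6f} s")
```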
5
Q

Timing programs is inconsistent

A

GOAL is to evaluate different algorithms, but timing varies with machine and implementation, so it does not isolate the algorithm

6
Q

Running time

(time varies for different inputs, but timing cannot express a relationship between inputs and time)

A

+ varies between algorithms
- varies between implementations
- varies between computers
- not predictable based on small inputs
7
Q

Which operations to count

A

Assumes steps take constant time

  • mathematical operations
  • comparisons
  • assignments
  • accessing objects in memory
8
Q

Counting operations, better but…

A

GOAL is to evaluate different algorithms, but the operation count still depends on the implementation

9
Q

Counting operations

(count varies for different inputs, and we can express a relationship between inputs and the count)

A

+ depends on algorithm
- depends on implementation
+ independent of computers
- no real definition of which operations to count
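
Counting can be made concrete by instrumenting a function by hand (a sketch; which steps count as one operation is exactly the judgement call this card flags):

```python
def my_sum_with_count(n):
    # sums 0..n-1 while tallying assumed-constant-time operations
    ops = 0
    total = 0
    ops += 1                 # one assignment
    for i in range(n):
        total += i
        ops += 2             # one addition, one assignment
    return total, ops        # ops is 1 + 2n: a relationship with the input
```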

10
Q

Needs a better way

what we HAVE

A
  • timing and counting evaluate implementations
  • timing evaluates machines

11
Q

what we WANT

A
  • evaluate algorithm
  • evaluate scalability
  • evaluate in terms of input size
12
Q

Need to decide which input to use

A
  • efficiency in terms of input
  • could be an integer
  • could be length of list
  • you decide
13
Q

Different inputs

A
  • first element in a list -> BEST CASE
  • not in list -> WORST CASE
  • look through half -> AVG CASE
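
The three cases show up directly in a plain linear search (a minimal sketch; `steps` just counts comparisons):

```python
def linear_search(lst, target):
    # scans left to right; steps taken depends on where target sits
    steps = 0
    for item in lst:
        steps += 1
        if item == target:
            return True, steps
    return False, steps

data = list(range(10))
print(linear_search(data, 0))   # best case: hit on the first element
print(linear_search(data, 99))  # worst case: scans the entire list
```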
14
Q

Best case

A

min. running time
- constant
- first element of any list

15
Q

Average case

A

avg. running time

- practical measure

16
Q

Worst case

A

max. running time
- linear in length of list
- must search entire list

17
Q

Orders of Growth - goal

A
  • input is very big
  • growth of program’s run time
  • upper bound
  • do not need to be precise: “order of”, not “exact”
  • largest factor
18
Q

Input is very big

A

Want to evaluate programs efficiency when input is very big

19
Q

Growth of program’s run time

A

Want to express program’s run time as input grows

20
Q

Upper bound

A

Want to put an upper bound on growth

21
Q

“order of” not “exact”

A

Do not need to be precise: we want “order of”, not “exact” growth

22
Q

Largest factor

A

We will look at largest factor in run time

23
Q

Types of orders of growth

A
- constant
- linear
- quadratic
- logarithmic
- n log n (log-linear)
- exponential
24
Q

Measuring Order of Growth

A

Big Oh Notation

25
Q

Big Oh Notation

A

Measures

  • upper bound on the asymptotic growth
  • is used to describe worst case
26
Q

Worst Case

A
  • Occurs often
  • Is the bottleneck
  • Expresses rate of growth relative to input size
  • Evaluate algorithm not machine or implementation
27
Q

Worst case asymptotic complexity

A
  • ignore additive constants
  • ignore multiplicative constants

28
Q

Simplification

A
  • drop additive constants
  • drop multiplicative factors
  • focus on dominant terms
29
Q

Examples

A
O(n^2) -> n^2 + 2n + 2
O(n^2) -> n^2 + 100000n + 3^1000
O(n) -> log(n) + n + 4
O(n log n) -> 0.0001*n*log(n) + 300n
O(3^n) -> 2*n^30 + 3^n
30
Q

complexity

ordered low to high

A

O(1) -> O(log n) -> O(n) ->

O(n log n) -> O(n^C) -> O(C^n)

31
Q

Analyse programs and their complexity

A
  • Combine complexity classes
  • Law of addition for O()

32
Q

Combine complexity classes

A
  • analyse statements inside functions
  • apply some rules, focus on dominant term

33
Q

Law of addition for O()

A
  • Used with sequential statements
  • O(f(n)) + O(g(n)) is O(f(n) + g(n))
  • O(n) + O(n*n) = O(n+n^2) = O(n^2)
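
The law in action, as a sketch: two sequential loops, one O(n) and one O(n^2); the dominant term wins.

```python
def sequential(n):
    count = 0
    for _ in range(n):          # first block: O(n)
        count += 1
    for _ in range(n):          # second block: O(n^2)
        for _ in range(n):
            count += 1
    return count                # n + n^2 steps: O(n) + O(n^2) = O(n^2)
```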
34
Q

O(1)

A

denotes constant running time

35
Q

O(log n)

A

denotes logarithmic running time

36
Q

O(n)

A

denotes linear running time

37
Q

O(n^C)

A

denotes polynomial running time (C is a constant)

38
Q

O(C^n)

A

denotes exponential running time (C is a constant raised to a power based on the size of the input)

39
Q

Constant Complexity

A
  • independent of inputs
  • very few interesting algorithms in this class
  • can have loops or recursive calls, but only if the number of iterations is independent of the input size
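
A sketch of constant-time code: the amount of work does not depend on how big the input is.

```python
def first_element(lst):
    # one index operation no matter how long the list is: O(1)
    return lst[0]

def add_pair(a, b):
    # one arithmetic operation: O(1)
    return a + b
```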
40
Q

Logarithmic Complexity

A
  • Grows as log of size of one of its inputs
  • examples:
    • bisection search
    • binary search of a list
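
A minimal bisection (binary) search sketch over a sorted list; each pass halves the range that can still contain the target, hence O(log n):

```python
def bisection_search(sorted_lst, target):
    low, high = 0, len(sorted_lst) - 1
    while low <= high:
        mid = (low + high) // 2        # halve the remaining range
        if sorted_lst[mid] == target:
            return True
        elif sorted_lst[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return False
```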
41
Q

Linear Complexity

A
  • Searching a list in sequence to see if an element is present
  • add char to a string
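
Adding one character at a time to a string is a standard linear example; this sketch converts an integer to its decimal string, peeling off one digit per loop pass:

```python
def int_to_str(i):
    # one digit per pass: linear in the number of digits of i
    digits = '0123456789'
    if i == 0:
        return '0'
    result = ''
    while i > 0:
        result = digits[i % 10] + result
        i //= 10
    return result
```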
42
Q

Linear recursive complexity

A
  • complexity can depend on number of recursive calls
  • number of recursive calls is n
  • number of operations per call is a constant
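
Factorial is the usual sketch of linear recursion: n recursive calls, constant work in each, so O(n) overall.

```python
def fact(n):
    # one recursive call per decrement of n: n calls, O(1) work each
    if n <= 1:
        return 1
    return n * fact(n - 1)
```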
43
Q

Log-Linear complexity

A
  • many practical algorithms are log-linear
  • e.g. merge sort
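
A compact merge sort sketch showing where the two factors come from: log n levels of halving, with O(n) merging work per level.

```python
def merge_sort(lst):
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])     # halving gives the log n levels
    right = merge_sort(lst[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # O(n) merge per level
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```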

44
Q

Polynomial Complexity

A
  • most are quadratic n^2
  • commonly for
    • nested loops
    • recursive functions calls
45
Q

O() for nested loops

(see file “example-analysing-complexity.py” in week-6)

A
  • computes n^2 very inefficiently
  • look at the ranges
  • nested loops, each iterating n times
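
The referenced file is not reproduced here; the sketch below (the name is my own) shows the kind of function described: computing n^2 by pure counting, with two nested loops of n iterations each, hence O(n^2).

```python
def square(n):
    # n iterations of an n-iteration loop: n * n additions, O(n^2)
    result = 0
    for _ in range(n):
        for _ in range(n):
            result += 1
    return result
```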
46
Q

When input is a list

A

Must define what size of input means

  • previously it was the magnitude of a number
  • here it is the length of list
47
Q

Common Python functions

List

A

index, store, length, append
=> O(1)
==, remove, copy, reverse, iteration, in list
=> O(n)

48
Q

Common Python functions

Dictionaries

A

Worst Case:
index, store, length, delete, iteration
=> O(n)
The average case, however, is often O(1)

49
Q

Big Oh summary

compare efficiency of algorithms

A
  • notation that describes growth
  • lower order of growth is better
  • independent of machine or specific implementation
50
Q

Big Oh summary

use Big Oh

A
  • describe order of growth
  • asymptotic notation
  • upper bound
  • worst case analysis
51
Q

Notes on Big Oh

A

https://prod-edxapp.edx-cdn.org/assets/courseware/v1/b874db7cc14e722ce7b921d11d944026/asset-v1:MITx+6.00.1x+2T2019+type@asset+block/handouts_Big_O.pdf