Order of Growth and Asymptotic Notation Flashcards

1
Q

what are the most commonly encountered orders of growth (in best to worst order)

A
constant
logarithmic
linear
linear logarithmic
quadratic
cubic
exponential
2
Q

what does a linear logarithmic function look like

A

N log N

3
Q

what are the best orders of growth for large input sizes

A

constant
logarithmic
linear
linear logarithmic

4
Q

what does a constant run time mean

A

doesn’t change no matter what the input size is

5
Q

example of a logarithmic run time algorithm

A

binary search

6
Q

example of linear run time algorithm

A

a for loop whose counter increments by a constant amount each iteration
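A minimal Python sketch of such a loop (not part of the original deck; `linear_sum` is an illustrative name):

```python
def linear_sum(values):
    # One pass over the input: the loop body runs once per element,
    # so the running time grows linearly with the input size, Θ(N).
    total = 0
    for v in values:
        total += v
    return total
```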

7
Q

example of linear logarithmic algorithm

A

mergesort
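As a sketch, the textbook merge sort in Python: halve the list, sort each half recursively, then merge the two sorted halves in linear time, for Θ(N log N) overall:

```python
def merge_sort(a):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # sort each half recursively
    right = merge_sort(a[mid:])
    # Merge step: one comparison per appended element, so linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```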

8
Q

example of a quadratic or cubic order of growth run time algorithm

A

nested loops
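A hedged Python sketch (the function name is illustrative): two nested loops over the same N items give N × N iterations, i.e. quadratic time; a third level of nesting would make it cubic:

```python
def count_zero_pairs(values):
    # Nested loops: for each of N outer iterations, the inner loop
    # runs N times, so the total work is Θ(N^2).
    count = 0
    for x in values:
        for y in values:
            if x + y == 0:
                count += 1
    return count
```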

9
Q

example of exponential order of growth run time algorithm

A

combinatorial search
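One common illustration (an assumption, not from the deck) is brute-force subset search: a set of N items has 2^N subsets, so trying them all is exponential in N:

```python
from itertools import combinations

def subset_sum(values, target):
    # Brute-force combinatorial search: enumerate every subset of
    # `values` (2^N of them) until one sums to `target`.
    for r in range(len(values) + 1):
        for subset in combinations(values, r):
            if sum(subset) == target:
                return subset
    return None
```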

10
Q

example of an algorithm with constant run time

A

statements and assignments

11
Q

how does binary search work

A

the array must be sorted
keep low, high, and mid indices
repeatedly halve the search range: if the middle element is greater than the target, search the lower half; if it is less, search the upper half; stop when the middle element equals the target (or the range becomes empty)
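The steps above can be sketched in Python (assuming a sorted list; returns the index of the target, or -1 if it is absent):

```python
def binary_search(sorted_values, target):
    # Each iteration halves the search range, so at most
    # about log2(N) comparisons are needed: O(log N).
    low, high = 0, len(sorted_values) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_values[mid] == target:
            return mid           # middle element is the target
        elif sorted_values[mid] < target:
            low = mid + 1        # target can only be in the upper half
        else:
            high = mid - 1       # target can only be in the lower half
    return -1                    # range is empty: target not present
```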

12
Q

can algorithms have more than one order of growth

A

yes, it may change depending on the input size

13
Q

which order of growth of an algorithm is the most important

A

the asymptotic one

i.e. the one as N tends to infinity

14
Q

what are the three symbols used for describing asymptotic order of growth

A

Θ
O
Ω

15
Q

Big theta shows what order of growth

A

the asymptotic order of growth

used to classify algorithms

16
Q

when is big o used

A

to show the upper bound of an algorithm

17
Q

when is big omega used

A

to show the lower bound of an algorithm

18
Q

Every piece of code should have a big omega of

A

Ω(1)

i.e. a constant running time is a trivial lower bound for every piece of code

19
Q

worst case analysis (NOT BIG O) can be found by using what type of input

A

the most difficult, largest input

20
Q

the average case analysis (NOT BIG Θ) can be found using what input

A

random

21
Q

what is the upper bound of an algorithm

A

the performance guarantee of the algorithm

i.e. the algorithm cannot do any worse than this

22
Q

what is the lower bound of an algorithm

A

proof that the algorithm cannot perform any better than that

23
Q

when is an algorithm optimal

A

when the lower bound = the upper bound

you can prove that no other algorithm can run with a lower worst-case running time than the one you have made

will grow the slowest as N tends to infinity

24
Q

what is the runtime for copying an array into a new space

A

Θ(N)

N = size of the array
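A minimal illustrative sketch of why the copy is Θ(N) (the function name is an assumption, not from the deck):

```python
def copy_array(a):
    # One write per element: the loop runs exactly N times for an
    # array of size N, so copying is Θ(N).
    b = [None] * len(a)
    for i in range(len(a)):
        b[i] = a[i]
    return b
```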

25
Q

putting the easiest possible input into an algorithm will give

A

the lower bound (best case)

26
Q

is it more common to count swaps or comparisons when coming up with the time complexity of a sorting or searching algorithm

A

comparisons

This is because comparisons have similar cost to swaps and usually there are more comparisons than swaps

27
Q

O(f + g) =

A

O(max(f, g))

28
Q

What does it mean when we say that an algorithm X is asymptotically more efficient than Y?

A

X will always be a better choice for sufficiently large inputs

29
Q

Algorithms A and B have worst-case running times of O(n) and O(log n), respectively. Therefore, algorithm B always runs faster than algorithm A.

True
False

A

False

The Big-O notation provides an asymptotic comparison of the running times of algorithms. For n < n₀, for instance, algorithm A might run faster than algorithm B.