Part 1 (All but hashing) Flashcards

1
Q

What is an ADT?

A

An abstract data type is a specification of a data structure in terms of what we can do with it (its functionality) rather than what it actually is (its specific representation)

2
Q

The implementation affects (often dramatically) the … of the solution method, but not the … of the solution.

A

Efficiency

correctness

3
Q

What is encapsulation?

A

Ideally the user of an ADT will not need to know, nor be able to influence, the actual representation of data

(hiding data from user and thus preventing alteration)

4
Q

What are some example ADTs?

A
  • List
  • Queue
  • Stack
  • Dictionary
  • Tree
  • Graph
5
Q

What are some example data structures?

A
  • array
  • linked list
  • binary search tree
  • hash table
6
Q

What are the 3 main characteristics of an algorithm?

A
  1. Domain of definition: set of legal inputs
  2. Correctness: it gives correct output for each legal input. This depends on the problem we are trying to solve, and can be tricky to prove
  3. Resource use: usually computing time and memory space.
    • This depends on the input, and on the implementation (hardware, programmer skill, compiler, language, …).
    • It usually grows as the input size grows.
    • There is a tradeoff between resources (for example, time vs space).
    • Running time is usually more important than space use.
7
Q

How running time scales with problem size

A

Table: Relative growth of running time T(n) when the input size increases from n = 10 to n = 1000 provided that T(10) = 1.

8
Q

How problem size scales with running time

A

Table: The largest data sizes n that can be processed by an algorithm with time complexity Θ(f(n)) provided that T(10) = 1 minute

9
Q

Estimating running time: harder example

A
10
Q

worst-case versus average-case analysis

A

A good worst-case bound is always useful, but it is just a first step, and we should aim to refine the analysis for important algorithms. Average-case analysis is often more practically useful, provided the algorithm will be run on “random” data

11
Q

For large enough n, a linear time algorithm will always beat a quadratic time one

true/false?

A

true

12
Q

Why do constants need to be taken into account at times?

A

An algorithm with running time 10⁻¹⁰n² is probably better in practice than one with running time 10¹⁰n: the linear-time algorithm does eventually beat the quadratic one, but only on instances of size at least 10²⁰, which are rarely met in practice

13
Q

What can be said about f is O(g)?

A

“f is Big-Oh of g”

if there is some C > 0 and some n₀ ∈ ℕ such that for all n ≥ n₀, f(n) ≤ Cg(n)

f grows at most as fast as g

14
Q

What can be said about f is Ω(g)?

A

“f is big-Omega of g”

if g is O(f)

f grows at least as fast as g

15
Q

What can be said about f is Θ(g)?

A

“f is big-Theta of g”

if f is O(g) and g is O(f)

f grows at the same rate as g

16
Q

Rules for asymptotic notation

A

(irrelevance of constant factors)

if c > 0 is constant then cf is Θ(f)

(transitivity)

If f is O(g)/Ω(g)/Θ(g) and g is O(h)/Ω(h)/Θ(h), then f is O(h)/Ω(h)/Θ(h)

(sum rule)

If f₁ is O(g₁) and f₂ is O(g₂) then f₁ + f₂ is O(max{g₁, g₂})

(product rule)

If f₁ is O(g₁) and f₂ is O(g₂) then f₁f₂ is O(g₁g₂)

(limit rule)

Suppose that L := lim_{n→∞} f(n)/g(n) exists

Then

  • if L = 0 then f is O(g) and f is not Ω(g);
  • if 0 < L < ∞ then f is Θ(g);
  • if L = ∞ then f is Ω(g) and f is not O(g)
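
A quick worked example of the limit rule (my own illustration, not from the notes): take f(n) = 3n² + 5n and g(n) = n². Then

lim_{n→∞} f(n)/g(n) = lim_{n→∞} (3 + 5/n) = 3

and since 0 < 3 < ∞, f is Θ(n²).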
17
Q

Write f ≺ g if f is O(g) but f is not Θ(g), so g grows at a faster rate than f.

List runtimes

A

log n ≺ (log n)² ≺ √n ≺ n ≺ n log n ≺ n(log n)² ≺ n² ≺ n³ ≺ · · · ≺ (1.5)ⁿ ≺ 2ⁿ ≺ n! ≺ nⁿ

18
Q

Commonly occurring f(n) are log n, n, n log n, n², n³ and 2ⁿ, and each grows considerably faster than the previous one

which 2 have the smallest gap?

A

gap between n and n log n is the smallest

19
Q

What is a list and what are some of its basic operations?

A

A list (or sequence) is an ordered collection of items

basic operations:

  • return the size of the list; if this is zero, list is empty;
  • retrieve the item in a given position;
  • delete an item in a given position;
  • insert an item at a given position;
  • given the position of an item, find the position of the next item, if it exists (this gives an iterator over the data structure).
20
Q

How do size and capacity differ?

A

size is the number of elements in an array

capacity is the total number of places in an array

21
Q

In what runtime do retrieval, deletion and insertion occur for a simple array?

A

Retrieval takes time Θ(1) - (this is notation for constant time)

Deletion takes time Θ(n − i) if the deleted element was in position i

Insertion takes time Θ(n − i) if the inserted element ends up in position i, not counting the case where we need to increase capacity
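
A minimal C sketch (the function names are my own, not from the notes) showing why insertion and deletion at position i cost Θ(n − i): every element after position i has to be shifted by one place.

```c
#include <stddef.h>

/* Insert x at position i by shifting a[i..size-1] up one place: Θ(n - i) moves.
   Assumes the array already has spare capacity for one more element. */
void list_insert(int *a, size_t *size, size_t i, int x)
{
    for (size_t j = *size; j > i; j--)
        a[j] = a[j - 1];
    a[i] = x;
    (*size)++;
}

/* Delete the element at position i by shifting a[i+1..size-1] down one place: Θ(n - i). */
void list_delete(int *a, size_t *size, size_t i)
{
    for (size_t j = i; j + 1 < *size; j++)
        a[j] = a[j + 1];
    (*size)--;
}
```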

22
Q

dynamically allocated memory is from the…?

and

automatically allocated memory is from the…?

A

memory heap

and

memory stack (smaller)

23
Q

what does malloc do?

A

dynamically allocates memory from the heap

24
Q

what does realloc do?

A

resizes a dynamically allocated block of memory, preserving its contents (possibly moving it to a new location)

25
Q

what does free do?

A

releases dynamically allocated memory back to the heap, preventing memory leaks
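
A small C example (illustrative only) tying the three calls together: malloc obtains a block from the heap, realloc resizes it, and free returns it so the memory is not leaked.

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* malloc: dynamically allocate heap space for 4 ints */
    int *a = malloc(4 * sizeof *a);
    if (a == NULL) return 1;
    for (int i = 0; i < 4; i++) a[i] = i;

    /* realloc: grow the block to hold 8 ints (existing contents are kept) */
    int *tmp = realloc(a, 8 * sizeof *a);
    if (tmp == NULL) { free(a); return 1; }
    a = tmp;
    for (int i = 4; i < 8; i++) a[i] = i;

    for (int i = 0; i < 8; i++) printf("%d ", a[i]);
    printf("\n");

    /* free: release the block, otherwise the memory would leak */
    free(a);
    return 0;
}
```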

26
Q

Suppose that we start from empty and insert n items into a list, at the end (so there is no moving of items needed) what is the time complexity if we do not account for capacity increase?

A

Θ(n)

27
Q

If we increase capacity by 1 every time the list is full and we need to insert another element what is the time complexity?

A

we perform work in Θ(1 + 2 + · · · + n), namely Θ(n²).

The same is true for any (“arithmetic”) increase by a constant

28
Q

If we double each time the list is full and we need to insert another element what is the time complexity?

A

we perform work in Θ(1 + 2 + 4 + · · · + 2ᵏ), where 2ᵏ is the largest power of 2 less than n. This simplifies to Θ(n)

(“the amortized cost of insertion is Θ(1)”)

note: The worst case for an insertion is Θ(n), but we can do n in a row at average cost Θ(1)
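
A C sketch of the doubling strategy (the struct and function names are hypothetical, not from the notes): an individual append may trigger a Θ(n) copy, but n appends in a row cost Θ(n) in total, i.e. amortized Θ(1) each.

```c
#include <stdlib.h>

typedef struct {
    int    *data;
    size_t  size;      /* number of elements stored */
    size_t  capacity;  /* number of slots allocated */
} DynArray;            /* initialise as (DynArray){NULL, 0, 0} */

/* Append x at the end, doubling capacity whenever the array is full.
   Returns 0 on success, -1 if allocation fails. */
int dyn_append(DynArray *d, int x)
{
    if (d->size == d->capacity) {
        size_t newcap = d->capacity ? 2 * d->capacity : 1;   /* double, or start at 1 */
        int *tmp = realloc(d->data, newcap * sizeof *tmp);
        if (tmp == NULL) return -1;
        d->data = tmp;
        d->capacity = newcap;
    }
    d->data[d->size++] = x;
    return 0;
}
```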

29
Q

Finding an item in an array-based list: if the array elements are general, we must simply search every element starting from the beginning.

What, then, are the worst-case and average-case runtimes?
A

It takes Θ(n) work in the worst case, and also on average if the data is random

30
Q

If the array elements are kept sorted (this is a different ADT, the SortedList), what type of search could we use? What is its worst-case runtime?

A

binary search

Θ(log n) in the worst case
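
A standard iterative binary search in C (a sketch; assumes the array is sorted in increasing order): each step halves the remaining range, giving Θ(log n) comparisons in the worst case.

```c
#include <stddef.h>

/* Return the index of key in the sorted array a[0..n-1], or -1 if it is absent. */
long binary_search(const int *a, size_t n, int key)
{
    size_t lo = 0, hi = n;              /* search the half-open range [lo, hi) */
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (a[mid] == key)
            return (long)mid;
        else if (a[mid] < key)
            lo = mid + 1;               /* key can only be in the right half */
        else
            hi = mid;                   /* key can only be in the left half  */
    }
    return -1;                          /* not found */
}
```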

31
Q

When we delete we should decrease the size of the list. But the capacity may stay much bigger than the size unless we explicitly reduce it.

What is one way in which we can do this reduction?

A

One way is to reduce the capacity by half when the list is less than 1/4 full. This increases the (amortized) cost of deletion by only a constant factor

32
Q

What are pros and cons of using a linked list?

A

Pros: no need to change array size; insert and delete operations are fast

Cons: random access to an element is slow, so retrieve and find operations take longer

33
Q

Linked list operations and how they’re performed

A

To retrieve/find an element, we must follow pointers from the beginning until we get to it

To insert an element at position i, we must reset the pointer from item i − 1 to point to the new element instead of to the former occupant x of position i, and make the new object point to x (which is now at position i + 1)

To delete an element, make the incoming pointer point to its successor instead (how to find that pointer?). If the node is at the beginning or end, a slight variation of this is needed

Remember to free the memory, or our old node may take up space forever
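
A C sketch of a singly linked list node with insertion and deletion after a given node (helper names are my own): both operations only rewire pointers, so they are Θ(1) once the position is known, but reaching position i still means following i pointers from the head.

```c
#include <stdlib.h>

typedef struct Node {
    int          item;
    struct Node *next;
} Node;

/* Insert a new node holding x immediately after node p: Θ(1). */
Node *insert_after(Node *p, int x)
{
    Node *n = malloc(sizeof *n);
    if (n == NULL) return NULL;
    n->item = x;
    n->next = p->next;       /* new node points to p's old successor */
    p->next = n;             /* p now points to the new node */
    return n;
}

/* Delete the node following p (if any), remembering to free it. */
void delete_after(Node *p)
{
    Node *victim = p->next;
    if (victim == NULL) return;
    p->next = victim->next;  /* bypass the deleted node */
    free(victim);            /* avoid a memory leak */
}
```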

34
Q

Performance of basic list operations

A
35
Q

What is a stack?

A

A stack is a linear container that allows us to insert, delete and retrieve only at one end.

There is no general find operation

Intuitive idea: a stack of plates in a cafeteria. A stack is LIFO (last in, first out)

36
Q

What are the names of the operations for stacks?

A

The insert operation is usually called push

delete is pop

retrieve is peek

isempty tells us whether the stack is empty

new creates an empty stack

note: Stacks are inherently tied to the idea of recursion
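
A minimal array-based stack in C (made-up names, fixed capacity to keep it short) providing the operations named above; each one is Θ(1).

```c
#include <stdbool.h>

#define STACK_CAP 100

typedef struct {
    int items[STACK_CAP];
    int top;                  /* number of items currently on the stack */
} Stack;

void stack_new(Stack *s)            { s->top = 0; }           /* create an empty stack */
bool stack_isempty(const Stack *s)  { return s->top == 0; }

bool stack_push(Stack *s, int x)    /* returns false if the stack is full */
{
    if (s->top == STACK_CAP) return false;
    s->items[s->top++] = x;
    return true;
}

int stack_pop(Stack *s)             { return s->items[--s->top]; }   /* caller checks isempty first */
int stack_peek(const Stack *s)      { return s->items[s->top - 1]; }
```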

37
Q

What is a queue?

A

A queue is a linear container that allows us to insert at one end, and delete and retrieve only at the other end.

There is no general find operation

Intuitive idea: a queue of people waiting to buy tickets. A queue is FIFO (first in, first out)

38
Q

what is a priority queue?

A

A priority queue is a container ADT where each element has a key (usually a real number) called its priority. There are operations allowing us to insert an element, and to find and delete the element of highest priority. A general find or delete or retrieve is not available

39
Q

Priority queues can be implemented in many ways what are some ways?

A

unsorted list, sorted list, binary heap, binomial heap

40
Q

convert this infix expression to postfix form

1 + 2 − 3 ∗ 4

A

1 2 + 3 4 ∗ − (postfix)

Note: To evaluate a postfix expression, we can use a stack. Scan from left to right, and push each operand onto the stack. When we encounter an operator, pop the correct number of arguments from the stack, evaluate, and push the result back in. If this fails, an error has been made in the postfix notation
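
A C sketch of the stack-based evaluation just described (single-digit operands only, and the function name is illustrative): operands are pushed, and each operator pops two values, applies itself, and pushes the result back.

```c
#include <ctype.h>
#include <stdio.h>

/* Evaluate a postfix expression such as "12+34*-" using an explicit stack. */
int eval_postfix(const char *expr)
{
    int stack[100];
    int top = 0;
    for (const char *p = expr; *p != '\0'; p++) {
        if (isdigit((unsigned char)*p)) {
            stack[top++] = *p - '0';             /* push operand */
        } else if (*p == '+' || *p == '-' || *p == '*') {
            int b = stack[--top];                /* right operand */
            int a = stack[--top];                /* left operand  */
            int r = (*p == '+') ? a + b : (*p == '-') ? a - b : a * b;
            stack[top++] = r;                    /* push the result back */
        }
    }
    return stack[top - 1];                       /* final value left on the stack */
}

int main(void)
{
    printf("%d\n", eval_postfix("12+34*-"));     /* 1 2 + 3 4 * -  evaluates to -9 */
    return 0;
}
```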

41
Q

Consider all possible output orders formed by pushing each of 1, 2, . . . , n onto a stack, and popping them all off

When n = 3 there is exactly one of the 6 possible orders that is impossible which one is it?

A

3 1 2

Not possible: push 1, push 2, push 3, pop 3; now 2 sits on top of 1, so we cannot pop 1 next without first popping 2