Unit 4: Dynamic programming and dictionary data structures Flashcards

1
Q

Divide and conquer

Recurrence

A

A recurrence consists of:
- Initial conditions T(n) = cₙ for some small n
- Recursive equation that describes T(n) in terms of n and T(m) for some m<n.

2
Q

Examples of recurrences:
Fibonacci numbers

A

F(0) = 0
F(1) = 1
F(n) = F(n - 1) + F(n - 2) for all n>1
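As an illustration (an addition, not part of the original card), the recurrence translates directly into memoized code:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # memoization turns the exponential recursion into O(n)
def fib(n: int) -> int:
    if n < 2:  # initial conditions F(0) = 0, F(1) = 1
        return n
    return fib(n - 1) + fib(n - 2)  # F(n) = F(n - 1) + F(n - 2)
```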

3
Q

Examples of recurrences:
Towers of Hanoi

A

T(0) = 0
T(n) = 2 · T(n - 1) + 1 for all n > 0
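As a sketch (not part of the original card), the recurrence can be evaluated directly; its closed form is 2ⁿ - 1:

```python
def hanoi_moves(n: int) -> int:
    """Evaluate T(n) = 2 * T(n - 1) + 1 with T(0) = 0."""
    if n == 0:
        return 0
    return 2 * hanoi_moves(n - 1) + 1
```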

4
Q

Examples of recurrences:
quick sort

A

T(1) = 0
T(n) = n + T(n - 1) for all n > 1
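A small sketch (an addition, not from the original card) evaluating this recurrence; the closed form is n(n + 1)/2 - 1:

```python
def qs_worst(n: int) -> int:
    """Evaluate T(1) = 0, T(n) = n + T(n - 1) iteratively."""
    t = 0
    for k in range(2, n + 1):  # unroll the recurrence: T(n) = 2 + 3 + ... + n
        t += k
    return t
```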

5
Q

Master method applies to recurrences of the form:

A

T(n) = a T( n / b ) + f(n)
where a ≥ 1 and b > 1 are constants and f(n) > 0 for almost all n.

6
Q

Master method

Critical exponent

A

log_b a

7
Q

Master theorem

g(n)

A

g(n) = n^{c_crit}

Informally, g(n) is the total work contributed by the subproblems (the leaves of the recursion tree); the three cases compare it against f(n), the work of splitting into and combining the subproblems.

8
Q

Master theorem

<< and >>

A

<< and >> stand for smaller / larger by a polynomial gap n^ε for some ε > 0

9
Q

Master theorem

3 Cases

A

Case 1:
f(n) = O(n^c), where c < c_crit
» T(n) = Θ(g(n))

Case 2:
f(n) = Θ(g(n) logᵏ n) for some k ≥ 0
» T(n) = Θ(f(n) log n)

Case 3:
f(n) = Ω(n^c), where c > c_crit, and additionally a · f(n / b) ≤ d · f(n) for some constant d < 1 and all sufficiently large n
» T(n) = Θ(f(n))

T(n) = a T(n / b) + f(n)
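A worked example (an addition, not from the original card): merge sort satisfies T(n) = 2 T(n / 2) + n, so a = 2, b = 2, c_crit = log₂ 2 = 1 and g(n) = n. Since f(n) = n = Θ(g(n) log⁰ n), case 2 applies with k = 0, giving T(n) = Θ(f(n) log n) = Θ(n log n).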

10
Q

Dynamic Programming - Principles / Foundations

Dynamic programming

A

Divide the problem into a sequence (hierarchy) of (overlapping) subproblems and compute and store solutions for larger and larger subproblems with the help of the saved solutions.

11
Q

Dynamic Programming - Principles / Foundations

Bellman’s principle of optimality

A

An optimal policy (set of decisions) has the property that, whatever the initial state and the first decision are, the remaining decisions must constitute an optimal policy with regard to the state resulting from the first decision.

12
Q

Shortest path (negative edge weights)

Bellman’s equations

A

Let G=(V, E) be a directed graph without negative cycles and edge weights w : E → Q and t ∈ V .

The following holds for the length OPT(v) of a shortest (simple) v-t path P in G:
- OPT(t) = 0
- OPT(v) = min_{ u ∈ V\{v} } { w(v, u) + OPT(u) }, ∀ v ∈ V, v ≠ t.
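A sketch of solving these equations iteratively in the style of Bellman–Ford (an addition; the function and variable names are assumptions, not from the card):

```python
def opt_to_t(nodes, edges, t):
    """Solve OPT(v) = min over edges (v, u) of w(v, u) + OPT(u), with OPT(t) = 0.
    edges: dict mapping (v, u) -> weight w(v, u); assumes no negative cycles."""
    INF = float("inf")
    opt = {v: INF for v in nodes}
    opt[t] = 0
    for _ in range(len(nodes) - 1):  # a shortest simple path has < |V| edges
        for (v, u), w in edges.items():
            if opt[u] + w < opt[v]:
                opt[v] = opt[u] + w
    return opt
```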

13
Q

Dictionary

A

A dictionary is a set of elements of a common type that supports the following operations:
- Search
- Insert
- Removal

Every element has a key k and some associated data.

We assume that k ∈ N.

Every element is uniquely identified by its key.

14
Q

Search Trees

in general

A

Search trees exist in many variants and usually allow the efficient implementation of the following operations:
- Insertion of a new element.
- Searching an element (using the key).
- Removal of an element (using the key).
- Sequential iteration over all elements.
- Find a smallest / largest element.
- Find the next smallest / largest element.

15
Q

Rooted trees

Depth of a node

A

The depth of a node is the number of edges on the path from the root to the node.

The depth is unique since such a path exists and is unique.

The depth of the root is 0.

The set of nodes having the same depth in a tree is also referred to as a level of the tree.

16
Q

Rooted trees

Height of a (sub-)tree or node

A
  • The height h(Tᵣ) of a (sub-)tree Tᵣ with root r is the length of a longest path from r to any leaf of Tᵣ.
  • A tree consisting only of the root node has height 0.
  • The height of the empty tree is defined as -1.
17
Q

Rooted trees

Leaf

A

A node is a leaf if it does not have any children.

All other nodes are referred to as inner nodes.

18
Q

Tree traversal

inorder traversal

A

Consider recursively
- first the left subtree,
- then the root and
- then the right subtree (at every node).

input: root r of a (sub-)tree.
procedure inorder(r):
- if r ≠ nil then:
  - inorder(r.left)
  - print(r) // Or any other operation on r
  - inorder(r.right)
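The pseudocode above can be sketched in Python (an addition, not from the original card; nil is represented by None):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def inorder(r, visit):
    """Visit the left subtree, then the node itself, then the right subtree."""
    if r is not None:
        inorder(r.left, visit)
        visit(r)
        inorder(r.right, visit)
```

On a binary search tree, collecting the keys this way yields them in sorted order.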

19
Q

Tree traversal

Run-time of inorder tree traversal

A

The run-time for a tree having n nodes is Θ(n) since there is one recursive call per node plus a maximum of 2n additional calls for empty subtrees and every call requires Θ(1).

20
Q

Tree traversal

Preorder traversal

A

Consider recursively:
- first the root,
- then the left subtree and
- then the right subtree.

21
Q

Tree traversal

Postorder traversal

A

Consider recursively:
- first the left subtree,
- then the right subtree,
- and then the root.

22
Q

Traversals: theorem

A

A binary tree can be uniquely reconstructed from its inorder and preorder traversals if and only if all nodes (keys) are pairwise distinct.

The same applies for the combination of inorder and postorder; however, the same does not apply for preorder and postorder.

23
Q

Binary search tree

A

A binary tree that satisfies the following **search tree property** for every node x:
- if y is a node in the left subtree of x, then: y.key ≤ x.key
- if z is a node in the right subtree of x, then: z.key ≥ x.key

24
Q

Search

on a Binary search tree

A

input: root r of a binary search tree and key k.
output: tree node with k if it exists, otherwise nil.

procedure search(r, k):
- while r ≠ nil and r.key ≠ k do:
  - if k < r.key then:
    - r ← r.left
  - else:
    - r ← r.right
- return r
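A Python sketch of this search (an addition, not from the original card):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def search(r, k):
    """Descend from the root: go left if k is smaller, right otherwise."""
    while r is not None and r.key != k:
        r = r.left if k < r.key else r.right
    return r  # the node holding k, or None if k is absent
```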

25
Q

Minimum

on a binary search tree

A

input: root r of a binary search tree.
output: tree node with smallest key.

procedure minimum(r):
- if r = nil then return nil
- while r.left ≠ nil do:
  - r ← r.left
- return r

Idea: Continue to go down the left child until that is no longer possible.

26
Q

Maximum

on a binary search tree

A

input: root r of a binary search tree.
output: tree node with largest key.

procedure maximum(r):
- if r = nil then return nil
- while r.right ≠ nil do:
  - r ← r.right
- return r

Idea: Continue to go down the right child until that is no longer possible.

27
Q

Successor

on a binary search tree

A

input: node n of a binary search tree.
output: successor of n according to inorder-traversal or nil.

procedure successor(n):
- if n.right ≠ nil then return minimum(n.right)
- else:
  - p ← n.parent
  - while p ≠ nil and n = p.right do:
    - n ← p
    - p ← p.parent
  - return p
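A Python sketch of successor with parent pointers (an addition, not from the original card):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

def minimum(r):
    while r.left is not None:
        r = r.left
    return r

def successor(n):
    """Inorder successor: smallest key in the right subtree if one exists,
    otherwise the lowest ancestor whose left subtree contains n."""
    if n.right is not None:
        return minimum(n.right)
    p = n.parent
    while p is not None and n is p.right:
        n, p = p, p.parent
    return p  # None if n holds the largest key
```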

28
Q

Predecessor

on a Binary Search Tree

A

input: node n of a binary search tree.
output: predecessor of n according to inorder-traversal or nil.

procedure predecessor(n):
- if n.left ≠ nil then return maximum(n.left)
- else:
  - p ← n.parent
  - while p ≠ nil and n = p.left do:
    - n ← p
    - p ← p.parent
  - return p

29
Q

Search Tree Operations

Worst-Case Run-time of Search Tree Operations

A
  • The worst-case run-time for all operations is Θ(h) where h is the height of the search tree.
  • That means that the run-time for each of these operations depends on the height of the tree, which can be O(n) in the worst case where all n nodes are arranged along a path in the search tree.
30
Q

Run-time of operations for a complete balanced tree

A
  • All inner nodes apart from those on the lower level have 2 children.
  • Such a tree has height h(T) = ⌊log₂ n⌋ and therefore all operations (apart from traversals) require only O(log n) time.
31
Q

Sufficiently balanced search tree

A

Using suitable conditions and corresponding reorganization operations, the search tree is efficiently balanced to guarantee that its height remains logarithmic.

32
Q

3 Possibilities for sufficiently balanced search trees.

A
  • Height-balanced trees: The height of the two subtrees of every node differs by at most a constant.
  • Weight-balanced trees: The number of nodes in the subtrees of every node differs by at most a constant factor.
  • (a, b)-trees (2 ≤ a ≤ b): Every node (apart from the root) has between a and b children and all leaves have the same distance to the root.
33
Q

AVL-trees

A

The first description of these trees goes back to Adel’son-Vel’skiĭ and Landis in 1962.

Idea: Avoid degeneration by forcing a small difference between the height of the subtrees below every node.

Critical operations: Only insertion and removal are critical operations and require special treatment. All other operations work as before.

34
Q

AVL-trees

Balance

A

Balance of a node v: bal(v) = h₂ - h₁, where:
- h₁ - height of the left subtree of v.
- h₂ - height of the right subtree of v.

A node v is balanced if bal(v) ∈ {-1, 0, 1}
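A sketch computing the balance from the heights (an addition, not from the original card; the empty tree has height -1 as defined above):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(r):
    """Height of a (sub-)tree; the empty tree (None) has height -1."""
    if r is None:
        return -1
    return 1 + max(height(r.left), height(r.right))

def bal(v):
    """bal(v) = h2 - h1: height of the right subtree minus height of the left."""
    return height(v.right) - height(v.left)
```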

35
Q

AVL-condition

A

An AVL-tree is a binary search tree, where all nodes are balanced.

For every node v it holds that: the height of the left subtree differs from the height of the right subtree by at most 1.

The height of the empty tree is defined as -1.

36
Q

AVL-replacement

A

An AVL-replacement:
- is an operation (e.g. insertion, removal)
- which replaces a subtree T of a node
- by a modified (valid) AVL-tree T*,
- whose height differs from that of T by at most 1.

37
Q

AVL Analysis

A

Rotation: The time requirement for the execution of a single simple rotation or a single double rotation is constant.

Worst-case: On insertion and removal, we may need (double) rotations along the path from the affected position to the root in the worst case.

Run-time: The run-times of insertion and removal are O(log n).