Unit 4: Dynamic programming and dictionary data structures Flashcards
Divide and conquer
Recurrence
A recurrence consists of:
- Initial conditions T(n) = cₙ for some small n.
- A recursive equation that describes T(n) in terms of n and T(m) for some m < n.
Examples of recurrences:
Fibonacci numbers
F(0) = 0
F(1) = 1
F(n) = F(n - 1) + F(n - 2) for all n>1
Examples of recurrences:
Towers of Hanoi
T(0) = 0
T(n) = 2 · T(n - 1) + 1 for all n > 0
Examples of recurrences:
quicksort (worst case)
T(1) = 0
T(n) = n + T(n - 1) for all n > 1
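As a sketch (not part of the flashcards; function names are my own), the three example recurrences can be evaluated directly from their initial conditions and recursive equations:

```python
def fib(n):
    # F(0) = 0, F(1) = 1, F(n) = F(n-1) + F(n-2)
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def hanoi_moves(n):
    # T(0) = 0, T(n) = 2*T(n-1) + 1  (closed form: 2^n - 1)
    t = 0
    for _ in range(n):
        t = 2 * t + 1
    return t

def quicksort_worst(n):
    # T(1) = 0, T(n) = n + T(n-1)  (worst-case cost on an already sorted input)
    t = 0
    for m in range(2, n + 1):
        t += m
    return t
```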
Master method applies to recurrences of the form:
T(n) = a · T(n / b) + f(n)
where a ≥ 1 and b > 1 are constants and f(n) > 0 for almost all n.
Master method
Critical exponent
c_crit = log_b a
g(n) = n^{c_crit}
Informally, g(n) is related to the work of obtaining the subproblems.
Master theorem
<< and >>
<< and >> stand for smaller / larger by a polynomial factor n^ε for some ε > 0.
Master theorem
3 Cases
Case 1: f(n) = O(n^c), where c < c_crit
» T(n) = Θ(g(n))
Case 2: f(n) = Θ(g(n) · log^k n) for some k ≥ 0
» T(n) = Θ(f(n) · log n)
Case 3: f(n) = Ω(n^c), where c > c_crit (and f satisfies the regularity condition a · f(n/b) ≤ δ · f(n) for some constant δ < 1)
» T(n) = Θ(f(n))
T(n) = a T(n / b) + f(n)
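The three cases can be mechanized for the common situation f(n) = Θ(n^c · log^k n). The following is an illustrative sketch (my own helper, not from the flashcards); Case 3 additionally assumes the regularity condition holds:

```python
import math

def master_case(a, b, c, k=0):
    # Classify T(n) = a*T(n/b) + f(n) with f(n) = Theta(n^c * log^k n)
    # and return the resulting Theta-bound as a string.
    c_crit = math.log(a, b)          # critical exponent log_b a
    if c < c_crit:                   # Case 1: f polynomially smaller than g
        return f"Theta(n^{c_crit:g})"
    if math.isclose(c, c_crit):      # Case 2: same polynomial growth
        return f"Theta(n^{c_crit:g} * log^{k + 1} n)"
    # Case 3: f polynomially larger than g (regularity condition assumed)
    return f"Theta(n^{c:g} * log^{k} n)" if k else f"Theta(n^{c:g})"
```

For example, merge sort (a = 2, b = 2, f(n) = Θ(n)) falls into Case 2 and yields Θ(n log n).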
Dynamic Programming - Principles / Foundations
Dynamic programming
Divide the problem into a sequence (hierarchy) of (overlapping) subproblems and compute and store solutions for larger and larger subproblems with the help of the saved solutions.
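A minimal sketch of this idea (my own example, not from the flashcards): the Fibonacci recursion has overlapping subproblems, and storing each solved subproblem turns the exponential recursion into linear work.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    # Each subproblem F(m) is computed once; the cache stores the saved
    # solutions, which are reused when the same subproblem recurs.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```

The same effect can be obtained bottom-up by filling a table from small subproblems to large ones.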
Dynamic Programming - Principles / Foundations
Bellman’s principle of optimality
An optimal policy (set of decisions) has the property that whatever the initial state and decisions are, the remaining decisions must constitute an optimal policy with regard to the state resulting from the first decision.
Shortest path (negative edge weights)
Bellman’s equations
Let G = (V, E) be a directed graph without negative cycles, with edge weights w : E → Q, and let t ∈ V.
The following holds for the length OPT(v) of a shortest (simple) v-t path P in G:
- OPT(t) = 0
- OPT(v) = min_{ u : (v, u) ∈ E } { w(v, u) + OPT(u) }, ∀ v ∈ V, v ≠ t.
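Bellman's equations can be evaluated directly by memoized recursion. The sketch below is my own illustration and assumes the graph is acyclic so the recursion terminates; on general graphs without negative cycles one would use the Bellman-Ford algorithm instead:

```python
import math
from functools import lru_cache

def shortest_to(edges, t):
    # edges: dict mapping v -> list of (u, w(v, u)) pairs (assumed acyclic)
    @lru_cache(maxsize=None)
    def opt(v):
        if v == t:
            return 0                                  # OPT(t) = 0
        costs = [w + opt(u) for u, w in edges.get(v, [])]
        return min(costs, default=math.inf)           # OPT(v) = min { w(v,u) + OPT(u) }
    return opt

# Hypothetical example graph (edge weights may be negative, no cycles):
edges = {"s": [("a", 2), ("b", 5)], "a": [("b", -1), ("t", 4)], "b": [("t", 1)]}
opt = shortest_to(edges, "t")
```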
Dictionary
A dictionary is a set of elements of a common type that supports the following operations:
- Search
- Insert
- Removal
Every element has a key k and some associated data.
We assume that k ∈ N.
Every element is uniquely identified by its key.
Search Trees
in general
Search trees exist in many variants and usually allow the efficient implementation of the following operations:
- Insertion of a new element.
- Searching an element (using the key).
- Removal of an element (using the key).
- Sequential iteration over all elements.
- Find a smallest / largest element.
- Find the next smallest / largest element.
Rooted trees
Depth of a node
The depth of a node is the number of edges on the path from the root to the node.
The depth is unique since such a path exists and is unique.
The depth of the root is 0.
The set of nodes having the same depth in a tree is also referred to as a level of the tree.
Rooted trees
Height of a (sub-)tree or node
- The height h(Tᵣ) of a (sub-)tree Tᵣ with root r is the length of a longest path from r to any leaf of Tᵣ.
- A tree consisting only of the root node has height 0.
- The height of the empty tree is defined as -1.
Rooted trees
Leaf
A node is a leaf if it does not have any children.
All other nodes are referred to as inner nodes.
Tree traversal
inorder traversal
Consider recursively
- first the left subtree,
- then the root and
- then the right subtree (at every node).
input: root r of a (sub-)tree.
procedure inorder(r):
- if r ≠ nil then:
  - inorder(r.left)
  - print(r) // or any other operation on r
  - inorder(r.right)
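The inorder procedure transcribes directly into Python (a sketch; the minimal Node class is my own stand-in):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def inorder(r, visit):
    # Recurse left, visit the node, recurse right.
    if r is not None:
        inorder(r.left, visit)
        visit(r)            # e.g. print(r.key), or any other operation
        inorder(r.right, visit)
```

On a binary search tree, collecting the keys this way yields them in sorted order.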
Tree traversal
Run-time of inorder tree traversal
The run-time for a tree having n nodes is Θ(n), since there is one recursive call per node plus a maximum of 2n additional calls for empty subtrees, and every call requires Θ(1) time.
Tree traversal
Preorder traversal
Consider recursively:
- first the root,
- then the left subtree and
- then the right subtree.
Tree traversal
Postorder traversal
Consider recursively:
- first the left subtree,
- then the right subtree,
- and then the root.
Traversals: theorem
A binary tree can be uniquely reconstructed from its inorder and preorder traversals if and only if all nodes (keys) are pairwise distinct.
The same holds for the combination of inorder and postorder; however, it does not hold for preorder and postorder.
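A sketch of the reconstruction behind this theorem (my own illustration): with pairwise-distinct keys, the first preorder element is the root, and its position in the inorder sequence splits off the left and right subtrees.

```python
def build(preorder, inorder):
    # Returns the tree as nested tuples (key, left, right); None = empty tree.
    if not preorder:
        return None
    root = preorder[0]                 # preorder: root comes first
    i = inorder.index(root)            # split the inorder sequence at the root
    left = build(preorder[1:i + 1], inorder[:i])
    right = build(preorder[i + 1:], inorder[i + 1:])
    return (root, left, right)
```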
Binary search tree
A binary tree that satisfies the following **search tree property** for every node x:
- if y is a node in the left subtree of x, then: y.key ≤ x.key
- if z is a node in the right subtree of x, then: z.key ≥ x.key
Search
on a Binary search tree
input: root r of a binary search tree and key k.
output: tree node with key k if it exists, otherwise nil.
procedure search(r, k):
- while r ≠ nil and r.key ≠ k do:
  - if k < r.key then:
    - r ← r.left
  - else:
    - r ← r.right
- return r
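The search procedure in Python (a sketch; the minimal Node class is my own stand-in):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def search(r, k):
    # Walk down from the root, choosing a side by comparing keys,
    # until the key is found or an empty subtree is reached.
    while r is not None and r.key != k:
        r = r.left if k < r.key else r.right
    return r   # node with key k, or None if k is not in the tree
```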
Minimum
on a binary search tree
input: root r of a binary search tree.
output: tree node with the smallest key, or nil if the tree is empty.
procedure minimum(r):
- if r = nil then return nil;
- while r.left ≠ nil do:
  - r ← r.left
- return r
Idea: Continue to go down to the left child until that is no longer possible.
Maximum
on a binary search tree
input: root r of a binary search tree.
output: tree node with the largest key, or nil if the tree is empty.
procedure maximum(r):
- if r = nil then return nil;
- while r.right ≠ nil do:
  - r ← r.right
- return r
Idea: Continue to go down to the right child until that is no longer possible.
Successor
on a binary search tree
input: node n of a binary search tree.
output: successor of n according to inorder traversal, or nil.
procedure successor(n):
- if n.right ≠ nil then return minimum(n.right);
- else:
  - p ← n.parent
  - while p ≠ nil and n = p.right do:
    - n ← p
    - p ← p.parent
  - return p
Predecessor
on a Binary Search Tree
input: node n of a binary search tree.
output: predecessor of n according to inorder traversal, or nil.
procedure predecessor(n):
- if n.left ≠ nil then return maximum(n.left);
- else:
  - p ← n.parent
  - while p ≠ nil and n = p.left do:
    - n ← p
    - p ← p.parent
  - return p
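The successor procedure in Python (a sketch; the Node class with parent pointers and the attach helper are my own stand-ins):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

def attach(parent, child, side):
    # Hypothetical helper: link `child` as parent.left or parent.right.
    setattr(parent, side, child)
    child.parent = parent

def minimum(r):
    while r.left is not None:
        r = r.left
    return r

def successor(n):
    if n.right is not None:
        return minimum(n.right)        # leftmost node of the right subtree
    p = n.parent
    while p is not None and n is p.right:
        n, p = p, p.parent             # climb until we leave a left subtree
    return p                           # None if n holds the largest key
```

The predecessor procedure is symmetric (swap left/right and use maximum).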
Search Tree Operations
Worst-Case Run-time of Search Tree Operations
- The worst-case run-time for all operations is Θ(h), where h is the height of the search tree.
- That means the run-time of each of these operations depends on the height of the tree, which can be O(n) in the worst case, where all n nodes are arranged along a single path in the search tree.
Run-time of operations for a complete balanced tree
- All inner nodes apart from those on the lower level have 2 children.
- Such a tree has height h(T) = ⌊log₂ n⌋ and therefore all operations (apart from traversals) require only O(log n) time.
Sufficiently balanced search tree
Using suitable conditions and corresponding reorganization operations, the search tree is efficiently balanced to guarantee that its height remains logarithmic.
3 Possibilities for sufficiently balanced search trees.
- Height-balanced trees: The height of the two subtrees of every node differs by at most a constant.
- Weight-balanced trees: The number of nodes in the subtrees of every node differs by at most a constant factor.
- (a, b)-trees (2 ≤ a ≤ b): Every node (apart from the root) has between a and b children and all leaves have the same distance to the root.
AVL-trees
The first description of these trees goes back to Adelson-Velsky and Landis in 1962.
Idea: Avoid degeneration by forcing a small difference between the height of the subtrees below every node.
Critical operations: Only insertion and removal are critical operations and require special treatment. All other operations work as before.
AVL-trees
Balance
Balance of a node v: bal(v) = h₂ - h₁, where:
- h₁ is the height of the left subtree of v.
- h₂ is the height of the right subtree of v.
A node v is balanced if bal(v) ∈ {-1, 0, 1}.
AVL-condition
An AVL-tree is a binary search tree, where all nodes are balanced.
For every node v it holds that the height of the left subtree differs from the height of the right subtree by at most 1.
The height of the empty tree is defined as -1.
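A sketch of checking the AVL-condition (my own illustration; trees are represented as nested tuples (key, left, right), with None as the empty tree):

```python
def height(t):
    # Height of the empty tree is -1, matching the definition above.
    if t is None:
        return -1
    _, left, right = t
    return 1 + max(height(left), height(right))

def is_avl(t):
    # Every node must satisfy bal(v) = h2 - h1 in {-1, 0, 1}.
    if t is None:
        return True
    _, left, right = t
    bal = height(right) - height(left)
    return abs(bal) <= 1 and is_avl(left) and is_avl(right)
```

Recomputing heights makes this check quadratic; real AVL implementations store the height (or balance) in each node and update it during rotations.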
AVL-replacement
An AVL-replacement:
- is an operation (e.g. insertion, removal)
- which replaces a subtree T of a node
- by a modified (valid) AVL-tree T*,
- whose height differs from that of T by at most 1.
AVL Analysis
Rotation: The time requirement for the execution of a single simple rotation or a single double rotation is constant.
Worst-case: On insertion and removal, we need (double) rotations from the affected position up to the root of the tree in the worst case.
Run-time: The run-times of insertion and removal are O(log n).