Dynamic Programming Flashcards
Recursive Programming
Recursive problems can be solved naturally using recursion. Example: Fibonacci
def F(n): return 1 if n <= 2 else F(n-1) + F(n-2)
Simple recursive approaches are often very wasteful because we solve the same subproblems many times (an exponential number of times in the case of Fibonacci).
Dynamic Programming is an algorithmic paradigm
We will use pairwise alignment of sequence data
Recursive Programming
Many problems have recursive solutions: breaking them down into subproblems can lead to a solution of the overall problem.
Subproblems on the boundaries of the problem space (the smallest ones) often have trivial solutions
Factorial computation can be solved by a recursive equation.
Dynamic programming is much faster than naive recursive programming:
- the dynamic-programming runtime grows linearly with the input value (for Fibonacci)
- the naive recursive runtime grows exponentially with the input value
- dynamic programming saves each output you compute instead of re-computing it again
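The saved-output idea can be sketched bottom-up (a minimal sketch, assuming the 1-indexed convention F(1) = F(2) = 1):

```python
def fib(n):
    # Bottom-up dynamic programming: save each subproblem result in a
    # table instead of re-computing it, so the runtime grows linearly in n.
    table = [0] * (max(n, 2) + 1)
    table[1] = table[2] = 1
    for i in range(3, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```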
The challenging part of dynamic programming is deciding how to divide the problem into subproblems, and how and where to save their results in memory.
For Fibonacci, you save the results in an array.
Dynamic programming can turn an exponential-time algorithm into a polynomial-time one; for that, the number of subproblems should grow polynomially.
Viewing subproblems as vertices of a dependency graph, the runtime grows proportionally to the number of EDGES (not vertices), and |V| ≤ |E| ≤ (|V| choose 2).
Edit Distance
The minimum number of mismatches across all possible alignments.
Do we want to enumerate all possible alignments? Better: divide the problem into subproblems.
When thinking of subproblems, think of how values can be reduced and reused.
Aligning prefixes allows for recursion.
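The prefix idea can be sketched with the standard table formulation, where D[i][j] is the edit distance between the first i characters of x and the first j characters of y (unit costs for mismatches and gaps are an assumption here):

```python
def edit_distance(x, y):
    # D[i][j] = edit distance between x[:i] and y[:j].
    # Boundary subproblems are trivial: aligning against the empty
    # prefix costs one gap per character.
    m, n = len(x), len(y)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        D[i][0] = i
    for j in range(n + 1):
        D[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if x[i - 1] == y[j - 1] else 1
            D[i][j] = min(D[i - 1][j - 1] + cost,  # match/mismatch
                          D[i - 1][j] + 1,         # gap in y
                          D[i][j - 1] + 1)         # gap in x
    return D[m][n]
```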
Backtracking
- while filling the table, save which of the three possibilities we chose
- at the end, by following the arrows back, we get an alignment
- a match when we go diagonally up
- a dash (gap) when we go horizontally or vertically
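The arrow-following step might look like this (the `back` pointer encoding and tie-breaking below are my own illustrative choices, not from the notes):

```python
def align(x, y):
    # Fill the edit-distance table and record, for each cell, which of
    # the three possibilities (diagonal, up, left) was chosen.
    m, n = len(x), len(y)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    back = [[None] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0], back[i][0] = i, "up"
    for j in range(1, n + 1):
        D[0][j], back[0][j] = j, "left"
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if x[i - 1] == y[j - 1] else 1
            choices = [(D[i - 1][j - 1] + cost, "diag"),
                       (D[i - 1][j] + 1, "up"),
                       (D[i][j - 1] + 1, "left")]
            D[i][j], back[i][j] = min(choices)
    # Follow the saved arrows from the bottom-right corner to recover an
    # alignment: diagonal = match/mismatch, up/left = a dash (gap).
    ax, ay, i, j = [], [], m, n
    while i > 0 or j > 0:
        move = back[i][j]
        if move == "diag":
            ax.append(x[i - 1]); ay.append(y[j - 1]); i -= 1; j -= 1
        elif move == "up":
            ax.append(x[i - 1]); ay.append("-"); i -= 1
        else:
            ax.append("-"); ay.append(y[j - 1]); j -= 1
    return "".join(reversed(ax)), "".join(reversed(ay))
```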
The full graph of subproblems and their dependencies is called the…
Dependency Graph
Waterfall Ordering
Diagonal approach: fill the table along anti-diagonals.
This allows for parallel programming, since cells on the same anti-diagonal are independent of each other.
How you order things can be important.
For more information about waterfall ordering, see Marc Snir (Illinois).
Ordering has to deal with memory, too.
The way the table was created is not the only option: when computing a cell, we only need the previous row, not the older rows, so to find the edit distance we can keep just the previous row.
Doing one row at a time leaves the asymptotic runtime unchanged while reducing memory to a single row.
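A sketch of the one-row-at-a-time version (assuming only the distance is needed; backtracking for the alignment itself would still require the full table):

```python
def edit_distance_one_row(x, y):
    # Keep only the previous row of the table: each new row depends only
    # on the row directly above it, so older rows can be dropped.
    prev = list(range(len(y) + 1))
    for i, cx in enumerate(x, start=1):
        curr = [i]
        for j, cy in enumerate(y, start=1):
            cost = 0 if cx == cy else 1
            curr.append(min(prev[j - 1] + cost,   # match/mismatch
                            prev[j] + 1,          # gap
                            curr[j - 1] + 1))     # gap
        prev = curr
    return prev[-1]
```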
Memoization
Dynamic programming is typically described in terms of multidimensional tables or arrays that are filled from one side to the other. The table provides both the order and the memory. But it need not be that way.
Memoization: dynamic programming can be implemented recursively.
- use recursive code, but save solutions to subproblems in memory, and look up the memory before making a recursive call
- sometimes uses a data structure other than an array (e.g. a dictionary)
- the order is defined by a bottom-up traversal of the subproblem DAG
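A minimal memoized sketch, reusing Fibonacci with a dictionary as the memory:

```python
memo = {}

def fib_memo(n):
    # Before doing a subproblem, check the memory; here a dictionary
    # serves as the memory instead of an array.
    if n <= 2:
        return 1
    if n not in memo:
        memo[n] = fib_memo(n - 1) + fib_memo(n - 2)
    return memo[n]
```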
Local vs Global Alignment
Global Alignment:
- penalize gaps with a parameter beta, related to the probability of having an insertion or deletion
- every time you add a gap, you add beta to the cost
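One way this could be sketched (the default beta and mismatch values below are illustrative assumptions, not from the notes):

```python
def global_alignment_cost(x, y, beta=2.0, mismatch=1.0):
    # Same table as edit distance, but every gap adds beta to the cost
    # instead of 1.
    m, n = len(x), len(y)
    D = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = i * beta
    for j in range(1, n + 1):
        D[0][j] = j * beta
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0.0 if x[i - 1] == y[j - 1] else mismatch
            D[i][j] = min(D[i - 1][j - 1] + cost,  # match/mismatch
                          D[i - 1][j] + beta,      # gap adds beta
                          D[i][j - 1] + beta)      # gap adds beta
    return D[m][n]
```

With beta = mismatch = 1 this reduces to plain edit distance.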
Unrooted tree
An undirected graph in which any two nodes are connected by exactly one path.
Rooted Tree
Designating one node as the root and orienting all edges away from the root gives a directed graph that represents a rooted tree.
Many problems have obvious, easy recursive solutions:
- the solution to the full problem can be formulated as a function of solutions to a set of subproblems, which depend on smaller subproblems
- boundary conditions (base cases) handle the smallest subproblems
A linearization of a partial order is any full (total) order that is compatible with that partial order.
A child should come before its parent.
Save results in memory to reduce recursive computation.
Before doing a subproblem, check the memory.
If you know your subproblems and their dependencies ahead of time, then you can design your program to be efficient.
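The dependency idea can be sketched as a linearization of the subproblem DAG, children before parents (the `evaluate` helper and the Fibonacci-shaped DAG in the example are illustrative, not from the notes):

```python
def evaluate(dag, compute):
    # dag: node -> list of child subproblems it depends on.
    # Linearize so every child comes before its parent, then fill the
    # memory in that order; each step finds its dependencies solved.
    order, seen = [], set()

    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for child in dag[node]:
            visit(child)
        order.append(node)  # post-order: children first

    for node in dag:
        visit(node)
    memory = {}
    for node in order:
        memory[node] = compute(node, [memory[c] for c in dag[node]])
    return memory
```

For example, Fibonacci as a DAG: `evaluate({1: [], 2: [], 3: [2, 1], 4: [3, 2], 5: [4, 3]}, lambda n, deps: sum(deps) if deps else 1)` fills the memory bottom-up.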