Markov Chain & PCA Flashcards
Markov Chain
- Finite number of states
- Transition matrix gives the probability of moving from one state to another.
- "Ability to forget the past": the next state depends only on the current state, not on earlier history.
- An x-th order Markov Chain conditions the next state on the previous x states (including the current one).
Markov Chain 1-step
P(x0, x1) = P(x1|x0) * P(x0)
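
A minimal Python sketch of these two cards. The 2-state transition matrix T and initial distribution p0 are made up for illustration, not taken from the flashcards:

```python
import numpy as np

# Assumed 2-state chain: rows = current state, columns = next state.
T = np.array([[0.9, 0.1],    # P(next | current = 0)
              [0.3, 0.7]])   # P(next | current = 1)
p0 = np.array([0.5, 0.5])    # initial distribution P(x0)

# 1-step sequence probability for x0 = 0, x1 = 1:
# P(x0, x1) = P(x1|x0) * P(x0)
p_seq = T[0, 1] * p0[0]
print(p_seq)   # 0.1 * 0.5 = 0.05

# Marginal distribution after one step: P(x1) = sum_x0 P(x1|x0) * P(x0)
p1 = p0 @ T
print(p1)      # [0.6, 0.4]
```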
Hidden Markov Model
- Current state hidden
- Each state emits a symbol with a given emission probability
- The exact state is never known; it can only be inferred from the emitted output
- The underlying (hidden) states of the HMM form a Markov Chain
HMM 1-step sequence
P(x) = Σ_S P(x|S) * P(S)
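
A sketch of this marginalisation over hidden states, assuming two hidden states with a made-up prior P(S) and made-up emission probabilities P(x|S):

```python
import numpy as np

# Assumed values for illustration only.
P_S = np.array([0.6, 0.4])          # prior over hidden states P(S)
P_x_given_S = np.array([0.2, 0.9])  # probability of emitting symbol x in each state

# P(x) = sum_S P(x|S) * P(S)
P_x = np.sum(P_x_given_S * P_S)
print(P_x)   # 0.2*0.6 + 0.9*0.4 = 0.48
```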
Maximum Likelihood
Estimate the transition matrix T by maximising the probability of the data, P(D;T), as a function of T.
Gives a good estimate: asymptotically consistent and efficient.
Max Likelihood equation
P(D;T) = P(x1) * Π_(i,j) T(i->j)^n(i->j)
n(i->j) is the number of transitions from i to j
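
A sketch of the estimate this likelihood implies: count the transitions n(i->j) in an observed state sequence and normalise each row of the count matrix. The observed sequence here is made up for illustration:

```python
import numpy as np

states = [0, 0, 1, 0, 1, 1, 1, 0]   # assumed observed state sequence
n_states = 2

# Count transitions n(i -> j)
counts = np.zeros((n_states, n_states))
for i, j in zip(states[:-1], states[1:]):
    counts[i, j] += 1

# ML estimate: T(i->j) = n(i->j) / sum_k n(i->k)
T_hat = counts / counts.sum(axis=1, keepdims=True)
print(T_hat)   # [[1/3, 2/3], [1/2, 1/2]] for this sequence
```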
PCA Description
Reduce the dimensionality of the data by keeping only the principal components. Key principle: retain as much of the variance in the data as possible while reducing dimensionality.
PCA Method:
Centre data around origin.
Calculate the covariance matrix.
Take the eigenvectors of the covariance matrix with the largest eigenvalues; these capture the most variance.
Keep the top d eigenvectors such that the sum of their eigenvalues is at least 0.95 of the total variance (see the sketch below).
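
A minimal numpy sketch of this PCA recipe. The data matrix X, the function name, and everything except the 0.95 threshold from the card are illustrative assumptions:

```python
import numpy as np

def pca(X, var_threshold=0.95):
    # 1. Centre the data around the origin.
    Xc = X - X.mean(axis=0)

    # 2. Covariance matrix of the centred data.
    cov = np.cov(Xc, rowvar=False)

    # 3. Eigenvectors / eigenvalues, sorted by decreasing eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: covariance matrix is symmetric
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # 4. Keep the top d eigenvectors whose eigenvalues sum to >= 95% of total variance.
    cum_var = np.cumsum(eigvals) / eigvals.sum()
    d = np.searchsorted(cum_var, var_threshold) + 1

    # Project the centred data onto the top-d principal components.
    return Xc @ eigvecs[:, :d]

X = np.random.randn(100, 5)   # assumed example data
print(pca(X).shape)           # (100, d) with d <= 5
```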
Eigenvector / Eigenvalue
An eigenvector of a matrix is a vector that, when multiplied by the matrix, stays parallel to its original direction; the scale factor is the eigenvalue (A v = λ v).
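
A quick numpy check of the definition A v = λ v; the matrix A is arbitrary, chosen only for illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # assumed example matrix

eigvals, eigvecs = np.linalg.eig(A)
v, lam = eigvecs[:, 0], eigvals[0]

# A @ v points in the same direction as v, scaled by the eigenvalue lam.
print(A @ v)
print(lam * v)   # same vector
```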