Markov Chain Flashcards
What is a finite state-space Markov chain?
One whose state space is a finite set of discrete values, or "states", that the chain can take
What is a unigram, bigram, trigram and a 4–gram?
Unigram – each term in the chain is drawn independently of all previous terms
Bigram – each term distribution depends only on the previous term
Trigram – each term depends on the previous 2 terms
4–gram – each term depends on the previous 3 terms
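As a quick illustration of the bigram case, the sketch below builds a first-order chain from a made-up toy corpus; the corpus, seed and function names are assumptions for illustration, not part of the cards:

```python
import random

# Sketch of a bigram (first-order) Markov chain built from a toy corpus.
# The corpus and the fixed seed are illustrative assumptions.
corpus = "the cat sat on the mat the cat ran".split()
corpus.append(corpus[0])  # wrap around so every word has a successor

# Map each word to the list of words observed to follow it.
transitions = {}
for prev, nxt in zip(corpus, corpus[1:]):
    transitions.setdefault(prev, []).append(nxt)

def generate(start, length, seed=0):
    """Each next word depends only on the previous one (the bigram property)."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        out.append(rng.choice(transitions[out[-1]]))
    return out

print(" ".join(generate("the", 6)))
```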
What is a Transition Matrix?
A matrix whose entry (i, j) gives the probability of transitioning from state i to state j in one step
i.e. row i contains the probabilities of moving from state i to each of the states.
After n passes of time, what is the transition matrix?
The original transition matrix (for one passing of time) to the power of n
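A minimal numerical check of this card, using an assumed 2-state transition matrix:

```python
import numpy as np

# Sketch: the n-step transition matrix is the one-step matrix raised
# to the n-th power. This 2x2 chain is an illustrative assumption.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

P3 = np.linalg.matrix_power(P, 3)  # transition probabilities after 3 steps

# Each row of P^n is still a probability distribution (rows sum to 1).
print(P3)
print(P3.sum(axis=1))
```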
What is a limiting distribution?
One that the distribution of the chain converges to as time tends to infinity, regardless of the initial state (every row of Pⁿ tends to the same distribution)
What is true for every limiting distribution?
That every limiting distribution is also a stationary distribution (the converse need not hold)
What must be satisfied in order to be a stationary distribution?
The distribution π must be unchanged by one step of the chain: πP = π, i.e. π_j = Σ_i π_i P_ij for every state j
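Since a stationary distribution satisfies πP = π, it can be found numerically as a left eigenvector of P with eigenvalue 1; a sketch with an assumed 2-state matrix:

```python
import numpy as np

# Sketch: a stationary distribution pi satisfies pi P = pi, i.e. pi is a
# left eigenvector of P with eigenvalue 1. The matrix is an assumption.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

eigvals, eigvecs = np.linalg.eig(P.T)   # left eigenvectors of P
k = np.argmin(np.abs(eigvals - 1))      # pick the eigenvalue closest to 1
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()                      # normalise to a probability vector

print(pi)       # stationary distribution
print(pi @ P)   # equals pi: the distribution is unchanged by one step
```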
What is true for every NxN transition matrix, P?
1 is always an eigenvalue
This is proved from the definition of an eigenvalue: each row of P sums to 1, so the all-ones vector v satisfies Pv = v
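A quick numerical illustration of that proof with an assumed 3-state matrix: because each row sums to 1, the all-ones vector is fixed by P, so 1 shows up among the eigenvalues:

```python
import numpy as np

# Sketch: every row-stochastic matrix has eigenvalue 1, because each row
# sums to 1, so the all-ones vector v satisfies P v = v. Matrix assumed.
P = np.array([[0.2, 0.8, 0.0],
              [0.3, 0.3, 0.4],
              [0.5, 0.0, 0.5]])

ones = np.ones(3)
print(P @ ones)              # the all-ones vector comes back unchanged
print(np.linalg.eigvals(P))  # 1 appears among the eigenvalues
```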
When is a process not regular (ergodic) with respect to a stationary distribution?
When it has two (or more) stationary distributions
When the stationary distribution reached depends on the initial condition
When can two states from a markov chain be said to communicate with each other?
When there is a positive-probability path from each state to the other, in both directions
What is a recurrent set?
A set whose members all communicate with each other.
Once the chain is in the set, it is impossible to leave it
What is a transient state?
A state outside every recurrent set
If the chain starts in this state, it will eventually leave it and never return.
What is an irreducible Markov chain?
One where all states communicate with each other
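Irreducibility can be checked mechanically from the zero pattern of the transition matrix: a nonzero entry (i, j) of (I + A)^(N−1), where A marks the positive transitions, means there is some path from i to j. A sketch (the helper name and example matrices are assumptions):

```python
import numpy as np

# Sketch: a chain is irreducible when every state can reach every other.
# (I + A)^(N-1), with A the 0/1 pattern of positive transitions, has a
# nonzero (i, j) entry exactly when some path leads from i to j.
def is_irreducible(P):
    n = len(P)
    reach = np.linalg.matrix_power(np.eye(n) + (P > 0), n - 1)
    return bool(np.all(reach > 0))

ring = np.array([[0, 1, 0],    # 0 -> 1 -> 2 -> 0: all states communicate
                 [0, 0, 1],
                 [1, 0, 0]], dtype=float)
trap = np.array([[1.0, 0.0],   # state 0 is absorbing: 1 -> 0 but not back
                 [0.5, 0.5]])

print(is_irreducible(ring))   # True
print(is_irreducible(trap))   # False
```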
What is the waiting time to get to a state?
One step, plus the probability-weighted average of the waiting times from the states the chain can move to next:
h_i = 1 + Σ_j p_ij h_j, with h = 0 at the target state
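The standard recurrence h_i = 1 + Σ_j p_ij h_j (with h = 0 at the target) is a linear system, so expected waiting (hitting) times can be computed directly; a sketch with an assumed 3-state chain:

```python
import numpy as np

# Sketch: expected hitting times h_i of a target state satisfy
#   h_target = 0,   h_i = 1 + sum_j P[i, j] * h_j   for i != target,
# i.e. (I - Q) h = 1 on the non-target states. Matrix assumed.
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])
target = 2

others = [i for i in range(len(P)) if i != target]
Q = P[np.ix_(others, others)]      # transitions among non-target states
h = np.linalg.solve(np.eye(len(Q)) - Q, np.ones(len(Q)))

print(dict(zip(others, h)))        # expected steps to reach the target
```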
What is the period of a state k?
The greatest common divisor of the set of times n at which the chain can return to state k having started from state k, i.e. gcd{ n ≥ 1 : (Pⁿ)_kk > 0 }.
When is a state periodic?
When a state has a period greater than 1 (a period of 1 means the state is aperiodic)
When is a Markov chain aperiodic?
When all states are aperiodic
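The period of a state can be computed from powers of P, taking the gcd of the times at which a return is possible; a sketch (the helper, bound and matrices are assumptions):

```python
import numpy as np
from math import gcd
from functools import reduce

# Sketch: the period of state k is gcd{ n >= 1 : (P^n)[k, k] > 0 }.
# Checking n up to a small bound is enough for these tiny chains.
def period(P, k, max_n=20):
    return_times = []
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[k, k] > 0:
            return_times.append(n)
    return reduce(gcd, return_times)

# A deterministic 2-cycle: returns to state 0 only at even times.
cycle = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
print(period(cycle, 0))   # 2: the state is periodic

# A self-loop makes a return possible at every time, so period 1.
lazy = np.array([[0.5, 0.5],
                 [1.0, 0.0]])
print(period(lazy, 0))    # 1: the state is aperiodic
```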