Markov Chains Flashcards
What is a Markov Chain?
A stochastic process where the future state depends only on the current state, not on the sequence of past states.
What is the Markov property?
The principle that the future state of a process is independent of past states given the present state.
What are the components of a Markov Chain?
States, transition probabilities, and an initial state distribution.
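These three components are enough to simulate a chain. A minimal sketch, using a hypothetical 2-state weather chain whose states, transition probabilities, and initial distribution are illustrative choices, not taken from the cards:

```python
import random

# Illustrative 2-state chain (the numbers are assumptions for the example).
states = ["sunny", "rainy"]
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}
initial = {"sunny": 0.5, "rainy": 0.5}

def simulate(n_steps, seed=0):
    """Draw a path of length n_steps from the chain."""
    rng = random.Random(seed)
    # Sample the first state from the initial distribution.
    state = rng.choices(list(initial), weights=list(initial.values()))[0]
    path = [state]
    for _ in range(n_steps - 1):
        probs = transitions[state]
        # The next state depends only on the current state (Markov property).
        state = rng.choices(list(probs), weights=list(probs.values()))[0]
        path.append(state)
    return path

print(simulate(5))
```

Note that the simulation never looks at `path` history when choosing the next state; that is exactly the Markov property in code.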
What is a transition matrix?
A square matrix where entry (i, j) gives the probability of moving from state i to state j in one step.
What are the properties of a transition matrix?
Rows sum to 1, entries are non-negative, and dimensions match the number of states.
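The three properties above are easy to check mechanically. A small sketch (the example matrix is illustrative):

```python
def is_stochastic(P, tol=1e-9):
    """Check the row-stochastic properties: square shape,
    non-negative entries, and each row summing to 1."""
    n = len(P)
    if any(len(row) != n for row in P):
        return False  # not square
    for row in P:
        if any(p < 0 for p in row):
            return False  # negative entry
        if abs(sum(row) - 1.0) > tol:
            return False  # row does not sum to 1
    return True

# Illustrative 2-state matrix (rows = current state, columns = next state).
P = [[0.8, 0.2],
     [0.4, 0.6]]
print(is_stochastic(P))  # True
```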
What is a stationary distribution?
A probability distribution over states that remains unchanged under the Markov Chain transitions.
How do you find the stationary distribution?
Solve for the left eigenvector of the transition matrix with eigenvalue 1 (the row vector pi satisfying pi P = pi), normalized so its entries sum to 1.
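In practice the stationary distribution can also be approximated by power iteration: start from any distribution and repeatedly apply the chain. A sketch, assuming the chain is ergodic so the iteration converges (the example matrix is illustrative):

```python
def stationary(P, iters=1000):
    """Approximate the stationary distribution by repeatedly
    applying pi <- pi * P (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.8, 0.2],
     [0.4, 0.6]]
print(stationary(P))  # approx [2/3, 1/3]
```

For this matrix the exact answer is pi = (2/3, 1/3), which satisfies pi P = pi.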
What is an absorbing state?
A state that, once entered, cannot be left (its transition probability to itself is 1).
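Absorbing states can be read straight off the transition matrix by checking the diagonal. A sketch with an illustrative 3-state chain:

```python
def absorbing_states(P, tol=1e-9):
    """Return the indices of states whose self-transition
    probability is 1 (i.e. they can never be left)."""
    return [i for i, row in enumerate(P) if abs(row[i] - 1.0) <= tol]

# Illustrative 3-state chain; state 2 is absorbing.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.0, 0.0, 1.0]]
print(absorbing_states(P))  # [2]
```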
What is a recurrent state?
A state that, starting from it, the chain returns to with probability 1.
What is a transient state?
A state that, starting from it, the chain has probability less than 1 of ever returning to; it is visited only finitely many times.
What is an ergodic Markov Chain?
A Markov Chain that is irreducible and aperiodic (and, for infinite state spaces, positive recurrent); such a chain has a unique stationary distribution that it converges to from any starting state.
What does it mean for a Markov Chain to be irreducible?
Every state can be reached from any other state, directly or indirectly.
What is periodicity in a Markov Chain?
A state's period is the greatest common divisor of the possible return times to it; the state is periodic if this period is greater than 1, so returns can only happen at multiples of that period.
What is the steady state of a Markov Chain?
Another term for the stationary distribution where the chain stabilizes over time.
What is a time-homogeneous Markov Chain?
A Markov Chain where transition probabilities are constant over time.
What is the difference between a Markov Chain and a Markov Process?
A Markov Chain has discrete states and time steps; a Markov Process can be continuous in time or space.
How are Markov Chains used in real life?
Examples include weather prediction, queueing theory, stock market analysis, and Google PageRank.
What is the Chapman-Kolmogorov equation?
A formula stating that the (m+n)-step transition probabilities are obtained by composing the m-step and n-step probabilities: as matrices, P^(m+n) = P^(m) P^(n), so the n-step matrix is the n-th power of the 1-step matrix.
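This identity can be verified numerically: the 3-step transition matrix equals the 1-step matrix times the 2-step matrix. A sketch with an illustrative 2-state chain:

```python
def matmul(A, B):
    """Multiply two square matrices stored as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, n):
    """The n-step transition matrix P^n, via repeated multiplication."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = matmul(result, P)
    return result

P = [[0.8, 0.2],
     [0.4, 0.6]]
# Chapman-Kolmogorov: P^(1+2) should equal P^1 * P^2.
lhs = matpow(P, 3)
rhs = matmul(matpow(P, 1), matpow(P, 2))
print(all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
          for i in range(2) for j in range(2)))  # True
```

The same `matpow` helper also answers the n-step transition probability question below: entry (i, j) of P^n is the probability of going from state i to state j in exactly n steps.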
What is the limiting distribution of a Markov Chain?
The distribution of states as the number of transitions approaches infinity.
What is the relationship between eigenvalues and Markov Chains?
The stationary distribution is the left eigenvector for eigenvalue 1, and the magnitude of the second-largest eigenvalue governs how quickly the chain converges to it.
What is a hidden Markov model (HMM)?
A statistical model in which an underlying Markov chain of hidden states is not observed directly; instead, each hidden state emits an observable output, and inference works backward from the observations.
What is the difference between an absorbing and a recurrent state?
An absorbing state is a special case of a recurrent state that can never be left once entered; a non-absorbing recurrent state can be left but is revisited with probability 1.
What is an n-step transition probability?
The probability of transitioning from one state to another in exactly n steps.
How do you determine if a Markov Chain is aperiodic?
Compute each state's period (the gcd of its possible return times) and check that every period equals 1; in an irreducible chain it suffices to check a single state.
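For a small chain the period can be computed directly from matrix powers: collect the step counts n at which a return to the state is possible (P^n has a positive diagonal entry) and take their gcd. A sketch suitable only for small chains, with illustrative matrices:

```python
from math import gcd

def matmul(A, B):
    """Multiply two square matrices stored as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, state, max_steps=50):
    """gcd of the return times n <= max_steps with P^n[state][state] > 0.
    A result of 1 means the state is aperiodic."""
    g = 0
    power = P
    for n in range(1, max_steps + 1):
        if power[state][state] > 0:
            g = gcd(g, n)
        power = matmul(power, P)
    return g

# A deterministic 2-cycle: returns happen only at even steps, so period 2.
cycle = [[0.0, 1.0],
         [1.0, 0.0]]
print(period(cycle, 0))  # 2

# Adding a self-loop allows a return in 1 step, making the state aperiodic.
lazy = [[0.5, 0.5],
        [1.0, 0.0]]
print(period(lazy, 0))  # 1
```

The self-loop trick in the second example is the standard way to make a chain aperiodic without changing its stationary distribution much.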