Markov Chains Flashcards
Define a state and state-space.
When do we call a transition matrix a stochastic matrix?
What does the entry pij represent?
The probability of transitioning from state i to state j in one time step.
Define a Markov chain.
What shorthand do we use for a Markov chain?
What is one way to think of a Markov Chain?
Intuitively, a Markov chain is a random process in which only the current state influences where the chain goes next.
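The intuition above can be sketched in code. This is a minimal illustrative simulation (the states and probabilities are assumptions, not from the cards): each step samples the next state using only the current state's row of transition probabilities.

```python
import random

# Assumed two-state example: rows of P give transition probabilities
# out of each state.
P = {
    "A": {"A": 0.7, "B": 0.3},  # probabilities of moving out of state A
    "B": {"A": 0.4, "B": 0.6},  # probabilities of moving out of state B
}

def step(state, rng):
    """Pick the next state using only the current state's row of P."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

rng = random.Random(0)
path = ["A"]
for _ in range(10):
    path.append(step(path[-1], rng))
print(path)
```

Note that `step` never looks at earlier states in `path`; that forgetfulness is exactly the Markov property.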
What is another way to write property 1 in the following?
X0 has probability distribution defined by λ where λ = (λi : i ∈ I)
What is another way to write property 2 in the following?
The probability of a future event conditioned on the past and present is equal to the probability of the future event conditioned on the present alone.
Finish the following theorem.
Prove the following theorem.
What is another way to write: the future state of a Markov chain is only dependent on its current state?
What is the Markov Property theorem?
Prove the following theorem.
What is a stochastic matrix?
One whose row sums are equal to 1 and whose entries are non-negative.
What do the following probabilities equal when P is a stochastic matrix that generates a Markov chain with entries: [[1-α, α], [β, 1-β]]?
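For this two-state chain the standard closed form (when α + β > 0) is p11(n) = β/(α+β) + α/(α+β)·(1−α−β)^n. The sketch below checks that formula numerically against direct matrix powers; the particular values of α and β are chosen for illustration only.

```python
def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

a, b = 0.3, 0.5  # illustrative values of alpha and beta
P = [[1 - a, a], [b, 1 - b]]

Pn = [[1.0, 0.0], [0.0, 1.0]]  # identity matrix = P^0
for n in range(6):
    # Closed form for the (1,1) entry of P^n.
    closed = b / (a + b) + a / (a + b) * (1 - a - b) ** n
    assert abs(Pn[0][0] - closed) < 1e-12, (n, Pn[0][0], closed)
    Pn = matmul(Pn, P)
print("closed form matches P^n for n = 0..5")
```

As n grows, (1−α−β)^n shrinks, so p11(n) converges to β/(α+β), the stationary probability of state 1.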
Finish the following theorem.
Prove the following theorem.
What does pij(n) stand for?
The n-step transition probability from state i to state j, i.e. the (i, j) entry of P^n.