Markov Chains Flashcards
Define a state and state-space.
The state-space I is the (countable) set of values the process can take; a state is a single element i ∈ I, i.e. one possible value of Xn.
When do we call a transition matrix a stochastic matrix?
When every entry pij is non-negative and each row sums to 1.
What does the entry pij represent?
The probability of transitioning from state i to state j after one time step.
Define a Markov chain.
(Xn)n≥0 is a Markov chain with initial distribution λ and transition matrix P if (i) X0 has distribution λ, and (ii) for n ≥ 0, conditional on Xn = i, Xn+1 has distribution (pij : j ∈ I) and is independent of X0, ..., Xn-1.
What do we say as a shortened version of a Markov chain?
We say that (Xn)n≥0 is Markov(λ, P) for short.
What is one way to think of a Markov Chain?
Intuitively, a Markov chain is a random process where only the current state influences where the chain goes next.
What is another way to write property 1 in the following?

X0 has probability distribution defined by λ where λ = (λi : i ∈ I)
What is another way to write property 2 in the following?

The probability of a future event conditioned on the past and present is equal to the probability of the future event conditioned on the present alone.
Finish the following theorem.


Prove the following theorem.


What is another way to write: the future state of a Markov chain is only dependent on its current state?
P(Xn+1 = j | X0 = i0, X1 = i1, ..., Xn = i) = P(Xn+1 = j | Xn = i) = pij.
What is the Markov Property theorem?
Let (Xn)n≥0 be Markov(λ, P). Then, conditional on Xm = i, the process (Xm+n)n≥0 is Markov(δi, P) and is independent of X0, X1, ..., Xm, where δi is the distribution putting all its mass on i.
Prove the following theorem.


What is a stochastic matrix?
One where the row sums are equal to 1 and all entries are non-negative.
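Not part of the original cards, but a quick way to see the two conditions in action: a minimal Python sketch (the matrix values are made up) that checks non-negative entries and unit row sums.

```python
import numpy as np

# Hypothetical 3-state transition matrix; the values are illustrative only.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.0, 0.4, 0.6]])

def is_stochastic(P, tol=1e-12):
    # Stochastic: every entry non-negative and every row sums to 1.
    return bool(np.all(P >= 0) and np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_stochastic(P))  # True
```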
What do the following probabilities equal when P is a stochastic matrix that generates a Markov chain with entries: [[1-α, α], [β, 1-β]]?


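The sub-questions on this card are missing, so the exact probabilities asked about are unknown. Assuming the card asks for the n-step transition probabilities of this two-state chain, the standard result (for α + β > 0) is p11(n) = β/(α+β) + α/(α+β)(1-α-β)^n and p21(n) = β/(α+β) - β/(α+β)(1-α-β)^n, the other entries following from the row sums. A quick numerical check with illustrative values of α and β:

```python
import numpy as np

alpha, beta = 0.3, 0.1                      # illustrative values, not from the card
P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])
n = 7
Pn = np.linalg.matrix_power(P, n)           # n-step transition matrix

p11 = beta / (alpha + beta) + alpha / (alpha + beta) * (1 - alpha - beta) ** n
p21 = beta / (alpha + beta) - beta / (alpha + beta) * (1 - alpha - beta) ** n

print(np.isclose(Pn[0, 0], p11), np.isclose(Pn[1, 0], p21))  # True True
```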
Finish the following theorem.


Prove the following theorem.


What does pij(n) stand for?
The n-step transition probability from i to j, i.e. pij(n) = P(Xn = j | X0 = i).
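Not from the cards: a sketch (made-up 3-state matrix) illustrating the standard fact that pij(n) is the (i, j) entry of the matrix power P^n, checked against a Monte Carlo estimate of P(Xn = j | X0 = i).

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.5, 0.3, 0.2],      # made-up 3-state transition matrix
              [0.1, 0.6, 0.3],
              [0.0, 0.4, 0.6]])
i, j, n = 0, 2, 5

exact = np.linalg.matrix_power(P, n)[i, j]   # pij(n) as a matrix-power entry

# Monte Carlo estimate: run the chain for n steps from state i, many times.
trials = 50_000
hits = 0
for _ in range(trials):
    x = i
    for _ in range(n):
        x = rng.choice(3, p=P[x])
    hits += (x == j)

print(exact, hits / trials)   # the two values should agree to a couple of decimals
```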
What is the Chapman-Kolmogorov Equations theorem?
pij(m + n) = Σk pik(m) pkj(n) for all states i, j and all m, n ≥ 0; equivalently, P(m + n) = P(m)P(n), so pij(n) is the (i, j) entry of the matrix power P^n.
Prove the following theorem.


If a Markov chain naturally breaks up into separate pieces what are the pieces called?
Communicating classes
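A sketch of how the classes can be found in practice (the matrix is made up): i communicates with j exactly when each state leads to the other, so the communicating classes are the strongly connected components of the directed graph with an edge i → j whenever pij > 0.

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0, 0.0],   # made-up 4-state chain
              [0.2, 0.8, 0.0, 0.0],
              [0.1, 0.0, 0.4, 0.5],
              [0.0, 0.0, 0.3, 0.7]])
n = len(P)

# reach[i, j] is True iff i leads to j (in zero or more steps).
reach = (P > 0) | np.eye(n, dtype=bool)
for k in range(n):                     # Warshall transitive closure
    reach |= reach[:, [k]] & reach[[k], :]

# i communicates with j iff i leads to j and j leads to i.
comm = reach & reach.T
classes = {frozenset(np.flatnonzero(comm[i])) for i in range(n)}
print([sorted(c) for c in classes])    # [[0, 1], [2, 3]]; {0, 1} is closed, {2, 3} is not
```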
Define i leads to j.
We say i leads to j, written i → j, if P(Xn = j for some n ≥ 0 | X0 = i) > 0, i.e. starting from i the chain visits j with positive probability.
Define i communicates with j.
We say i communicates with j, written i ↔ j, if both i → j and j → i.
Finish the following theorem.


Prove the following theorem.


Define when a communicating class is closed.
A class C is closed if i ∈ C and i → j imply j ∈ C; once the chain enters a closed class it can never leave it.
Define when a state i is absorbing.
State i is absorbing if {i} is a closed class, equivalently if pii = 1.
Define when a class is called irreducible.
When the whole state-space I forms a single communicating class; a chain or transition matrix with this property is called irreducible.
Define hitting time.
The hitting time of a subset A of I is the random variable HA = inf{n ≥ 0 : Xn ∈ A}, with the convention that HA = ∞ if the chain never enters A.
What does HA stand for?
The first time that the Markov chain hits the set A.
When A is a closed class, what is hi(A) called?
The absorption probability
Define the mean time taken for (Xn)n≥0 to reach A.
ki(A) = Ei(HA), the expectation of the hitting time HA when the chain is started from i (taken to be ∞ when the chain can fail to reach A).
What is the theorem about the vector of hitting probabilities?
The vector of hitting probabilities hA = (hi(A) : i ∈ I) is the minimal non-negative solution of the system: hi(A) = 1 for i ∈ A, and hi(A) = Σj pij hj(A) for i ∉ A.
What is the theorem about the vector of mean hitting times?
The vector of mean hitting times kA = (ki(A) : i ∈ I) is the minimal non-negative solution of the system: ki(A) = 0 for i ∈ A, and ki(A) = 1 + Σj∉A pij kj(A) for i ∉ A.
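Not part of the cards: a sketch solving both systems for a small made-up example, gambler's ruin on {0, 1, 2, 3, 4} with absorbing barriers at 0 and 4. Here hi({0}) is the ruin probability and ki({0, 4}) the mean time to absorption. Minimality is what forces h4({0}) = 0 (state 4 never reaches 0); the code encodes it by pinning that value, after which both finite linear systems have a unique solution.

```python
import numpy as np

p = 0.4                                # made-up up-step probability
P = np.zeros((5, 5))
P[0, 0] = P[4, 4] = 1.0                # absorbing barriers
for i in (1, 2, 3):
    P[i, i + 1] = p
    P[i, i - 1] = 1 - p

# Hitting probabilities h_i = P_i(hit {0}):  h_i = 1 on A,  h_i = sum_j p_ij h_j off A.
M, b = np.eye(5) - P, np.zeros(5)
M[0], b[0] = 0.0, 1.0; M[0, 0] = 1.0   # i in A = {0}: h_0 = 1
M[4], b[4] = 0.0, 0.0; M[4, 4] = 1.0   # minimality: h_4 = 0
h = np.linalg.solve(M, b)

# Mean hitting times k_i = E_i(time to hit B) for B = {0, 4}:
#   k_i = 0 on B,  k_i = 1 + sum_{j not in B} p_ij k_j off B.
M2, b2 = np.eye(5) - P, np.ones(5)
for i in (0, 4):
    M2[i], b2[i] = 0.0, 0.0
    M2[i, i] = 1.0
k = np.linalg.solve(M2, b2)

print(np.round(h, 4))   # ruin probabilities from each starting state
print(np.round(k, 4))   # expected steps to absorption from each starting state
```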
Prove the following theorem.


Define fij(n).
fij(n) = P(X1 ≠ j, X2 ≠ j, ..., Xn-1 ≠ j, Xn = j | X0 = i): the probability that, starting from i, the chain visits j for the first time at step n.
Define when a state i is recurrent.
State i is recurrent if P(Xn = i for infinitely many n | X0 = i) = 1; if this probability is 0, the state is transient.
What is another way to interpret this definition?
A recurrent state is one to which the chain keeps coming back with probability 1; a transient state is one which the chain eventually leaves for ever.
What is the lemma about the probability, starting from i, of eventually hitting j at least k times?
For k ≥ 1, the probability, starting from i, of hitting j at least k times is fij (fjj)^(k-1), where fij is the probability of ever hitting j from i and fjj is the probability of ever returning to j from j; this follows by applying the strong Markov property at each successive visit to j.
Prove the following Lemma.


What is the theorem about i being transient or recurrent?
The dichotomy theorem: if the probability of returning to i (starting from i) equals 1, then i is recurrent and the sum over n of pii(n) is infinite; if it is less than 1, then i is transient and the sum over n of pii(n) is finite. In particular, every state is either recurrent or transient.
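As a concrete illustration of the criterion (assuming the statement above), take simple random walk on the integers with up-probability p, where p00(2n) = C(2n, n) p^n (1-p)^n. The partial sums of these return probabilities keep growing for p = 1/2 (so 0 is recurrent) but level off for p = 0.6 (so 0 is transient):

```python
def partial_sum(p, terms):
    # Partial sum of p00(2n) = C(2n, n) * (p*(1-p))**n, built term by term
    # via C(2n, n) = C(2n-2, n-1) * (2n)(2n-1)/n^2 to avoid huge binomials.
    term, total = 1.0, 1.0                      # the n = 0 term is 1
    for n in range(1, terms + 1):
        term *= (2 * n) * (2 * n - 1) / (n * n) * p * (1 - p)
        total += term
    return total

for terms in (10, 100, 1_000, 10_000):
    print(terms, round(partial_sum(0.5, terms), 2), round(partial_sum(0.6, terms), 2))
# the p = 0.5 column keeps growing (divergent sum), the p = 0.6 column settles near 5
```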
Prove the following theorem.


Finish the following theorem.


Prove the following theorem.


Finish the following Lemma: Every recurrent class is … ?
Every recurrent class is closed.
Prove the following Lemma.

