Discrete-time Markov chains Flashcards
Define what it means that state j is accessible from state i (for i, j ∈ E)
We say that state j is accessible from state i, written i → j, if the chain, started
from i, may ever visit state j with positive probability. In other words, i → j if
there exists m ≥ 0 such that pij(m) > 0.
Define what it means that states i and j communicate (for i, j ∈ E)
States i and j communicate, written i ↔ j, if i → j and j → i, i.e. if there exist
n, m ≥ 0 such that pij(n) > 0 and pji(m) > 0.
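Accessibility can be checked numerically: j is accessible from i iff (I + P)^(|E|-1) has a positive (i, j) entry. A minimal numpy sketch, where the 3-state matrix P is a hypothetical example (not from the notes):

```python
import numpy as np

# Hypothetical 3-state chain: states 0 and 1 communicate, state 2 is absorbing.
P = np.array([[0.5, 0.5, 0.0],
              [0.4, 0.4, 0.2],
              [0.0, 0.0, 1.0]])

n = P.shape[0]
# j is accessible from i iff pij(m) > 0 for some m in 0..n-1;
# equivalently, (I + P)^(n-1) has a positive (i, j) entry.
reach = np.linalg.matrix_power(np.eye(n) + P, n - 1) > 0

def communicates(i, j):
    # i <-> j means i -> j and j -> i.
    return bool(reach[i, j] and reach[j, i])
```

Here `communicates(0, 1)` is True, while `communicates(0, 2)` is False: state 2 is accessible from 0, but the absorbing state 2 cannot reach 0.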
State the Markov condition
P(Xn = j|Xn−1 = i, Xn−2 = xn−2, . . . , X0 = x0) = P(Xn = j|Xn−1 = i)
Define what it means for P to be a stochastic matrix and prove that the transition matrix is a stochastic matrix.
- P has non-negative entries, pij ≥ 0 for all i, j.
- P has row sums equal to 1: Sum_j∈E pij = 1 for all i.
Proof sketch (law of total probability): pij = P(X1 = j|X0 = i) ≥ 0, and
Sum_j∈E pij = P(X1 ∈ E|X0 = i) = 1. pg 16
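Both stochastic-matrix conditions are easy to verify numerically. A small sketch, where the 2-state matrix P is a hypothetical example:

```python
import numpy as np

# Hypothetical 2-state chain, used only to illustrate the two conditions.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Non-negative entries and unit row sums.
assert (P >= 0).all()
assert np.allclose(P.sum(axis=1), 1.0)
```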
Define the n step transition matrix
pij(n) = P(Xm+n = j|Xm = i), which does not depend on m by time-homogeneity.
State the Chapman-Kolmogorov equations
pij(m + n) = Sum_l∈E pil(m) plj(n), i.e. P_{m+n} = P_m P_n, and hence P_n = P^n. pg 18
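The matrix form of Chapman-Kolmogorov can be checked directly. A sketch using the same kind of hypothetical 2-state chain:

```python
import numpy as np

# Hypothetical 2-state chain to illustrate P_{m+n} = P_m P_n numerically.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

m, n = 2, 3
Pm = np.linalg.matrix_power(P, m)
Pn = np.linalg.matrix_power(P, n)
Pmn = np.linalg.matrix_power(P, m + n)

# Chapman-Kolmogorov in matrix form, entrywise.
assert np.allclose(Pmn, Pm @ Pn)
```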
Write the marginal distribution of the chain at time n ∈ N_0
ν^(n)_i = P(Xn = i)
ν^(n) = ν^(0)P_n = ν^(0)P^n
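The marginal formula ν^(n) = ν^(0)P^n is a single vector-matrix product. A sketch, where the chain P and the initial distribution are hypothetical:

```python
import numpy as np

# Hypothetical 2-state chain and initial distribution (assumed for illustration).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
nu0 = np.array([1.0, 0.0])  # chain starts in state 0

# Marginal at time n = 3: nu^(3) = nu^(0) P^3.
nu3 = nu0 @ np.linalg.matrix_power(P, 3)
```

The result is again a distribution: its entries are non-negative and sum to 1.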
Write out the class properties
- Closed / Not closed
- Finite / Infinite
pg 38
What is the value of the elements of a stationary distribution corresponding to transient states?
0
pg 53
Define what it means for a vector λ to be a distribution on E
∀ j ∈ E, λj ≥ 0, and Sum_j∈E (λj) = 1
What does it mean for λ to be invariant for P?
λP = λ
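An invariant distribution can be computed as a normalised left eigenvector of P for eigenvalue 1; this eigenvector approach is a sketch, not a method from the notes, and the 2-state matrix P is a hypothetical example:

```python
import numpy as np

# Hypothetical 2-state chain.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Left eigenvector of P for eigenvalue 1 = eigenvector of P transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
lam = np.real(eigvecs[:, idx])
lam = lam / lam.sum()  # normalise so the entries sum to 1

# Invariance: lam P = lam.
assert np.allclose(lam @ P, lam)
```

For this P the invariant distribution works out to λ = (0.75, 0.25).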
What is the stationary distribution of a Markov chain?
A distribution λ on E that is invariant for P, i.e. λP = λ. If X0 ~ λ then Xn ~ λ
for all n ≥ 0, so the marginal distribution of the chain never changes (stationary).
Stationary distribution of a doubly stochastic matrix
pi_i = 1/n for all i, where n = |E| (E must be finite). The uniform distribution is
invariant because the column sums of a doubly stochastic P equal 1.
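A quick numerical check that the uniform distribution is invariant for a doubly stochastic matrix; the 3-state matrix below is a hypothetical example:

```python
import numpy as np

# Hypothetical doubly stochastic matrix: rows AND columns sum to 1.
P = np.array([[0.2, 0.5, 0.3],
              [0.5, 0.3, 0.2],
              [0.3, 0.2, 0.5]])

assert np.allclose(P.sum(axis=1), 1.0)  # row sums (stochastic)
assert np.allclose(P.sum(axis=0), 1.0)  # column sums (doubly stochastic)

# The uniform distribution pi_i = 1/|E| is invariant: pi P = pi.
pi = np.full(3, 1 / 3)
assert np.allclose(pi @ P, pi)
```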
When is X time reversible?
Fix N ∈ N and set Y_n = X_{N-n} for 0 ≤ n ≤ N.
X (stationary) is time reversible if the transition matrices of X and Y are the same.
Equivalently, X is reversible iff the detailed balance equations hold:
pi_i p_ij = pi_j p_ji for all i, j ∈ E.
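Detailed balance says the probability flow i → j equals the flow j → i in stationarity, so the matrix of flows pi_i p_ij must be symmetric. A sketch with a hypothetical 2-state chain and its stationary distribution:

```python
import numpy as np

# Hypothetical 2-state chain and its stationary distribution.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
pi = np.array([0.75, 0.25])  # satisfies pi P = pi for this P

# Detailed balance: pi_i p_ij = pi_j p_ji for all i, j,
# i.e. the flow matrix (pi_i p_ij) is symmetric.
flows = pi[:, None] * P
reversible = bool(np.allclose(flows, flows.T))
```

Any irreducible 2-state chain satisfies detailed balance, so `reversible` is True here; with 3 or more states it can fail (a cyclic chain, for instance).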
Show by induction that for a discrete Markov chain (Xn)n∈N0, we have
P(Xn+m = xn+m|Xn = xn, . . . , X0 = x0) = P(Xn+m = xn+m|Xn = xn)
PS1 (1). Sketch: the case m = 1 is the Markov condition itself; for the inductive
step, condition on the value of X_{n+m-1} via the law of total probability and
apply the Markov condition.