Discrete-time Markov chains Flashcards
Define what it means that state j is accessible from state i (for i, j ∈ E)
We say that state j is accessible from state i, written i → j, if the chain may ever
visit state j, with positive probability, starting from i. In other words, i → j if there
exists m ≥ 0 such that p_ij(m) > 0.
Define what it means that states i and j communicate (for i, j ∈ E)
States i and j communicate, written i ↔ j, if i → j and j → i, i.e. if there exist
n, m ≥ 0 such that p_ij(n) > 0 and p_ji(m) > 0.
State the Markov condition
P(Xn = j|Xn−1 = i, Xn−2 = xn−2, . . . , X0 = x0) = P(Xn = j|Xn−1 = i)
Define what it means for P to be a stochastic matrix and prove that the transition matrix is a stochastic matrix.
- P has non-negative entries, pij ≥ 0 for all i, j.
- P has row sums equal to 1: sum_j p_ij = 1 for all i.
pg 16 for proof (law of total probability)
Define the n step transition matrix
pij (n) = P(Xm+n = j|Xm = i)
State the Chapman-Kolmogorov equations
p_ij(m + n) = sum_l ( p_il(m) p_lj(n) ), that is, P_{m+n} = P_m P_n, and P_n = P^n. pg 18
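As a quick numerical check — a sketch using a hypothetical 2-state transition matrix with NumPy:

```python
import numpy as np

# Hypothetical 2-state transition matrix; rows sum to 1.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

m, n = 2, 3
P_m = np.linalg.matrix_power(P, m)
P_n = np.linalg.matrix_power(P, n)
P_mn = np.linalg.matrix_power(P, m + n)

# Chapman-Kolmogorov: P_{m+n} = P_m P_n
print(np.allclose(P_mn, P_m @ P_n))  # True
```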
Write the marginal distribution of the chain at time n ∈ N_0
ν^(n)_i = P(Xn = i)
ν^(n) = ν^(0)P_n = ν^(0)P^n
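A minimal sketch of propagating the marginal forward, assuming the same kind of hypothetical 2-state chain:

```python
import numpy as np

P = np.array([[0.7, 0.3],   # hypothetical transition matrix
              [0.4, 0.6]])
nu0 = np.array([1.0, 0.0])  # start deterministically in state 0

n = 5
nu_n = nu0 @ np.linalg.matrix_power(P, n)  # nu^(n) = nu^(0) P^n

print(np.isclose(nu_n.sum(), 1.0))  # True: still a probability distribution
```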
Write out the class properties
Closed / Not closed
Finite / Infinite
pg 38
What is the value of the elements of a stationary distribution corresponding to transient states?
0
pg 53
Define what it means for a vector λ to be a distribution on E
∀ j ∈ E, λj ≥ 0, and Sum_j∈E (λj) = 1
What does it mean for λ to be invariant for P?
λP = λ
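Numerically, such a λ can be found as a left eigenvector of P for eigenvalue 1 — a sketch on a hypothetical 2-state chain:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])  # hypothetical transition matrix

# Left eigenvectors of P are right eigenvectors of P.T.
w, v = np.linalg.eig(P.T)
lam = np.real(v[:, np.argmin(np.abs(w - 1.0))])
lam = lam / lam.sum()  # normalise so it is a distribution

print(np.allclose(lam @ P, lam))  # True: lam is invariant
```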
What is the stationary distribution of a Markov chain?
A distribution π on E that is invariant for P, i.e. πP = π. Equivalently, if ν^(0) = π then ν^(n) = π for all n (the chain is stationary).
Stationary distribution of a doubly stochastic matrix
pi_i = 1/n for all i, where n = |E| (the uniform distribution; requires E finite)
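A quick numerical check on a hypothetical 3×3 doubly stochastic matrix (rows and columns all sum to 1):

```python
import numpy as np

# Hypothetical doubly stochastic matrix: rows and columns each sum to 1.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.25, 0.50],
              [0.25, 0.25, 0.50]])

n = P.shape[0]
pi = np.full(n, 1.0 / n)  # uniform distribution on E

print(np.allclose(pi @ P, pi))  # True: uniform is invariant
```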
When is X time reversible?
Let X be stationary and set Y_n = X_{N-n} for 0 ≤ n ≤ N.
X is time reversible if the transition matrices of X and Y are the same.
Equivalently, iff the detailed balance equations hold: pi_i p_ij = pi_j p_ji for all i, j.
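Detailed balance is easy to check numerically; birth-death chains, for instance, satisfy it. A sketch with a hypothetical 3-state birth-death chain:

```python
import numpy as np

# Hypothetical birth-death chain on {0, 1, 2}; such chains are reversible.
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.6, 0.4]])

# Stationary distribution as left eigenvector for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()

# Detailed balance: the "flux" matrix D[i, j] = pi_i * p_ij is symmetric.
D = pi[:, None] * P
print(np.allclose(D, D.T))  # True
```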
Show by induction that for a discrete Markov chain (Xn)n∈N0, we have
P(Xn+m = xn+m|Xn = xn, . . . , X0 = x0) = P(Xn+m = xn+m|Xn = xn)
PS1 (1)
What’s an effective way to compute p_ij(n) ?
Diagonalize P through eigenvalue decomposition (PS1 e7)
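A sketch of the eigendecomposition approach on a hypothetical 2-state chain (assumes P is diagonalizable):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])  # hypothetical transition matrix

# P = V diag(w) V^{-1}  =>  P^n = V diag(w**n) V^{-1}
w, V = np.linalg.eig(P)
n = 10
Pn = np.real(V @ np.diag(w**n) @ np.linalg.inv(V))

print(np.allclose(Pn, np.linalg.matrix_power(P, n)))  # True
```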
what is f_00(n)?
The probability that the first return to 0, starting from 0, occurs exactly at time n.
Define what it means for a state to be recurrent
P(Xn = j for some n ∈ N|X0 = j) = 1
j ∈ E is recurrent if and only if sum_{n=1}^∞ p_jj(n) = ∞
fjj = 1
Define what it means for a state to be transient
j ∈ E is transient if and only if sum_{n=1}^∞ p_jj(n) < ∞
fjj < 1
pij (n) → 0 as n → ∞
What is a null and what is a positive recurrent state
A recurrent state i ∈ E is called null if µi = ∞ and positive (or non-null) if µi < ∞.
Define the period of a state
d(i) = gcd{ n : p_ii(n) > 0 }
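The period can be computed directly (truncated to a finite horizon). A sketch with a hypothetical deterministic 3-cycle, where every state has period 3:

```python
from math import gcd

import numpy as np

# Hypothetical 3-cycle: 0 -> 1 -> 2 -> 0 deterministically.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

def period(P, i, max_n=20):
    """gcd of the n <= max_n with p_ii(n) > 0 (finite truncation of d(i))."""
    d = 0
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            d = gcd(d, n)  # gcd(0, n) == n, so the first return initialises d
    return d

print(period(P, 0))  # 3
```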
Define the first passage time
T_j = min{n ≥ 1 : X_n = j}
What is the return probability?
f_jj = P(T_j < inf | X_0=j) = sum_{n=1}^inf f_jj(n)
Consider a discrete-time homogeneous Markov chain on a countable state space E. Suppose that the Markov chain is irreducible, has a stationary distribution denoted by π and all states are recurrent. What are the values of the stationary distribution?
πi = µ_i ^{−1}
for all i ∈ E, where µ_i denotes the mean recurrence time for state i
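The relation π_i = 1/µ_i can be illustrated by simulation — a sketch with a hypothetical 2-state chain whose stationary distribution is (4/7, 3/7), so µ_0 = 7/4:

```python
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])  # hypothetical chain; pi = (4/7, 3/7)

def mean_return_time(P, i, trials=20_000):
    """Monte Carlo estimate of mu_i = E[T_i | X_0 = i]."""
    total = 0
    for _ in range(trials):
        state, steps = i, 0
        while True:
            state = rng.choice(len(P), p=P[state])
            steps += 1
            if state == i:
                break
        total += steps
    return total / trials

mu0 = mean_return_time(P, 0)
print(abs(mu0 - 7 / 4) < 0.05)  # True: mu_0 = 1/pi_0
```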
Define the first passage probability
f_ij(n) = P(T_j = n | X_0=i)
What is the mean recurrence time for state i?
µ_i = E[T_i | X_0 = i] = sum_{n=1}^{inf} nf_ii(n)
for i recurrent; µ_i = ∞ if i is transient
Define ergodic state
Positive recurrent and aperiodic
What does it mean to say a class is closed? And irreducible?
Closed: impossible to leave the class (p_ij = 0 whenever i is in the class and j is not).
Irreducible: all states communicate.
What does it mean for a state to be aperiodic?
d(i) = 1, or d(i) is undefined (the chain can never return to i)