Markov Chains in Discrete Time Flashcards

1
Q

Define a stochastic chain

A

Let Xn, n ∈ N0, be random variables defined on a probability space (Ω, F, P) taking values in a discrete set S.
Then the stochastic process X = (Xn : n ∈ N0) is called a stochastic chain (with time domain N0 and state space S).
If Xn = i, we say that the process is in state i at time n.

2
Q

Give the Markov property

A

P(X_(n+1) = j | X0=i0, …, Xn= in) = P(X_(n+1) = j | Xn= in)

3
Q

When is a Markov chain time-homogeneous?

A

If the conditional probabilities P(X_(n+1) = j | Xn = i) do not depend on the time index n

4
Q

What should the row sum of a transition matrix equal?

A

1

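A quick numerical check of this property (a minimal sketch; the matrix P below is a made-up 3-state example):

```python
# Hypothetical 3-state transition matrix: entry P[i][j] = pij.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.0, 0.4, 0.6],
]

def is_stochastic(P, tol=1e-12):
    """True if every entry is a probability and every row sums to 1."""
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

assert is_stochastic(P)
```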
5
Q

State the Chapman-Kolmogorov equations for an (m+n)-step transition

A

pij^(m+n) = P(X_(m+n) = j | X0 = i)
= sum(k in S) P(Xm = k | X0 = i) * P(X_(m+n) = j | Xm = k)
= sum(k in S) pik^(m) * pkj^(n)

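In matrix form the equations say P^(m+n) = P^m P^n, which can be verified numerically; a small sketch with a made-up 2-state chain:

```python
def matmul(A, B):
    """Multiply square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, m):
    """m-step transition matrix P^m, m >= 1."""
    Q = P
    for _ in range(m - 1):
        Q = matmul(Q, P)
    return Q

# Hypothetical 2-state transition matrix.
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Chapman-Kolmogorov: pij^(2+3) = sum(k) pik^(2) * pkj^(3).
lhs = matpow(P, 5)
rhs = matmul(matpow(P, 2), matpow(P, 3))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
```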
6
Q

When is a communication class closed?

A

If it doesn’t allow access to states outside of itself

7
Q

When is a communication class absorbing?

A

If it is closed and consists of a single state i; equivalently, pii = 1

8
Q

When is a Markov chain irreducible?

A

If all states communicate with each other, i.e. S forms a single communication class

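Irreducibility can be checked mechanically: compute which states are reachable from each state through positive-probability transitions. A sketch (the matrices are made-up examples):

```python
from collections import deque

def reachable(P, i):
    """Set of states reachable from i via transitions with positive probability."""
    seen, queue = {i}, deque([i])
    while queue:
        s = queue.popleft()
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                queue.append(t)
    return seen

def is_irreducible(P):
    """All states communicate iff every state can reach every other state."""
    n = len(P)
    return all(reachable(P, i) == set(range(n)) for i in range(n))

assert is_irreducible([[0.9, 0.1], [0.4, 0.6]])
assert not is_irreducible([[1.0, 0.0], [0.5, 0.5]])  # state 0 is absorbing
```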
9
Q

Give the distribution of the stopping time of the first visit to a state

A
Fk(i,j) = pij, if k = 1
Fk(i,j) = sum(h ≠ j) pih * F_(k-1)(h,j), if k ≥ 2
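The recursion translates directly into code; a sketch using a made-up 2-state chain:

```python
def first_passage(P, k, i, j):
    """F_k(i, j) = P(first visit to j happens exactly at step k | X_0 = i)."""
    if k == 1:
        return P[i][j]
    # Step to any h != j first, then reach j for the first time in k-1 more steps.
    return sum(P[i][h] * first_passage(P, k - 1, h, j)
               for h in range(len(P)) if h != j)

P = [[0.9, 0.1],
     [0.4, 0.6]]
f1 = first_passage(P, 1, 0, 1)  # equals P[0][1] = 0.1
f2 = first_passage(P, 2, 0, 1)  # equals 0.9 * 0.1
```

Summing F_k(i,j) over k recovers fij, the probability of ever visiting j from i (here 1, since the chain is irreducible and recurrent).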
10
Q

Give the probability of ever visiting a state

A

fij = pij + sum(h≠j) pih * fhj for all i,j in S

11
Q

Give P(Nj = m | X0 = i), where Nj is the number of visits to state j

A

P(Nj = m | X0 = i) = 1 - fij, if m = 0

= fij * fjj^(m-1) * (1 - fjj), if m ≥ 1

12
Q

When is a state called recurrent?

A

fjj = 1, i.e. starting in j the chain returns to j with probability 1

13
Q

What can be said about other states in a communication class if one state is recurrent?

A

All states in the communication class are recurrent

14
Q

What is P(Xn = j | X0 = i) as n goes to infinity if j is a transient state?

A

P(Xn = j | X0 = i) → 0 as n → ∞

15
Q

When is π a stationary distribution?

A

When π is a probability distribution on S (πi ≥ 0, sum(i) πi = 1) satisfying πP = π

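A stationary distribution can be approximated by repeatedly applying P to a starting distribution; a sketch with a made-up 2-state chain (power iteration converges here because the example is irreducible and aperiodic):

```python
def stationary(P, iters=5000):
    """Approximate pi with pi = pi P by iterating the chain's distribution.
    (Converges here because the example chain is irreducible and aperiodic.)"""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Made-up 2-state chain; solving pi P = pi by hand gives pi = (0.8, 0.2).
P = [[0.9, 0.1],
     [0.4, 0.6]]
pi = stationary(P)
assert abs(pi[0] - 0.8) < 1e-9 and abs(pi[1] - 0.2) < 1e-9
```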
16
Q

Does a transient Markov chain have a stationary distribution?

A

No: if π were stationary, then πj = sum(i) πi * pij^(n) for every n, but pij^(n) → 0 for every transient j, forcing πj = 0 for all j

17
Q

Give mean time of return for a recurrent state

A

mi = E(τi | X0 = i)

where τi = min{ n≥1 : Xn = i }

18
Q

When is a state positive recurrent?

A

mi < ∞

19
Q

How can you conclude positive recurrence without computing mi?

A

If a Markov chain is irreducible and has finite state space S, the chain is positive recurrent

20
Q

How can you obtain the stationary distribution from the mean time of return?

A

πi = 1/mi (for an irreducible, positive recurrent chain this yields the unique stationary distribution)
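The relation can be checked by simulation; a Monte Carlo sketch with a made-up chain whose stationary distribution is (0.8, 0.2), so m0 should be near 1/0.8 = 1.25:

```python
import random

def mean_return_time(P, i, trials=50_000, seed=1):
    """Monte Carlo estimate of m_i = E(tau_i | X_0 = i)."""
    rng = random.Random(seed)
    states = range(len(P))
    total = 0
    for _ in range(trials):
        state, steps = i, 0
        while True:
            state = rng.choices(states, weights=P[state])[0]
            steps += 1
            if state == i:
                break
        total += steps
    return total / trials

P = [[0.9, 0.1],
     [0.4, 0.6]]
m0 = mean_return_time(P, 0)  # should be close to 1/pi_0 = 1.25
```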

21
Q

How can you show aperiodicity?

A

If the j-th column of P^m is strictly positive for some m, it stays strictly positive for P^(m+1); hence pjj^(m) > 0 and pjj^(m+1) > 0, so state j is aperiodic (and so is the whole chain if it is irreducible)
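A sketch of this test (the matrices are made-up examples):

```python
def matmul(A, B):
    """Multiply square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def positive_column_power(P, j, max_power=50):
    """Smallest m with the j-th column of P^m strictly positive, else None."""
    Pm = P
    for m in range(1, max_power + 1):
        if all(row[j] > 0 for row in Pm):
            return m
        Pm = matmul(Pm, P)
    return None

assert positive_column_power([[0.9, 0.1], [0.4, 0.6]], 0) == 1
assert positive_column_power([[0.0, 1.0], [1.0, 0.0]], 0) is None  # period 2
```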

22
Q

When is a Markov chain called ergodic?

A

If it is aperiodic, irreducible and positive recurrent