Markov Chains Flashcards
1
Q
Define a time-homogeneous chain
A
The transition probabilities pij = P(X(t+1) = j | X(t) = i) are the same for all t
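Illustrative sketch (not part of the original card; Python, with a made-up 2-state matrix P): the same transition matrix is applied at every step, which is exactly what time homogeneity means, pij does not depend on t.

import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.7, 0.3],     # hypothetical transition matrix, rows sum to 1
              [0.4, 0.6]])

state = 0
for t in range(10):
    # the transition distribution P[state] is the same at every time t
    state = rng.choice(2, p=P[state])
    print(t, state)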
2
Q
What is a recurrent state?
A
A state of a Markov chain to which the chain, having started there, will return with probability 1
3
Q
What is an irreducible chain?
A
A chain in which every state j can be reached from every state i (all states communicate)
4
Q
What is a non-null state?
A
A recurrent state whose mean recurrence time is finite (also called positive recurrent); in a chain with a finite state space, every recurrent state is non-null
5
Q
What is a periodic Markov chain?
A
A chain whose states have period greater than 1, i.e. every return to a state takes a number of steps that is a multiple of some integer d > 1
6
Q
Define an ergodic Markov chain
A
A chain is ergodic if it is irreducible, aperiodic, and all its states are non-null recurrent
7
Q
When is a distribution stationary?
A
π = πP, i.e. πj = Σi πi pij for every state j
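Illustrative sketch (not part of the original card; Python, with a made-up 3-state matrix P): one common way to find a stationary distribution is to take the left eigenvector of P for eigenvalue 1 and normalise it so the entries sum to 1.

import numpy as np

P = np.array([[0.9, 0.1, 0.0],   # hypothetical transition matrix, rows sum to 1
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])

# Solve pi = pi P: left eigenvectors of P are eigenvectors of P transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                       # normalise to a probability distribution

print(pi)        # stationary distribution
print(pi @ P)    # should reproduce pi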