Markov Chains Flashcards
1
Q
Define irreducible
A
Every state is reachable from every other state in a finite number of steps (with positive probability)
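To make the definition concrete, here is a minimal sketch in Python (the function name `is_irreducible` and the example matrices are my own illustration, not from the cards): it treats the transition matrix as a directed graph and checks that every state can reach every other.

```python
def is_irreducible(P):
    """Check irreducibility: every state reachable from every other state.

    P is a transition matrix given as a list of lists; an edge i -> j
    exists whenever P[i][j] > 0.
    """
    n = len(P)

    def reachable(start):
        # Depth-first search over states reachable from `start`.
        seen = {start}
        stack = [start]
        while stack:
            i = stack.pop()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    stack.append(j)
        return seen

    return all(len(reachable(s)) == n for s in range(n))

# Two-state chain that flips deterministically: irreducible.
P1 = [[0, 1], [1, 0]]
# Chain where state 1 is absorbing: state 0 is unreachable from 1.
P2 = [[0.5, 0.5], [0, 1]]
```

For example, `is_irreducible(P1)` is `True` while `is_irreducible(P2)` is `False`, because the absorbing state in `P2` can never return to state 0.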
2
Q
Define periodicity
A
A state i is said to be periodic with period d > 1 if a return to that state is possible only in a number of steps that is a multiple of d; formally, d is the greatest common divisor of the possible return times
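The gcd characterization can be sketched in Python (the function `period` and the `max_len` cutoff are my own illustration; a finite cutoff only approximates the true gcd over all return times):

```python
from math import gcd

def period(P, i, max_len=50):
    """Period of state i: gcd of all step counts n <= max_len
    for which a return i -> i is possible (P^n[i][i] > 0)."""
    n = len(P)
    d = 0  # gcd(0, x) == x, so d accumulates the gcd of return times
    M = [row[:] for row in P]  # M holds P^step
    for step in range(1, max_len + 1):
        if step > 1:
            # Multiply M by P to advance one more step.
            M = [[sum(M[a][k] * P[k][b] for k in range(n))
                  for b in range(n)] for a in range(n)]
        if M[i][i] > 0:
            d = gcd(d, step)
    return d
```

For a deterministic two-state flip `[[0, 1], [1, 0]]`, returns to state 0 occur only at even steps, so `period(P, 0)` gives 2; any state with a self-loop has period 1.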
3
Q
Define aperiodic
A
A state is said to be aperiodic if its period is d = 1
4
Q
if all the states are aperiodic then..
A
The Markov chain is aperiodic
5
Q
if the Markov chain is irreducible then..
A
All its states have the same period, or all are aperiodic
6
Q
if all the states in an irreducible Markov chain have period d > 1
A
We may describe the chain as periodic with period d
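A self-contained example of such a chain (the helper `state_period`, its `max_len` cutoff, and the 3-state matrix are my own illustration): a deterministic cycle 0 → 1 → 2 → 0 is irreducible, and every state returns only at multiples of 3, so the whole chain is periodic with period d = 3.

```python
from math import gcd

# Deterministic 3-state cycle 0 -> 1 -> 2 -> 0.
P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]

def state_period(P, i, max_len=30):
    """gcd of the step counts n <= max_len with P^n[i][i] > 0."""
    n, d = len(P), 0
    M = [row[:] for row in P]  # M holds P^step
    for step in range(1, max_len + 1):
        if step > 1:
            M = [[sum(M[a][k] * P[k][b] for k in range(n))
                  for b in range(n)] for a in range(n)]
        if M[i][i] > 0:
            d = gcd(d, step)
    return d

periods = [state_period(P, i) for i in range(3)]  # every state has period 3
```

All three states share the same period, as card 5 asserts for irreducible chains, and that common value d = 3 is the period of the chain.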