Chapter 3 Flashcards
Markov process
Any stochastic process satisfying the Markov property: given the present state, the future of the process is independent of its past.
Markov chain
A Markov process in discrete time and with a discrete state space.
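A Markov chain can be illustrated by sampling each step from the row of a transition matrix indexed by the current state. This is a minimal sketch with a hypothetical 3-state matrix P (the matrix values are made up for illustration):

```python
import random

# Hypothetical 3-state transition matrix; each row sums to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def step(state, p=P):
    """Sample the next state from row `state` of the transition matrix.
    The distribution depends only on the current state (Markov property)."""
    u, cum = random.random(), 0.0
    for j, pij in enumerate(p[state]):
        cum += pij
        if u < cum:
            return j
    return len(p[state]) - 1  # guard against floating-point round-off

def simulate(start, n, p=P):
    """Simulate n steps of the chain starting from `start`."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], p))
    return path

print(simulate(0, 10))
```

Because each step consults only the current state, the discrete time index and discrete state space in the definition above map directly onto the loop counter and the row index.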
Irreducibility
A Markov chain is said to be irreducible if any state j can be reached from any other state i in a finite number of steps with positive probability.
Period of a state
State i is said to be periodic with period d > 1 if a return to i is possible only in a number of steps that is a multiple of d.
Aperiodic state
One that is not periodic.
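The period of a state can be computed as the gcd of all step counts n for which a return to the state is possible, i.e. for which the (i, i) entry of the n-step transition matrix is positive. A sketch, checked only up to an arbitrary cutoff of 20 steps:

```python
from math import gcd

def period(P, i, max_steps=20):
    """gcd of all n <= max_steps with (P^n)[i][i] > 0.
    Returns 0 if no return is possible within max_steps."""
    n_states = len(P)
    cur = [row[:] for row in P]  # cur holds P^n
    d = 0
    for n in range(1, max_steps + 1):
        if cur[i][i] > 0:
            d = gcd(d, n)
        # cur = cur @ P (plain-Python matrix multiply)
        nxt = [[0.0] * n_states for _ in range(n_states)]
        for a in range(n_states):
            for b in range(n_states):
                if cur[a][b] > 0:
                    for c in range(n_states):
                        nxt[a][c] += cur[a][b] * P[b][c]
        cur = nxt
    return d

# A chain that alternates deterministically between two states: period 2.
P_alt = [[0.0, 1.0], [1.0, 0.0]]
print(period(P_alt, 0))  # → 2

# A chain whose states have self-loops is aperiodic: period 1.
P_loop = [[0.5, 0.5], [0.5, 0.5]]
print(period(P_loop, 0))  # → 1
```

The deterministic alternating chain can only return to its start in an even number of steps, hence period 2; a self-loop makes a return possible in 1 step, which forces the gcd down to 1.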
State the 4 assumptions made by the Markov jump chain
The holding time in each state is exponentially distributed.
The parameter of this exponential distribution varies only by state i, so that the holding-time distribution is independent of anything that happened prior to the arrival in the current state i.
The transition intensities from each state do not depend on time.
The destination of the jump on leaving state i is independent of the holding time, and of anything that happened prior to the current arrival in state i.
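The four assumptions above can be sketched as a simulation: an exponential holding time whose rate depends only on the current state, followed by a jump whose destination is drawn independently of that holding time. The rates and jump probabilities below are hypothetical, chosen only for illustration:

```python
import random

# Hypothetical 3-state jump chain.
rates = [1.0, 2.0, 0.5]   # exponential holding-time parameter for each state i
jump = [                   # destination probabilities on leaving state i
    [0.0, 0.7, 0.3],
    [0.5, 0.0, 0.5],
    [0.4, 0.6, 0.0],
]

def simulate_jump_chain(start, t_end, seed=None):
    """Simulate the jump chain up to time t_end; returns (time, state) pairs."""
    rng = random.Random(seed)
    t, state = 0.0, start
    path = [(t, state)]
    while True:
        # Holding time is Exp(rates[state]): depends only on the current
        # state, not on time or on the past (assumptions 1-3).
        hold = rng.expovariate(rates[state])
        if t + hold > t_end:
            break
        t += hold
        # Destination drawn independently of the holding time (assumption 4).
        u, cum = rng.random(), 0.0
        for j, p in enumerate(jump[state]):
            cum += p
            if u < cum:
                state = j
                break
        path.append((t, state))
    return path

print(simulate_jump_chain(0, 10.0, seed=1))
```

Note how each assumption appears in the code: the holding time and the destination are sampled by two separate, independent random draws, and neither depends on the elapsed time t or on any earlier part of the path.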