Chapter 4 Flashcards
Define a finite Markov chain
The process X is said to be a finite Markov chain if it satisfies the Markov property and its state space is finite. We will usually take
X = {1, …, N} or X = {0, 1, …, N}.
What does p(i,j) denote?
p(i,j) is the one-step transition probability: the probability of moving to state j in one step, given that the chain is currently in state i.
What is p_n(i)?
p_n(i) = P(X_n = i), the probability that the chain is in state i at time n.
Define a stochastic matrix
A stochastic matrix is a square matrix whose rows are probability vectors. A probability vector is a vector whose entries are real numbers between 0 and 1 that sum to 1.
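A minimal sketch of the row-stochastic check (the convention that matches reading the (i,j) entry as the probability of going from i to j; the matrices here are illustrative, not from the text):

```python
def is_stochastic(P, tol=1e-9):
    # Every row must be a probability vector: entries in [0, 1], summing to 1.
    for row in P:
        if any(p < -tol or p > 1 + tol for p in row):
            return False
        if abs(sum(row) - 1.0) > tol:
            return False
    return True

P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
print(is_stochastic(P))                          # True
print(is_stochastic([[0.7, 0.2], [0.4, 0.6]]))   # False: first row sums to 0.9
```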
What is the distribution of a Markov chain determined by?
The distribution of a Markov chain is determined by its initial distribution and its transition probabilities.
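A quick sketch of why the initial distribution and transition probabilities suffice: the distribution at time n is p_n = p_0 P^n, computed here one step at a time as p_{t+1} = p_t P (the two-state chain is a made-up example):

```python
def distribution_at(p0, P, n):
    # Propagate the distribution forward n steps: p_{t+1} = p_t P.
    p = list(p0)
    for _ in range(n):
        p = [sum(p[i] * P[i][j] for i in range(len(P)))
             for j in range(len(P))]
    return p

P = [[0.9, 0.1],
     [0.5, 0.5]]
# Start deterministically in state 0, look two steps ahead.
p2 = distribution_at([1.0, 0.0], P, 2)
print(p2)  # [0.86, 0.14]
```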
How do we interpret the matrix P to the power n?
The (i,j)-th entry of the matrix P^n is the probability of going from state i to state j in exactly n steps.
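A small pure-Python sketch of reading an n-step probability out of P^n (the chain is illustrative):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    # Repeated multiplication starting from the identity; fine for small examples.
    size = len(P)
    result = [[1.0 if i == j else 0.0 for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

P = [[0.9, 0.1],
     [0.5, 0.5]]
P3 = mat_pow(P, 3)
print(P3[0][1])  # probability of going from state 0 to state 1 in 3 steps
```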
What are other names for the limiting distribution?
The equilibrium, invariant, or stationary distribution.
What does it mean when two states each lead to the other?
The states communicate, written i ↔ j. The relation ↔ is an equivalence relation, so it partitions the state space into disjoint sets called communication classes.
Define irreducible and reducible
If there is only one communication class, the chain is said to
be irreducible. Otherwise, it is said to be reducible.
Is a random walk with reflecting/absorbing boundaries irreducible or reducible?
Reflecting: irreducible.
Absorbing: reducible.
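A sketch of the two transition matrices for a simple random walk on {0, …, N} (step +1 or -1 with probability 1/2 each in the interior; the boundary behaviour is the only difference). With reflecting boundaries every state leads to every other, so the chain is irreducible; with absorbing boundaries, states 0 and N each form their own communication class, so the chain is reducible:

```python
def random_walk(N, boundary):
    # Build the (N+1) x (N+1) transition matrix of a simple random walk.
    P = [[0.0] * (N + 1) for _ in range(N + 1)]
    for i in range(1, N):
        P[i][i - 1] = 0.5
        P[i][i + 1] = 0.5
    if boundary == "reflecting":
        # Bounce straight back into the interior.
        P[0][1] = 1.0
        P[N][N - 1] = 1.0
    else:
        # Absorbing: once at a boundary, stay there forever.
        P[0][0] = 1.0
        P[N][N] = 1.0
    return P

print(random_walk(3, "reflecting")[0])  # [0.0, 1.0, 0.0, 0.0]
print(random_walk(3, "absorbing")[0])   # [1.0, 0.0, 0.0, 0.0]
```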
Define a transient state
A state i is transient if there is a non-zero probability that the chain never returns to state i; a transient state is only visited a finite number of times. Equivalently, for a finite chain, a state i is transient iff there exists a state j for which i → j but j does not lead back to i.
What is the opposite of a transient state?
A recurrent state. (A recurrent, aperiodic state is called ergodic.)
Define a recurrent state
Starting from state i, the chain will definitely return to state i.
Hence, recurrent states are visited infinitely often.
A special type of recurrent state is an absorbing state.
What can be said about the periods of states in the same communication class?
If i communicates with j, then i and j have the same period: period is a class property.
Define an aperiodic state
A state is aperiodic if its period is 1.
How do we know when a Markov chain has a unique stationary distribution?
If the Markov chain {X_t} is aperiodic and irreducible, then it
has a unique stationary distribution π and P^n converges to a limit (the Perron-Frobenius theorem).
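A sketch of finding π by power iteration: for an aperiodic, irreducible chain, repeatedly applying p ↦ pP converges to the unique stationary distribution from any starting distribution (the two-state chain below is a made-up example whose stationary distribution works out to (5/6, 1/6)):

```python
def stationary(P, iters=2000):
    # Iterate pi <- pi P from the uniform distribution until it stabilises.
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)
print(pi)  # approx [0.8333..., 0.1666...], i.e. (5/6, 1/6)
```

Checking pi against the defining equation π = πP (and that it sums to 1) is a useful sanity check in practice.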
What does the ergodic theorem tell us?
The proportion of visits to a state i in the first n steps converges to the limiting probability π(i) as n → ∞.
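A simulation sketch of the ergodic theorem: run the chain for many steps and count the fraction of time spent in a state; it should approach π(i) (for the illustrative two-state chain below, π(0) = 5/6 ≈ 0.833):

```python
import random

def visit_fraction(P, state, steps, seed=0):
    # Simulate the chain from state 0 and record the fraction of
    # time steps spent in `state`.
    rng = random.Random(seed)
    x, visits = 0, 0
    for _ in range(steps):
        x = rng.choices(range(len(P)), weights=P[x])[0]
        visits += (x == state)
    return visits / steps

P = [[0.9, 0.1],
     [0.5, 0.5]]
print(visit_fraction(P, 0, 100_000))  # close to pi(0) = 5/6
```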
Define in words the return time of an irreducible, aperiodic Markov chain
The return time to a state j is how long we wait before returning to j: the first time n ≥ 1 at which the chain is back in state j.
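A simulation sketch of return times, using the standard fact that for an irreducible chain the expected return time to j is 1/π(j) (the two-state chain is the same illustrative example as above, with π(0) = 5/6, so the expected return time to state 0 is 6/5 = 1.2):

```python
import random

def mean_return_time(P, j, excursions=20_000, seed=1):
    # Estimate E[return time to j]: start at j, step until we hit j again,
    # and average the excursion lengths.
    rng = random.Random(seed)
    total_steps = 0
    for _ in range(excursions):
        x, steps = j, 0
        while True:
            x = rng.choices(range(len(P)), weights=P[x])[0]
            steps += 1
            if x == j:
                break
        total_steps += steps
    return total_steps / excursions

P = [[0.9, 0.1],
     [0.5, 0.5]]
print(mean_return_time(P, 0))  # approx 1 / pi(0) = 1.2
```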