Markov chains Flashcards
1
Q
What is a Markov Chain? What does it mean? How can we use it?
A
-a Markov chain is a stochastic process in which the probability of the next state depends only on the current state, not on the rest of the history
-it means the process is "memoryless": an initial distribution and a transition matrix fully determine its behavior
-we can use it to model and simulate systems that jump between states over time (weather, random walks, queues, etc.)
2
Q
How do you read this probability mass function:
P(X = x)
A
-"the probability that the random variable X takes the value x"
3
Q
A Markov *chain* evolves in which kind of time (discrete or continuous)?
A
discrete
4
Q
Markov *processes* evolve in which kind of time (discrete or continuous)?
A
continuous
5
Q
What does the Markov Property say?
A
-the probability of the next state depends only on the current state, not on the earlier history
-so all you need to know are the initial distribution and the transition matrix
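The two points above can be sketched in code: a minimal simulation of a chain driven only by an initial distribution and a transition matrix. The two-state chain and all numbers here are invented for illustration, not taken from the cards.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state chain: phi is the initial distribution,
# P[i, j] is the probability of moving from state i to state j.
phi = np.array([0.5, 0.5])
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def simulate(phi, P, n_steps):
    """Draw one path X_0, ..., X_n using only phi and P (the Markov property)."""
    state = rng.choice(len(phi), p=phi)         # X_0 ~ phi
    path = [state]
    for _ in range(n_steps):
        state = rng.choice(len(P), p=P[state])  # next state depends only on the current one
        path.append(state)
    return path

print(simulate(phi, P, 10))
```

Note that `simulate` never looks at earlier entries of `path`: the current `state` is all it needs, which is exactly the Markov property.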
6
Q
What is phi?
A
phi is the probability mass function of the initial distribution: phi(i) = P(X_0 = i)
7
Q
What are the transition probabilities?
A
-the probabilities p(i, j) = P(X_{n+1} = j | X_n = i) of moving from state i to state j in one step
-collected together they form the transition matrix, each row of which sums to 1
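As a small illustration (the numbers below are invented), transition probabilities can be arranged as a matrix whose entry (i, j) is the one-step probability of moving from state i to state j; each row is then a full conditional distribution.

```python
import numpy as np

# Hypothetical two-state transition matrix:
# P[i, j] = P(X_{n+1} = j | X_n = i)
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])

# Each row is a conditional distribution, so each row sums to 1.
print(P.sum(axis=1))   # -> [1. 1.]
print(P[0, 1])         # probability of moving from state 0 to state 1 -> 0.3
```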
8
Q
What does it mean for a Markov chain to be time homogeneous? Mathematically? What does this allow us to do?
A
-changing states at time n has the same probability as changing states at time 0
-P(X_{n+1} = j | X_n = i) = P(X_1 = j | X_0 = i) for all n
-the time at which you change states does not matter, so one transition matrix describes every step
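One thing time homogeneity buys us, sketched with invented numbers: since the same matrix P governs every step, the distribution of X_n is just the initial distribution multiplied by the n-th matrix power of P.

```python
import numpy as np

# Hypothetical chain: start in state 0 with certainty.
phi = np.array([1.0, 0.0])
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Time homogeneity: the same P applies at every step,
# so the distribution of X_5 is phi @ P^5.
dist = phi @ np.linalg.matrix_power(P, 5)
print(dist)
print(dist.sum())   # still a probability distribution: sums (numerically) to 1
```

Without time homogeneity we would need a different matrix P_n for each step, and this single-matrix-power shortcut would not exist.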