Markov chains Flashcards

1
Q

What is a Markov Chain? What does it mean? How can we use it?

A

A Markov chain is a sequence of random variables X0, X1, X2, … in which the next state depends only on the current state. It means the process is "memoryless": the history before the present state is irrelevant. We can use it to model systems that jump between states randomly over time (weather, queues, random walks) and to compute their behaviour from the initial distribution and the transition matrix.

2
Q

How do you read this probability mass function:

P(X = x)

A

Read it as "the probability that the random variable X takes the value x"; capital X is the random variable, lowercase x is a particular value it can take.

3
Q

When is a stochastic process called a "chain"?

A

When it is discrete: a chain moves between its states in discrete time steps.

4
Q

When do we speak of a "process" rather than a chain?

A

When it is continuous: time (and possibly the state space) varies continuously rather than in discrete steps.

5
Q

What does the Markov Property say?

A

-the next state's probability depends only on the current state, not on the rest of the history: P(Xn+1 = j | Xn = i, Xn-1, …, X0) = P(Xn+1 = j | Xn = i)

-consequently, all you need to know to specify the chain is the initial probability distribution and the transition matrix
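
Both bullets can be seen in a short simulation: a whole path is generated from nothing more than an initial distribution and a transition matrix, and each step looks only at the current state. A minimal sketch; the two-state matrix in the example is made up for illustration:

```python
import random

def simulate_chain(phi, P, n_steps, seed=0):
    """Simulate a time-homogeneous Markov chain.

    phi : initial distribution, phi[i] = P(X0 = i)
    P   : transition matrix, P[i][j] = P(next = j | current = i)
    """
    rng = random.Random(seed)

    def sample(probs):
        # draw a state index from a discrete distribution
        u, acc = rng.random(), 0.0
        for state, p in enumerate(probs):
            acc += p
            if u < acc:
                return state
        return len(probs) - 1

    x = sample(phi)        # X0 is drawn from phi
    path = [x]
    for _ in range(n_steps):
        x = sample(P[x])   # the next state depends only on the current state x
        path.append(x)
    return path

# hypothetical two-state example
path = simulate_chain([0.5, 0.5], [[0.9, 0.1], [0.2, 0.8]], n_steps=10)
```

Nothing else about the past is stored anywhere: `P[x]` is indexed by the current state alone, which is exactly the Markov property.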

6
Q

What is phi?

A

phi is the probability mass function of the initial state: phi(i) = P(X0 = i), the distribution the chain starts from

7
Q

What are the transition probabilities?

A

p(i, j) = P(Xn+1 = j | Xn = i): the probability of moving from state i to state j in one step. Collecting them gives the transition matrix P, in which every row sums to 1.

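Since the chain is fully determined by phi and the transition matrix (card 5), the distribution of Xn can be computed by pushing phi through the matrix n times. A minimal sketch with a made-up two-state matrix:

```python
def step_distribution(phi, P, n):
    """Distribution of Xn: start from phi and apply the transition matrix n times.

    One step: mu_next[j] = sum_i mu[i] * P[i][j]
    """
    mu = list(phi)
    for _ in range(n):
        mu = [sum(mu[i] * P[i][j] for i in range(len(P)))
              for j in range(len(mu))]
    return mu

# hypothetical example: start surely in state 0
P = [[0.9, 0.1], [0.2, 0.8]]
mu1 = step_distribution([1.0, 0.0], P, 1)   # one step from state 0
```

After one step the distribution is just the first row of P, because the chain started in state 0 with certainty.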
8
Q

What does it mean for a Markov chain to be time homogeneous? Mathematically? What does this allow us to do?

A

-changing states at time n works the same as changing states at time 0: the transition probabilities do not depend on the time at which the step is taken

-P(Xn+1 = j | Xn = i) = P(X1 = j | X0 = i) for all n

-this allows us to describe the whole chain with a single transition matrix instead of one per time step
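
Time homogeneity can be checked empirically: transition frequencies estimated at different times should agree. A rough sketch with an invented two-state matrix:

```python
import random

# invented two-state transition matrix, applied identically at every time step
P = [[0.9, 0.1], [0.2, 0.8]]
rng = random.Random(42)

def run(n_steps):
    # simulate one path, always starting in state 0
    x, path = 0, [0]
    for _ in range(n_steps):
        x = 0 if rng.random() < P[x][0] else 1
        path.append(x)
    return path

paths = [run(6) for _ in range(20000)]

def estimate(n):
    # sample estimate of P(Xn+1 = 1 | Xn = 0)
    cond = [p for p in paths if p[n] == 0]
    return sum(p[n + 1] for p in cond) / len(cond)

# estimate(0) and estimate(5) should both be close to P[0][1] = 0.1
```

Because the same matrix governs every step, the estimate at time 0 and at time 5 converge to the same value; for a non-homogeneous chain they could differ.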
