Markov Chains Flashcards

1
Q

How do you find two-step probabilities?

A

The two-step transition matrix is P*P = P^2: entry (i, j) of P^2 is the probability of going from state i to state j in exactly two steps. In general, the n-step probabilities are given by P^n.
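A minimal sketch of the matrix product above, using a hypothetical 2-state transition matrix:

```python
import numpy as np

# Hypothetical 2-state chain (e.g. state 0 = sunny, state 1 = rainy).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Two-step transition matrix: entry (i, j) is the probability of
# being in state j two steps after starting in state i.
P2 = P @ P

print(P2)
```

Like any transition matrix, each row of P2 sums to 1.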

2
Q

What is an accessible state?

A

State j is said to be accessible from state i if, starting from state i, there is a positive probability of reaching state j in a finite number of steps. We write i → j.

3
Q

What is a class?

A

The state space S can be divided into classes of states such that all states within a class communicate with each other (i ↔ j, meaning both i → j and j → i) and do not communicate with any state in another class.
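The division into classes can be computed from mutual reachability on the graph of positive-probability transitions. A minimal sketch, using a hypothetical 4-state matrix:

```python
# Hypothetical chain: states 0 and 1 form a closed class,
# state 2 is transient, state 3 is absorbing.
P = [
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.25, 0.25, 0.25, 0.25],
    [0.0, 0.0, 0.0, 1.0],
]

n = len(P)

def reachable(start):
    """States reachable from `start` via positive-probability moves."""
    seen = {start}
    stack = [start]
    while stack:
        i = stack.pop()
        for j in range(n):
            if P[i][j] > 0 and j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

reach = [reachable(i) for i in range(n)]

# i and j communicate iff each is reachable from the other.
classes = []
for i in range(n):
    cls = frozenset(j for j in range(n) if i in reach[j] and j in reach[i])
    if cls not in classes:
        classes.append(cls)

print(classes)  # the classes {0, 1}, {2}, and {3}
```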

4
Q

What is an irreducible Markov chain?

A

If there is only one class, i.e. all states communicate, the Markov chain is called irreducible.
In other words, every state can be reached from every other state.

5
Q

What is an absorbing state?

A

A state i is absorbing if it is impossible to leave once entered, i.e. p_ii = 1.

6
Q

What are recurrent and transient states?

A

For a Markov chain let Ri denote the probability that starting from state i, the process will ever return to state i. We say that state i is
* Recurrent if Ri = 1 (guaranteed to return)
* Transient if Ri < 1 (not guaranteed to return)
If a state is recurrent, in the long run the process will return to the state infinitely many times. For a transient state the process will only return a finite number of times.

  • Transient states are only visited a finite number of times.
  • A finite-state Markov chain has at least one recurrent state.
  • If one state in a class is recurrent, all states in the class are.
  • If one state in a class is transient, all states in the class are.
  • In an irreducible Markov chain, all states are of the same type: either all recurrent or all transient.
  • In a finite irreducible Markov chain, all states are recurrent.
  • Once a Markov chain enters a recurrent class, it will remain in that class forever.
7
Q

What is a positive recurrent Markov chain?

A

A recurrent state i is positive recurrent if the expected time to return to i is finite (and null recurrent if that expectation is infinite). For a Markov chain with finite state space, all recurrent states are positive recurrent.

8
Q

What is the period of a state in a Markov chain?

A

The period of a state i in a Markov chain is the greatest common divisor of all n ≥ 1 for which the chain can return to i in exactly n steps, i.e. gcd{n ≥ 1 : p_ii^(n) > 0}. If the period is 1, the state is called aperiodic. All states in a class have the same period.
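The gcd in the definition can be computed directly for a small chain. A minimal sketch, scanning step counts up to a fixed bound (a heuristic that is fine for small examples); the deterministic 2-cycle below is a hypothetical example:

```python
from math import gcd

import numpy as np

# Hypothetical deterministic 2-cycle: the chain alternates 0 -> 1 -> 0.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

def period(P, i, max_n=50):
    """gcd of all n <= max_n with (P^n)[i, i] > 0."""
    d = 0  # gcd(0, n) == n, so the first return time initializes d
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            d = gcd(d, n)
    return d

print(period(P, 0))  # the 2-cycle returns only at even times: period 2
```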

9
Q

What characteristics does a Markov chain need to have steady state probabilities?

A
  • Irreducible (a single class: all states communicate)
  • Positive recurrent (finite expected return time)
  • Aperiodic (gcd of return times = 1)

For a finite state space, irreducible and aperiodic suffice, since all states are then automatically positive recurrent.

10
Q

How do we find steady state probabilities?

A

Solve Π = P^T * Π (the balance equations) together with the normalization condition Σ_i Π_i = 1.
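This is a linear system: replace one of the balance equations with the normalization condition and solve. A minimal sketch, using a hypothetical 2-state matrix:

```python
import numpy as np

# Hypothetical 2-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

n = len(P)
# (P^T - I) pi = 0, with the last row replaced by sum(pi) = 1.
A = P.T - np.eye(n)
A[-1, :] = 1.0
b = np.zeros(n)
b[-1] = 1.0

pi = np.linalg.solve(A, b)
print(pi)  # steady state probabilities
```

As a sanity check, pi should be unchanged by one more step: pi @ P == pi.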

11
Q

Continuous Markov chains

A
12
Q

How do you prove some process is a Markov chain?

A

Show that the Markov property holds: the next state depends only on the current state, not on the earlier history, i.e.
P(X_{n+1} = j | X_n = i, X_{n-1}, ..., X_0) = P(X_{n+1} = j | X_n = i).

13
Q

What is a birth and death process?

A
  • The process can only move up in steps of one (a "birth"),
  • and can only move down in steps of one (a "death"); otherwise it stays where it is.