Markov Chains Flashcards

1
Q

What is a Markov Chain?

A

A stochastic process in which the probability of the next state depends only on the current state, not on the sequence of past states.

2
Q

What is the Markov property?

A

The principle that, given the present state, the future of a process is conditionally independent of its past states.
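
In symbols, for a discrete-time chain (a standard statement of the property, added for reference):

```latex
P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0)
    = P(X_{n+1} = j \mid X_n = i)
```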

3
Q

What are the components of a Markov Chain?

A

States, transition probabilities, and an initial state distribution.
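
A minimal sketch of the three components in Python, using a made-up two-state weather chain (the state names and probabilities are illustrative assumptions, not part of the card):

```python
import numpy as np

rng = np.random.default_rng(0)

# States of the chain (illustrative two-state weather example).
states = ["sunny", "rainy"]

# Transition probabilities: P[i, j] = probability of moving from
# state i to state j in one step. Each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Initial state distribution: where the chain starts.
pi0 = np.array([1.0, 0.0])  # start in "sunny" with probability 1

# Simulate a few steps by sampling the next state from the current row.
state = rng.choice(len(states), p=pi0)
for _ in range(5):
    print(states[state])
    state = rng.choice(len(states), p=P[state])
```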

4
Q

What is a transition matrix?

A

A square matrix whose (i, j) entry is the probability of transitioning from state i to state j in one step.

5
Q

What are the properties of a transition matrix?

A

Each row sums to 1, all entries are non-negative, and the matrix is n × n for a chain with n states.
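
These properties are easy to verify programmatically; a small sketch (the helper name is hypothetical, not a library function):

```python
import numpy as np

def is_valid_transition_matrix(P, tol=1e-9):
    """Return True if P is square, non-negative, and row-stochastic."""
    P = np.asarray(P, dtype=float)
    return (
        P.ndim == 2
        and P.shape[0] == P.shape[1]               # square: one row/column per state
        and np.all(P >= 0)                         # non-negative entries
        and np.allclose(P.sum(axis=1), 1.0, atol=tol)  # each row sums to 1
    )

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(is_valid_transition_matrix(P))  # True
```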

6
Q

What is a stationary distribution?

A

A probability distribution over states that remains unchanged under the Markov Chain transitions.
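
In symbols, using the row-vector convention (a standard formulation):

```latex
\pi P = \pi, \qquad \sum_i \pi_i = 1, \qquad \pi_i \ge 0
```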

7
Q

How do you find the stationary distribution?

A

Solve πP = π: find the left eigenvector of the transition matrix for eigenvalue 1 (equivalently, an eigenvector of its transpose), then normalize so the entries sum to 1.
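
A sketch with NumPy: left eigenvectors of P are right eigenvectors of its transpose, so take the eigenvector of P.T whose eigenvalue is numerically closest to 1 (the matrix values are illustrative):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Left eigenvectors of P are right eigenvectors of P.T.
eigvals, eigvecs = np.linalg.eig(P.T)

# Pick the eigenvector whose eigenvalue is (numerically) closest to 1.
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()  # normalize so the probabilities sum to 1

print(pi)      # stationary distribution, here [5/6, 1/6]
print(pi @ P)  # equals pi up to floating-point error
```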

8
Q

What is an absorbing state?

A

A state that, once entered, cannot be left (its transition probability to itself is 1).
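
For example, in this illustrative 3-state matrix the third state is absorbing; its row puts probability 1 on itself:

```latex
P = \begin{pmatrix} 0.5 & 0.3 & 0.2 \\ 0.2 & 0.6 & 0.2 \\ 0 & 0 & 1 \end{pmatrix}
```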

9
Q

What is a recurrent state?

A

A state to which the chain, starting from that state, returns with probability 1.

10
Q

What is a transient state?

A

A state for which, starting there, the probability of ever returning is less than 1, so the chain may leave and never come back.

11
Q

What is an ergodic Markov Chain?

A

A Markov Chain that is irreducible and aperiodic; such a chain has a unique stationary distribution and converges to it from any initial distribution.

12
Q

What does it mean for a Markov Chain to be irreducible?

A

Every state can be reached from any other state, directly or indirectly.

13
Q

What is periodicity in a Markov Chain?

A

A state is periodic if returns to it can occur only at multiples of some integer greater than 1; its period is the gcd of all possible return times.
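
A classic illustration (not from the deck): the two-state chain that always switches returns to each state only at even times, so both states have period 2:

```latex
P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},
\qquad \gcd\{2, 4, 6, \ldots\} = 2
```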

14
Q

What is the steady state of a Markov Chain?

A

Another term for the stationary distribution; it is the distribution the chain settles into over time.

15
Q

What is a time-homogeneous Markov Chain?

A

A Markov Chain where transition probabilities are constant over time.

16
Q

What is the difference between a Markov Chain and a Markov Process?

A

A Markov Chain has discrete states and time steps; a Markov Process can be continuous in time or space.

17
Q

How are Markov Chains used in real life?

A

Examples include weather prediction, queueing theory, stock market analysis, and Google PageRank.

18
Q

What is the Chapman-Kolmogorov equation?

A

A formula stating that the (m+n)-step transition probabilities are obtained by composing the m-step and n-step probabilities; in particular, n-step probabilities come from powers of the one-step matrix.
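
The standard statement, in index and matrix form:

```latex
P^{(m+n)}_{ij} = \sum_{k} P^{(m)}_{ik} \, P^{(n)}_{kj}
\qquad \Longleftrightarrow \qquad
P^{(m+n)} = P^{(m)} P^{(n)}
```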

19
Q

What is the limiting distribution of a Markov Chain?

A

The distribution of states as the number of transitions approaches infinity, when that limit exists; for an ergodic chain it equals the stationary distribution.
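
A quick numerical illustration, reusing the 2-state example matrix from earlier cards: raising P to a high power makes every row approach the limiting distribution.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# For an ergodic chain, P^n converges as n grows; every row tends to
# the limiting (= stationary) distribution.
print(np.linalg.matrix_power(P, 50))
# Both rows are approximately [0.8333, 0.1667], i.e. (5/6, 1/6).
```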

20
Q

What is the relationship between eigenvalues and Markov Chains?

A

The stationary distribution is a left eigenvector for eigenvalue 1, and the magnitudes of the other eigenvalues (especially the second largest) determine how quickly the chain converges.
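
A sketch of that spectral picture with the same illustrative matrix (the convergence-rate comment assumes a diagonalizable chain):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

eigvals = np.linalg.eigvals(P)
print(sorted(np.abs(eigvals), reverse=True))  # [1.0, 0.4]

# The distance from the stationary distribution shrinks roughly like
# |lambda_2|**n, so this chain converges geometrically at rate 0.4.
```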

21
Q

What is a hidden Markov model (HMM)?

A

A statistical model where the system being modeled is assumed to follow a Markov process with unobservable states.

22
Q

What is the difference between an absorbing and a recurrent state?

A

An absorbing state can never be left once entered, while a (non-absorbing) recurrent state may be left but is revisited with probability 1.

23
Q

What is an n-step transition probability?

A

The probability of transitioning from one state to another in exactly n steps.
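
Equivalently, the n-step probabilities are the entries of the n-th power of the transition matrix; a short check with NumPy (illustrative values):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

n = 3
Pn = np.linalg.matrix_power(P, n)  # (P^n)[i, j] = n-step probability i -> j
print(Pn[0, 1])                    # P(X_3 = state 1 | X_0 = state 0)
```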

24
Q

How do you determine if a Markov Chain is aperiodic?

A

Check that every state has period 1, i.e. the gcd of its possible return times is 1. In an irreducible chain, one self-loop (some state with a positive probability of staying put) makes the whole chain aperiodic.
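
A brute-force sketch for a small finite chain: estimate each state's period as the gcd of the step counts at which a return is possible (the helper below is illustrative, not a library routine).

```python
import numpy as np
from math import gcd

def state_period(P, i, max_steps=50):
    """Period of state i: gcd of all n <= max_steps with (P^n)[i, i] > 0.
    Returns 0 if no return is seen within max_steps (heuristic cutoff)."""
    d = 0
    Pn = np.eye(len(P))
    for n in range(1, max_steps + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            d = gcd(d, n)
    return d

# Two-state "always switch" chain: every state has period 2, so the
# chain is NOT aperiodic.
flip = np.array([[0.0, 1.0],
                 [1.0, 0.0]])
print([state_period(flip, i) for i in range(2)])  # [2, 2]

# Chain with self-loops: every state has period 1, so it is aperiodic.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(all(state_period(P, i) == 1 for i in range(2)))  # True
```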