Markov Chains Flashcards

1
Q

Define a state and state-space.

A

A state is a possible value of the process; the state-space I is the set of all states the process can take, which we take to be countable.
2
Q

When do we call a transition matrix a stochastic matrix?

A

When every entry pij is non-negative and every row sums to 1, i.e. Σj∈I pij = 1 for all i ∈ I.
3
Q

What does the entry pij represent?

A

The probability of transitioning from state i to state j in one time step.

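As a quick sketch (the matrix below is made up for illustration), pij is just the (i, j) entry of the transition matrix:

```python
# Hypothetical 3-state transition matrix: rows index the current state i,
# columns the next state j, so P[i][j] is p_ij.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.0, 0.4, 0.6],
]

# p_01: the probability of moving from state 0 to state 1 in one step.
p_01 = P[0][1]
print(p_01)  # 0.3
```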
4
Q

Define a Markov chain.

A

(Xn)n≥0 is a Markov chain with initial distribution λ and transition matrix P if:
1. X0 has distribution λ;
2. for n ≥ 0, conditional on Xn = i, Xn+1 has distribution (pij : j ∈ I), independently of X0, ..., Xn-1.
5
Q

What do we say as a shortened version of a Markov Chain?

A

We say (Xn)n≥0 is Markov(λ, P) for short, where λ is the initial distribution and P the transition matrix.
6
Q

What is one way to think of a Markov Chain?

A

Intuitively, a Markov chain is a random process in which only the current state influences where the chain goes next.

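A minimal simulation sketch (transition matrix and seed chosen arbitrarily) makes this concrete: to sample the next state, the only input needed is the current state's row of P.

```python
import random

def simulate(P, start, steps, seed=0):
    """Sample a path of the chain: each step depends only on the current state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        row = P[path[-1]]          # distribution of the next state
        u, acc = rng.random(), 0.0
        for j, p in enumerate(row):
            acc += p
            if u < acc:
                path.append(j)
                break
    return path

P = [[0.9, 0.1],
     [0.5, 0.5]]
print(simulate(P, start=0, steps=10))  # a length-11 path of 0s and 1s
```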
7
Q

What is another way to write property 1 in the following?

A

X0 has probability distribution defined by λ where λ = (λi : i ∈ I)

8
Q

What is another way to write property 2 in the following?

A

The probability of a future event conditioned on the past and present is equal to the probability of the future event conditioned on the present alone.

9
Q

Finish the following theorem.

A
10
Q

Prove the following theorem.

A
11
Q

What is another way to write: the future state of a Markov chain is only dependent on its current state?

A

P(Xn+1 = j | X0 = i0, ..., Xn-1 = in-1, Xn = i) = P(Xn+1 = j | Xn = i) = pij.
12
Q

What is the Markov Property theorem?

A

Let (Xn)n≥0 be Markov(λ, P). Then, conditional on Xm = i, (Xm+n)n≥0 is Markov(δi, P) and is independent of X0, ..., Xm.
13
Q

Prove the following theorem.

A
14
Q

What is a stochastic matrix?

A

One whose row sums are all equal to 1 and whose entries are all non-negative.

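A small sketch of that check (the example matrices are made up for illustration):

```python
def is_stochastic(P, tol=1e-12):
    """True iff every entry is non-negative and every row sums to 1."""
    return all(
        all(x >= 0 for x in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

print(is_stochastic([[0.9, 0.1], [0.4, 0.6]]))    # True
print(is_stochastic([[0.9, 0.2], [0.4, 0.6]]))    # False: first row sums to 1.1
print(is_stochastic([[1.1, -0.1], [0.4, 0.6]]))   # False: negative entry
```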
15
Q

What do the following probabilities equal when P is a stochastic matrix that generates a Markov chain with entries: [[1-α, α], [β, 1-β]]?

A
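The card's specific probabilities aren't reproduced here, but for this two-state P a standard closed form for the n-step (0, 0) entry is β/(α+β) + (α/(α+β))(1-α-β)^n. The sketch below (α and β chosen arbitrarily) checks that formula against a direct matrix power:

```python
def mat_mul(A, B):
    # Plain matrix multiplication on lists of lists.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(P, n):
    # Identity matrix, then repeated multiplication (fine for small n).
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

alpha, beta = 0.3, 0.5   # example parameters, chosen arbitrarily
P = [[1 - alpha, alpha],
     [beta, 1 - beta]]

n = 10
Pn = mat_pow(P, n)

# Closed form for the (0, 0) entry of P^n.
closed = beta / (alpha + beta) + alpha / (alpha + beta) * (1 - alpha - beta) ** n
assert abs(Pn[0][0] - closed) < 1e-12
print(round(Pn[0][0], 6))
```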
16
Q

Finish the following theorem.

A
17
Q

Prove the following theorem.

A
18
Q

What does pij(n) stand for?

A

The n-step transition probability from i to j: pij(n) = Pi(Xn = j), the (i, j) entry of Pn.

19
Q

What is the Chapman-Kolmogorov equations theorem?

A

pij(n + m) = Σk∈I pik(n) pkj(m) for all i, j ∈ I; equivalently, Pn+m = Pn Pm.
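The identity Pn+m = Pn Pm can be verified numerically on a small example (the 3-state matrix below is made up for illustration):

```python
def mat_mul(A, B):
    # Plain matrix multiplication on lists of lists.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(P, n):
    # Identity matrix, then repeated multiplication.
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# Hypothetical 3-state chain.
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]

n, m = 3, 4
lhs = mat_pow(P, n + m)                       # p_ij(n + m)
rhs = mat_mul(mat_pow(P, n), mat_pow(P, m))   # sum_k p_ik(n) p_kj(m)

for i in range(3):
    for j in range(3):
        assert abs(lhs[i][j] - rhs[i][j]) < 1e-12
print("Chapman-Kolmogorov verified on this example")
```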

20
Q

Prove the following theorem.

21
Q

If a Markov chain naturally breaks up into separate pieces what are the pieces called?

A

Communicating classes
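One way to find the communicating classes of a finite chain is mutual reachability: compute which states lead to which (transitive closure), then group states that lead to each other. A sketch, with a made-up example matrix:

```python
def communicating_classes(P):
    n = len(P)
    # reach[i][j]: can the chain get from i to j with positive probability?
    reach = [[i == j or P[i][j] > 0 for j in range(n)] for i in range(n)]
    for k in range(n):          # Floyd-Warshall-style transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    seen, classes = set(), []
    for i in range(n):
        if i not in seen:
            cls = {j for j in range(n) if reach[i][j] and reach[j][i]}
            classes.append(sorted(cls))
            seen |= cls
    return classes

# States 0 and 1 communicate; state 2 is absorbing, a closed class on its own.
P = [[0.5, 0.5, 0.0],
     [0.5, 0.25, 0.25],
     [0.0, 0.0, 1.0]]
print(communicating_classes(P))  # [[0, 1], [2]]
```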

22
Q

Define i leads to j.

A

We say i leads to j, written i → j, if Pi(Xn = j for some n ≥ 0) > 0.

23
Q

Define i communicates with j.

A

We say i communicates with j, written i ↔ j, if both i → j and j → i.

24
Q

Finish the following theorem.

25
Q

Prove the following theorem.

26
Q

Define when a communicating class is closed.

A

A communicating class C is closed if i ∈ C and i → j imply j ∈ C, i.e. the chain can never leave C.

27
Q

Define when a state i is absorbing.

A

State i is absorbing if pii = 1, i.e. {i} is a closed class.

28
Q

Define when a chain is called irreducible.

A

A transition matrix P (or its chain) is irreducible if the whole state-space I is a single communicating class.

29
Q

Define hitting time.

A

The hitting time of a subset A ⊆ I is the random variable HA = inf{n ≥ 0 : Xn ∈ A}, with the convention inf ∅ = ∞.

30
Q

What does HA stand for?

A

The first time that the Markov chain hits the set A.

31
Q

When A is a closed class, what is hi(A) called?

A

The absorption probability

32
Q

Define the mean time taken for (Xn)n≥0 to reach A.

A

kiA = Ei(HA), the expected hitting time of A starting from i (equal to ∞ if the chain may never reach A).

33
Q

What is the theorem about the vector of hitting probabilities?

A

The vector of hitting probabilities hA = (hiA : i ∈ I) is the minimal non-negative solution to:
hiA = 1 for i ∈ A;
hiA = Σj∈I pij hjA for i ∉ A.

34
Q

What is the theorem about the vector of mean hitting times?

A

The vector of mean hitting times kA = (kiA : i ∈ I) is the minimal non-negative solution to:
kiA = 0 for i ∈ A;
kiA = 1 + Σj∉A pij kjA for i ∉ A.
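Both systems can be sketched numerically: iterating the equations from the zero vector converges upward to the minimal non-negative solution (assuming it is finite). The example chain below is a made-up gambler's-ruin on {0, 1, 2, 3} with 0 and 3 absorbing:

```python
def hitting_probabilities(P, A, iters=10_000):
    """Minimal non-negative solution of h_i = 1 (i in A),
    h_i = sum_j p_ij h_j (i not in A), found by iterating from h = 0."""
    n = len(P)
    h = [1.0 if i in A else 0.0 for i in range(n)]
    for _ in range(iters):
        h = [1.0 if i in A else sum(P[i][j] * h[j] for j in range(n))
             for i in range(n)]
    return h

def mean_hitting_times(P, A, iters=10_000):
    """Minimal non-negative solution of k_i = 0 (i in A),
    k_i = 1 + sum_{j not in A} p_ij k_j (i not in A)."""
    n = len(P)
    k = [0.0] * n
    for _ in range(iters):
        k = [0.0 if i in A
             else 1.0 + sum(P[i][j] * k[j] for j in range(n) if j not in A)
             for i in range(n)]
    return k

# Gambler's ruin on {0, 1, 2, 3}: 0 and 3 absorbing, fair coin in between.
P = [[1.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 1.0]]

h = hitting_probabilities(P, A={3})
k = mean_hitting_times(P, A={0, 3})
print([round(x, 6) for x in h])  # [0.0, 0.333333, 0.666667, 1.0]
print([round(x, 6) for x in k])  # [0.0, 2.0, 2.0, 0.0]
```

Iteration is used only to keep the sketch dependency-free; solving the linear system directly works equally well.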

35
Q

Prove the following theorem.

36
Q

Define fij(n).

A

fij(n) = Pi(X1 ≠ j, ..., Xn-1 ≠ j, Xn = j), the probability that, starting from i, the chain first visits j at time n.

37
Q

Define when a state i is recurrent.

A

State i is recurrent if Pi(Xn = i for infinitely many n) = 1; it is transient if this probability is 0.

38
Q

What is another way to interpret this definition?

39
Q

What is the lemma about the probability, starting from i, of eventually hitting j at least k times?

40
Q

Prove the following Lemma.

41
Q

What is the theorem about i being transient or recurrent?

A

If Σn≥0 pii(n) = ∞ then i is recurrent; if Σn≥0 pii(n) < ∞ then i is transient.
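The dichotomy Σn pii(n) = ∞ vs < ∞ can be illustrated on a made-up two-state chain where state 0 leaks into the absorbing state 1, so pii(n) behaves very differently for the two states:

```python
def n_step(P, n):
    # n-step transition matrix by repeated multiplication.
    size = len(P)
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = [[sum(result[i][k] * P[k][j] for k in range(size))
                   for j in range(size)] for i in range(size)]
    return result

# State 0 leaks to the absorbing state 1: 0 is transient, 1 is recurrent.
P = [[0.5, 0.5],
     [0.0, 1.0]]

# Partial sums of p_ii(n): here p_00(n) = 0.5^n, so the sum converges to 2,
# while p_11(n) = 1 for every n, so the partial sums grow without bound.
s0 = sum(n_step(P, n)[0][0] for n in range(60))
s1 = sum(n_step(P, n)[1][1] for n in range(60))
print(round(s0, 6))  # 2.0   (finite sum: state 0 transient)
print(round(s1, 6))  # 60.0  (diverging partial sums: state 1 recurrent)
```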

42
Q

Prove the following theorem.

43
Q

Finish the following theorem.

44
Q

Prove the following theorem.

45
Q

Finish the following Lemma: Every recurrent class is … ?

A

Every recurrent class is closed.

46
Q

Prove the following Lemma.