Chapter 4 Flashcards

1
Q

Define a finite Markov chain

A

The process (Xn) is said to be a finite Markov chain if it satisfies the Markov property and its state space is finite. We will usually take the state space to be {1, …, N} or {0, 1, …, N}.

2
Q

What does p(i, j) denote?

A

The one-step transition probability: the probability of moving from state i to state j in one step, i.e. the probability of moving to state j given that we are currently in state i: p(i, j) = P(Xn+1 = j | Xn = i).

3
Q

What is pn(i)?

A

pn(i) = P(Xn = i), the probability that the chain is in state i at time n.

4
Q

Define a stochastic matrix

A

A stochastic matrix is a square matrix whose rows are probability vectors. A probability vector is a vector whose entries are real numbers between 0 and 1 that sum to 1. (Since p(i, j) is the probability of moving from i to j, each row i of the transition matrix P must sum to 1.)
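
A minimal sketch of checking this definition (assuming Python with numpy; the matrix P is an arbitrary toy example, not from the text):

```python
import numpy as np

# Toy 3-state transition matrix: entry (i, j) is p(i, j).
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

def is_stochastic(M):
    """True iff M is square, entries lie in [0, 1], and each row sums to 1."""
    square = M.shape[0] == M.shape[1]
    in_range = np.all((M >= 0) & (M <= 1))
    rows_sum_to_one = np.allclose(M.sum(axis=1), 1.0)
    return square and in_range and rows_sum_to_one

print(is_stochastic(P))  # True
```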

5
Q

What is the distribution of a Markov chain determined by?

A

It is determined by its initial distribution and its transition probabilities: the distribution at time n is pn = p0 Pn, where p0 is the initial distribution written as a row vector.
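
A hedged illustration of this fact (same assumptions and toy matrix as above):

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])
p0 = np.array([1.0, 0.0, 0.0])  # start in the first state with probability 1

# The initial distribution and the transition probabilities together
# determine the distribution at every time n: pn = p0 P^n.
pn = p0 @ np.linalg.matrix_power(P, 10)
print(pn)  # the distribution of X10; entries sum to 1
```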

6
Q

How do we interpret the matrix Pn (the nth power of the transition matrix P)?

A

The (i, j)th entry of the matrix Pn is the probability of going from state i to state j in n steps: Pn(i, j) = P(Xm+n = j | Xm = i).
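
A quick numerical check of this interpretation (a sketch with the same toy matrix; the two-step case is the Chapman-Kolmogorov identity):

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

# Probability of going from state 0 to state 2 in two steps, summing
# over every intermediate state k: sum_k p(0, k) * p(k, 2) ...
two_step = sum(P[0, k] * P[k, 2] for k in range(3))

# ... which is exactly the (0, 2) entry of P squared.
P2 = np.linalg.matrix_power(P, 2)
print(two_step, P2[0, 2])  # both 0.125
```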

7
Q

What are other names for the limiting distribution?

A

Equilibrium distribution, invariant distribution, stationary distribution.

8
Q

What does it mean when two states both lead to each other?

A

If states i and j both lead to each other, we say they communicate, written i ↔ j. The relation ↔ is an equivalence relation, so it partitions the state space into disjoint sets called communication classes.
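
One way to compute these classes: a sketch (assuming numpy; the helper name communication_classes is made up for illustration) that builds the reachability relation and intersects it with its transpose:

```python
import numpy as np

def communication_classes(P):
    """Partition the states of P into communication classes."""
    n = P.shape[0]
    # reach[i, j]: i can reach j in zero or more steps.
    reach = np.eye(n, dtype=bool) | (P > 0)
    for _ in range(n):  # iterate composition to a fixed point
        reach = (reach.astype(int) @ reach.astype(int)) > 0
    comm = reach & reach.T  # i <-> j: each state reaches the other
    seen, classes = set(), []
    for i in range(n):
        if i not in seen:
            cls = sorted(j for j in range(n) if comm[i, j])
            seen.update(cls)
            classes.append(cls)
    return classes

# Reducible toy chain: state 2 is absorbing.
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0]])
print(communication_classes(P))  # [[0, 1], [2]]
```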

9
Q

Define irreducible and reducible

A

If there is only one communication class, the chain is said to
be irreducible. Otherwise, it is said to be reducible.

10
Q

Is a random walk with reflecting/absorbing boundaries irreducible or reducible?

A

Reflecting boundaries: irreducible.
Absorbing boundaries: reducible (each absorbing endpoint is its own communication class).
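
A sketch of both constructions (assuming numpy and a symmetric walk on {0, …, 4}; the helper is illustrative):

```python
import numpy as np

def random_walk(n_states, boundary):
    """Symmetric random walk on {0, ..., n_states - 1}."""
    P = np.zeros((n_states, n_states))
    for i in range(1, n_states - 1):
        P[i, i - 1] = P[i, i + 1] = 0.5  # interior: step left or right
    if boundary == "reflecting":
        P[0, 1] = P[-1, -2] = 1.0        # bounce off the walls
    else:                                # absorbing
        P[0, 0] = P[-1, -1] = 1.0        # once at a wall, stay forever
    return P

print(random_walk(5, "reflecting"))  # irreducible: one communication class
print(random_walk(5, "absorbing"))   # reducible: {0} and {4} are absorbing
```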

11
Q

Define a transient state

A

A state i is transient if there is a non-zero probability that the chain never returns to state i; hence a transient state is visited only a finite number of times. Equivalently, for a finite chain: a state i is transient iff there exists a state j for which i → j but j does not lead back to i.

12
Q

What is the opposite of a transient state?

A

A recurrent (also called ergodic) state.

13
Q

Define a recurrent state

A

The chain will definitely return to state i again; hence, recurrent states are visited infinitely often. A special type of recurrent state is an absorbing state, which the chain can never leave.

14
Q

What can be said about the periods of states in the same communication class?

A

If i communicates with j, then i and j have the same period; the period is a property of the communication class.

15
Q

Define an aperiodic state

A

A state i is aperiodic if it has period 1, i.e. gcd{n ≥ 1 : Pn(i, i) > 0} = 1.
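
A hedged sketch of computing a period directly from this definition (assuming numpy; the period helper is made up for illustration):

```python
import math
import numpy as np

def period(P, i, max_steps=50):
    """gcd of all n <= max_steps with P^n(i, i) > 0 (0 if none found)."""
    d = 0
    Pn = np.eye(P.shape[0])
    for n in range(1, max_steps + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            d = math.gcd(d, n)
    return d

# Deterministic 2-cycle: returns to each state only at even times.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(period(P, 0))  # 2

# Adding a self-loop makes state 0 aperiodic.
Q = np.array([[0.5, 0.5],
              [1.0, 0.0]])
print(period(Q, 0))  # 1
```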

16
Q

How do we know when a Markov chain has a unique stationary distribution?

A

If the Markov chain {Xt} is aperiodic and irreducible, then it has a unique stationary distribution π, and Pn converges to a limit matrix whose rows all equal π (Perron-Frobenius theorem).
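
One common way to compute π numerically: a sketch (assuming numpy; the toy matrix below is aperiodic and irreducible) that extracts the left eigenvector of P for eigenvalue 1:

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

# pi solves pi P = pi, so pi is a left eigenvector of P (equivalently,
# an eigenvector of P transpose) with eigenvalue 1, normalised to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

print(pi)                                 # (0.25, 0.5, 0.25)
print(np.linalg.matrix_power(P, 100)[0])  # each row of P^n converges to pi
```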

17
Q

What does the ergodic theorem tell us?

A

The proportion of visits to a state i in the first n steps converges to the limiting probability π(i).
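
A hedged simulation check of the theorem (assuming numpy; same toy chain as above, whose stationary distribution is (0.25, 0.5, 0.25)):

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

n_steps, state = 100_000, 0
visits = np.zeros(3)
for _ in range(n_steps):
    visits[state] += 1
    state = rng.choice(3, p=P[state])  # take one step of the chain

print(visits / n_steps)  # close to pi = (0.25, 0.5, 0.25)
```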

18
Q

Define, in words, the return time of a state for an irreducible aperiodic Markov chain

A

How long do we wait before returning to a state j: the return time of j is the shortest time n ≥ 1 at which the chain is back in state j, starting from j.
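
A sketch estimating a mean return time by simulation (assuming numpy; same toy chain as above). For an irreducible chain, the expected return time to j equals 1/π(j):

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

def return_time(j):
    """One sample of the return time: the first n >= 1 with Xn = j, given X0 = j."""
    state, n = j, 0
    while True:
        state = rng.choice(3, p=P[state])
        n += 1
        if state == j:
            return n

samples = [return_time(1) for _ in range(20_000)]
print(np.mean(samples))  # close to 1 / pi(1) = 1 / 0.5 = 2
```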