Markov Chain Flashcards

1
Q

What is a finite–space Markov Chain?

A

One whose state space is a finite set of discrete values, or "states", that the Markov chain can take

2
Q

What is a unigram, bigram, trigram and a 4–gram?

A

Unigram – each term in the chain is drawn independently at random

Bigram – the distribution of each term depends only on the previous term

Trigram – each term depends on the previous 2 terms

4–gram – each term depends on the previous 3 terms

3
Q

What is a Transition Matrix?

A

A matrix whose (j, k) entry is the probability of transitioning from state j to state k, so each row holds one state's outgoing transition probabilities.

i.e. row 1 contains the probabilities of moving from state 1 to each of the states.

4
Q

After n passes of time, what is the transition matrix?

A

The original (one-step) transition matrix raised to the power n

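The n-step rule above is easy to check numerically. The 2-state matrix below is a hypothetical example, not from the cards:

```python
import numpy as np

# Hypothetical 2-state transition matrix; each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The transition matrix after 10 time steps is P to the 10th power.
P10 = np.linalg.matrix_power(P, 10)
```

Note that `P10` still has rows summing to 1, and its rows are already nearly identical: the chain is forgetting its initial state, which previews the limiting-distribution cards below.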
5
Q

What is a limiting distribution?

A

The distribution that the state probabilities converge to as time tends to infinity, regardless of the initial distribution

6
Q

What is true for every limiting distribution?

A

Every limiting distribution is also a stationary distribution (the converse is not always true)

7
Q

What must be satisfied in order to be a stationary distribution?

A

The distribution π must satisfy πP = π: if the chain starts distributed according to π, it remains distributed according to π after every time step

8
Q

What is true for every NxN transition matrix, P?

A

1 is always an eigenvalue

This is proved from the definition of an eigenvalue and the fact that every row of P sums to 1: the all-ones vector satisfies P·1 = 1

9
Q

When is a process not regular ergodic for a stationary distribution?

A

When there is more than one stationary distribution

Equivalently, when the long-run distribution depends on the initial condition

10
Q

When can two states of a Markov chain be said to communicate with each other?

A

When there is a path between the two states (both ways)

11
Q

What is a recurrent set?

A

A set of states whose members all communicate with each other.

Once entered, it is impossible to move outside the set

12
Q

What is a transient state?

A

A state outside every recurrent set

If the chain starts in this state, it will eventually leave it and never return

13
Q

What is an irreducible Markov chain?

A

One where all states communicate with each other

14
Q

What is the waiting time to get to a state?

A

The sum, over next states k, of the probability of moving to k multiplied by (1 + the waiting time from k):

w_j = Σ_k P_j,k (1 + w_k) = 1 + Σ_k P_j,k w_k, with w = 0 at the target state

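The waiting-time relation is a linear system, so expected hitting times can be found with one solve. The chain and target state here are hypothetical:

```python
import numpy as np

# Hypothetical 3-state chain; we want the expected time to reach state 2.
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])
target = 2

# For j != target:  w_j = 1 + sum_{k != target} P[j, k] * w_k
# Rearranged:       (I - P_sub) w = 1,  where P_sub drops the target row/col.
others = [j for j in range(len(P)) if j != target]
A = np.eye(len(others)) - P[np.ix_(others, others)]
w = np.linalg.solve(A, np.ones(len(others)))
```

For this particular matrix the solve gives w = [8, 6]: starting from state 0 it takes 8 steps on average to first reach state 2.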
15
Q

What is the period of a state k?

A

The greatest common divisor of the set of step counts n for which a return from state k to state k is possible (i.e. the n with (P^n)_k,k > 0).

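The gcd definition of the period translates directly into a short check. The two test chains here are hypothetical: one that alternates between two states (period 2) and one with self-loops (period 1):

```python
import numpy as np
from math import gcd
from functools import reduce

def period(P, k, n_max=20):
    """gcd of all step counts n (up to n_max) with (P^n)[k, k] > 0."""
    returns = [n for n in range(1, n_max + 1)
               if np.linalg.matrix_power(P, n)[k, k] > 0]
    return reduce(gcd, returns)

flip = np.array([[0.0, 1.0],
                 [1.0, 0.0]])   # always swaps states -> returns only at even n
lazy = np.array([[0.5, 0.5],
                 [0.5, 0.5]])   # can stay put -> returns at every n
```

`period(flip, 0)` is 2 and `period(lazy, 0)` is 1, matching the next card: a state with period 1 is aperiodic.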
16
Q

When is a state periodic?

A

When a state has a period greater than 1 (a state with period 1 is aperiodic)

17
Q

When is a Markov chain aperiodic?

A

When all states are aperiodic

18
Q

If a chain is irreducible and aperiodic, what is it also?

A

It is also regular ergodic and will have a limiting distribution

19
Q

How many eigenvalues of the transition matrix are there for a Markov chain with periodicity k?

A

k eigenvalues of magnitude 1 (the k-th roots of unity)

20
Q

What are the magnitudes of the eigenvalues?

A

All k of them have magnitude 1 (they lie on the unit circle)

21
Q

When does a distribution π satisfy detailed balance with a transition matrix P?

A

π_j P_j,k = π_k P_k,j for all states j and k
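Detailed balance is easy to test numerically, and it implies stationarity. The reversible 3-state chain below is a hypothetical example:

```python
import numpy as np

# Hypothetical reversible chain: a lazy random walk on 3 states.
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])
pi = np.array([0.25, 0.5, 0.25])  # candidate distribution

# Detailed balance: pi_j * P[j, k] == pi_k * P[k, j] for all j, k,
# i.e. the matrix of probability flows is symmetric.
flow = pi[:, None] * P
balanced = np.allclose(flow, flow.T)
```

Since detailed balance sums (over j) to πP = π, any π in detailed balance with P is automatically stationary for P.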

22
Q

What is a birth–death process?

A

One where a number of items increases (births) and decreases (deaths) at given rates

23
Q

What is n(t) for a birth–death process?

A

The number of items present at time t

24
Q

For a change in time dt (t+dt), what is the probability of having n items when only birth is considered?

A

P_n(t + dt) = P_n(t)(1 − λ dt) + P_(n−1)(t) λ dt

The first term (shown in blue in the notes) is the probability of no birth in dt; the second (in red) is the probability of one birth

25
Q

What is the difference between continuous and discrete time?

A

In discrete time there is a transition matrix, P

In continuous time there is a transition rate matrix, Q

26
Q

What is the transition rate matrix Q like?

A

dx(t)/dt = x(t)Q, where x_n(t) = P_n(t) is the probability of having n items at time t

The probabilities over all n sum to 1 at every time

Each row of Q sums to 0 (so probability mass is conserved)
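The rate equation dx/dt = xQ can be integrated directly. Below is a minimal sketch with a hypothetical 2-state Q (off-diagonal entries are rates, rows sum to 0), using simple Euler steps:

```python
import numpy as np

# Hypothetical 2-state rate matrix; each row sums to 0.
Q = np.array([[-2.0, 2.0],
              [1.0, -1.0]])

x = np.array([1.0, 0.0])   # start in state 0 with probability 1
dt = 1e-4
for _ in range(100_000):   # Euler integration of dx/dt = x @ Q up to t = 10
    x = x + dt * (x @ Q)
```

Because Q's rows sum to 0, each Euler step preserves the total probability exactly, and x relaxes toward the equilibrium solving xQ = 0 (here (1/3, 2/3)).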

27
Q

What is the probability of a certain number of items when death rate is introduced?

A

P_n(t + dt) = P_n(t)(1 − (λ + μ) dt) + P_(n−1)(t) λ dt + P_(n+1)(t) μ dt

The first term (blue in the notes) is the probability of nothing occurring, the second (red) of a birth occurring, and the third (green) of a death occurring

28
Q

How can Brownian Motion be described?

A

As the limit of a random walk where the interval between steps tends to 0 (with the step size shrinking accordingly)
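That limiting construction can be sketched directly: take ±1 steps at interval dt, scaled by √dt so the variance grows linearly in time. The interval and horizon below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random walk with step interval dt and step size sqrt(dt):
# as dt -> 0 this approximates a Wiener path on [0, 1].
dt = 1e-4
steps = rng.choice([-1.0, 1.0], size=10_000) * np.sqrt(dt)
W = np.concatenate([[0.0], np.cumsum(steps)])
```

After 10,000 steps of size √dt, Var(W(1)) = 10,000 · dt = 1, matching the Gaussian increment W(1) ~ N(0, 1) of the standard Wiener process.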

29
Q

What is used in Brownian Motion?

A

Density of items rather than number

Items leave area (lowering density)

Or enter area (increasing density)

30
Q

What is used to investigate Brownian Motion?

A

A Taylor series expansion of the density f(x, t) in both time and position

31
Q

What is the result?

A

The diffusion equation, ∂f/∂t = D ∂²f/∂x²

32
Q

What are the four properties of a one–dimensional Wiener Process?

A

Independence: the path between two points is independent of anything before that path

Stationarity: the distribution of an increment depends only on its duration, not on when it starts

Continuity: Wt is a continuous function of t

Gaussianity: Wt is Gaussian

33
Q

What is the correlation of the process evaluated at two times t and s?

A

E[Wt Ws] = 2α min(t, s)

34
Q

What is a simple extension to the Wiener Process?

A

Adding a drift term, e.g. Xt = Wt + μt

35
Q

What are Monte–Carlo Methods?

A

A general class of approaches that rely on random samples

Often used when analytical approaches fail

Used to estimate complicated integrals

Essentially a histogram-style method
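The integral-estimation use case above fits in a few lines. The integrand x² on [0, 1] is a hypothetical example (true value 1/3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte-Carlo estimate of the integral of x^2 over [0, 1]:
# average the integrand over uniform random samples.
x = rng.uniform(0.0, 1.0, size=100_000)
estimate = np.mean(x ** 2)
```

The estimator's standard error shrinks like 1/√N, which is why both the sample count and the variance matter in the cards below.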

36
Q

Rather than histograms, what can be done?

A

Sampling from a weighted function w(x)

37
Q

What should be minimised in Monte–Carlo methods?

A

The number of samples needed

The variance (standard deviation) of the estimate

38
Q

What can be done to get a better set of samples, or when we can't sample from p(x) directly and must use another pdf?

A

Importance sampling

39
Q

What is the equation for importance sampling?

A

E_p[H(x)] = E_q[H(x) p(x)/q(x)] ≈ (1/N) Σ_i H(x_i) p(x_i)/q(x_i), with the x_i drawn from q

p(x)/q(x) is the importance (weight) of the drawn sample
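A minimal sketch of that estimator, under assumed densities: the target p is N(0, 1), the proposal q is a wider N(0, 2²), and H(x) = x² (so the true answer E_p[x²] = 1):

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, sigma):
    """Zero-mean Gaussian density with standard deviation sigma."""
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

H = lambda x: x ** 2                          # quantity of interest
x = rng.normal(0.0, 2.0, size=200_000)        # samples from q = N(0, 4)
w = normal_pdf(x, 1.0) / normal_pdf(x, 2.0)   # importance weights p/q
estimate = np.mean(w * H(x))
```

Choosing q wider than p keeps the weights bounded, which keeps the estimator's variance under control.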

40
Q

With biased sampling how does q(x) affect results?

A

The closer q(x) is to the shape of H(x)p(x), the better the biased sampling does at reducing the variance

41
Q

What is rejection sampling?

A

Drawing a box around the target curve

Sampling uniformly from the box

Rejecting samples that don't fall under the curve
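The three steps above can be sketched directly. The target here is a hypothetical density p(x) ∝ sin(x) on [0, π], with the bounding box [0, π] × [0, 1]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample from p(x) proportional to sin(x) on [0, pi]:
# draw uniformly inside the box, keep points under the curve.
n = 100_000
x = rng.uniform(0.0, np.pi, size=n)
y = rng.uniform(0.0, 1.0, size=n)
samples = x[y < np.sin(x)]

acceptance_rate = samples.size / n  # area under curve / box area = 2/pi
```

The acceptance rate equals the ratio of the area under the curve to the box area (2/π ≈ 0.64 here), which is exactly why the method becomes wasteful in high dimensions.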

42
Q

What is the problem with rejection sampling?

A

Very wasteful

Curse of dimensionality

43
Q

What is the Metropolis–Hastings Algorithm?

A

Using Markov chain theory to draw samples from a distribution p(x)

44
Q

When is a sample accepted?

A

A proposed sample is always accepted if it has higher target probability than the current one; otherwise it is accepted with probability alpha = p(x')/p(x) (for a symmetric proposal)
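That acceptance rule gives a compact random-walk Metropolis sampler. This sketch assumes a symmetric Gaussian proposal and uses a standard normal (unnormalised) as the target, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    """Unnormalised target density p(x): a standard normal here."""
    return np.exp(-x**2 / 2)

# Random-walk Metropolis: propose x' = x + noise (symmetric proposal);
# accept always if p(x') >= p(x), else with probability p(x') / p(x).
x = 0.0
samples = []
for _ in range(50_000):
    proposal = x + rng.normal(0.0, 1.0)
    alpha = target(proposal) / target(x)
    if rng.uniform() < alpha:
        x = proposal
    samples.append(x)

samples = np.array(samples[5_000:])  # discard burn-in
```

The accepted states form a Markov chain whose stationary distribution is p(x), so after burn-in the sample mean and standard deviation should match the target's (0 and 1 here). Note the normalising constant of p cancels in alpha, which is the whole point of the method.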