Markov Chains Flashcards

1
Q

What is a Markov chain?

A

A probabilistic model of sequences; specifically, a 1st order Markov process.

2
Q

What can a Markov chain be thought of as? What does this mean?

A

A generative model: a model of how sequences of objects can be produced in a random or probabilistic fashion.
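
For example, a minimal Python sketch of using a Markov chain generatively, i.e. sampling a random sequence. The state space {'a', 'b'} and all transition probabilities are invented for illustration; 'START' holds p(s_1) and 'STOP' ends the sequence.

    import random

    # Hypothetical 1st order chain; every row sums to one.
    P = {
        'START': {'a': 0.6, 'b': 0.4},
        'a':     {'a': 0.3, 'b': 0.5, 'STOP': 0.2},
        'b':     {'a': 0.4, 'b': 0.4, 'STOP': 0.2},
    }

    def sample_sequence(P):
        seq, state = [], 'START'
        while True:
            # pick the next state with probability p(next | current)
            nxt = random.choices(list(P[state]), weights=P[state].values())[0]
            if nxt == 'STOP':
                return seq
            seq.append(nxt)
            state = nxt

    print(sample_sequence(P))   # e.g. ['a', 'b', 'b']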

3
Q

What is an nth order Markov process?

A

A probability distribution over sequences. The objects in the sequence belong to a state space. The probability of s_t being generated at time t is only determined by the n values before it in the sequence:
p(s_t | s_{t-1}, s_{t-2}, …, s_1) = p(s_t | s_{t-1}, s_{t-2}, …, s_{t-n})
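
As a small illustration (Python, with n = 2 and invented probabilities), the conditional distribution of the next symbol looks only at the last n symbols of the history:

    n = 2
    # hypothetical 2nd order probabilities p(s_t | s_{t-1}, s_{t-2}), keyed by (s_{t-2}, s_{t-1})
    p2 = {('a', 'a'): {'a': 0.1, 'b': 0.9}, ('a', 'b'): {'a': 0.6, 'b': 0.4},
          ('b', 'a'): {'a': 0.5, 'b': 0.5}, ('b', 'b'): {'a': 0.8, 'b': 0.2}}

    def p_next(history):
        # p(s_t | s_{t-1}, ..., s_1) = p(s_t | s_{t-1}, ..., s_{t-n}): older symbols are ignored
        return p2[tuple(history[-n:])]

    print(p_next(['b', 'b', 'a', 'b']))   # identical to p_next(['a', 'b'])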

4
Q

What is a zeroth order Markov process?

A

A sequence of objects in which each object is generated independently of the ones before it: p(s_t | s_{t-1}, …, s_1) = p(s_t)
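
A two-line Python illustration (symbol probabilities invented): every symbol is drawn from the same distribution, ignoring the history entirely.

    import random
    # zeroth order: each of the 5 symbols is sampled independently
    print([random.choices(['a', 'b'], weights=[0.7, 0.3])[0] for _ in range(5)])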

5
Q

What is a homogeneous Markov process?

A

Transition probabilities remain fixed over time

6
Q

Given a directed graph, how do we calculate p(s)?

A

p(s) = p(s_1) p(s_2 | s_1) p(s_3 | s_2) … p(STOP | s_T)
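
A minimal Python sketch of this product (the transition probabilities are invented; 'START' holds p(s_1)):

    P = {
        'START': {'a': 0.6, 'b': 0.4},
        'a':     {'a': 0.3, 'b': 0.5, 'STOP': 0.2},
        'b':     {'a': 0.4, 'b': 0.4, 'STOP': 0.2},
    }

    def sequence_probability(P, seq):
        # p(s) = p(s_1) p(s_2 | s_1) ... p(STOP | s_T)
        prob, prev = 1.0, 'START'
        for s in list(seq) + ['STOP']:
            prob *= P[prev][s]
            prev = s
        return prob

    print(sequence_probability(P, ['a', 'b', 'a']))   # 0.6 * 0.5 * 0.4 * 0.2 = 0.024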

7
Q

What is a normalisation condition in a Markov chain model?

A

The transition probabilities on the arrows leaving each state sum to one:

sum over all s_t in S of p(s_t | s_{t-1}) = 1
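
A quick Python check of this condition on a transition table stored as a dict of dicts (values invented):

    P = {
        'START': {'a': 0.6, 'b': 0.4},
        'a':     {'a': 0.3, 'b': 0.5, 'STOP': 0.2},
        'b':     {'a': 0.4, 'b': 0.4, 'STOP': 0.2},
    }
    for state, row in P.items():
        # the arrows leaving each state must carry a total probability of one
        assert abs(sum(row.values()) - 1.0) < 1e-9, state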

8
Q

What is a transition matrix?

A

The transition probabilities written as a table, with one row per current state and one column per next state. The numbers in each row sum to one.
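
The same idea sketched with NumPy (the numbers are illustrative, and STOP is treated here as an absorbing state so that its row also sums to one):

    import numpy as np

    states = ['a', 'b', 'STOP']
    # entry (i, j) holds p(next = states[j] | current = states[i])
    A = np.array([
        [0.3, 0.5, 0.2],   # from 'a'
        [0.4, 0.4, 0.2],   # from 'b'
        [0.0, 0.0, 1.0],   # from 'STOP'
    ])
    assert np.allclose(A.sum(axis=1), 1.0)   # every row sums to one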

9
Q

What do we gain from unfolding in time?

A

A display of all the possible sequences of a fixed length T and their associated probabilities. The transition probabilities leaving each state in the unfolded diagram do not sum to one, because the diagram does not represent all possible sequences, only those of length T.

10
Q

How can we write the sum over all possible sequences between s_1 = n and s_{T+1} = STOP?

A

sum over s_2, …, s_T in S of p(s_1 = n and s_2 and s_3 … and s_T and s_{T+1} = STOP)
= sum over s_2, …, s_T in S of p(s_1 = n) p(s_2 | s_1 = n) p(s_3 | s_2) … p(STOP | s_T)
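
A brute-force Python version of this sum, enumerating every length-T sequence with itertools (the transition probabilities and the choice T = 4, n = 'a' are invented); the recursion on the later card computes the same value far more cheaply:

    from itertools import product

    P = {
        'START': {'a': 0.6, 'b': 0.4},
        'a':     {'a': 0.3, 'b': 0.5, 'STOP': 0.2},
        'b':     {'a': 0.4, 'b': 0.4, 'STOP': 0.2},
    }
    S, T, n = ['a', 'b'], 4, 'a'

    total = 0.0
    for rest in product(S, repeat=T - 1):          # all |S|^(T-1) choices of s_2 ... s_T
        seq, prob, prev = (n,) + rest, 1.0, 'START'
        for s in seq + ('STOP',):
            prob *= P[prev][s]                     # p(s_1 = n) p(s_2 | s_1 = n) ... p(STOP | s_T)
            prev = s
        total += prob
    print(total)   # 0.06144 with these made-up numbers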

11
Q

How many operations are needed when calculating by unfolding in time?

A

(T-2)|S|^2

12
Q

Write a recursion relation to demonstrate unfolding in time

A

Base case:
p(s_2 and s_1 = n) = p(s_2 | s_1 = n) p(s_1 = n)
Step case, for 2 <= t <= T:
p(s_{t+1} and s_1 = n) = sum over s_t in S of p(s_{t+1} | s_t) p(s_t and s_1 = n)
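
A Python sketch of this recursion as dynamic programming, reusing the same made-up transition probabilities; alpha[s] plays the role of p(s_t and s_1 = n):

    P = {
        'START': {'a': 0.6, 'b': 0.4},
        'a':     {'a': 0.3, 'b': 0.5, 'STOP': 0.2},
        'b':     {'a': 0.4, 'b': 0.4, 'STOP': 0.2},
    }
    S, T, n = ['a', 'b'], 4, 'a'

    # base case: p(s_2 and s_1 = n) = p(s_2 | s_1 = n) p(s_1 = n)
    alpha = {s2: P[n][s2] * P['START'][n] for s2 in S}

    # step case: (T - 2) passes, each doing about |S|^2 multiplications
    for t in range(2, T):
        alpha = {nxt: sum(P[s][nxt] * alpha[s] for s in S) for nxt in S}

    # finish by multiplying in p(STOP | s_T) and summing over s_T
    print(sum(alpha[s] * P[s]['STOP'] for s in S))   # 0.06144, matching the brute-force sum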

13
Q

How do we estimate transition probabilities for a Markov chain model?

A

Fit a Markov chain model to training data.
The simplest approach is to use frequency counts from the sequences in the training data and set the transition probabilities to be proportional to these counts.
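
A minimal Python sketch of this count-and-normalise estimate (the training sequences here are invented):

    from collections import Counter, defaultdict

    train = [['a', 'b', 'b', 'a'], ['b', 'a', 'a'], ['a', 'b', 'a']]   # toy training data

    counts = defaultdict(Counter)
    for seq in train:
        prev = 'START'
        for s in seq + ['STOP']:
            counts[prev][s] += 1   # count every observed transition
            prev = s

    # transition probabilities proportional to the counts: each row normalised to sum to one
    P = {prev: {s: c / sum(row.values()) for s, c in row.items()}
         for prev, row in counts.items()}
    print(P['a'])   # estimated probabilities out of state 'a'; they sum to one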

14
Q

How can a higher order Markov process be mapped onto a 1st order process?

A

Using an extended alphabet. For example, a second order model with state space S = {a, b} is equivalent to a 1st order model with state space {aa, ab, ba, bb}.
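
A Python sketch of the mapping for the second order example: pairs of original symbols become the states of an equivalent 1st order chain (the 2nd order probabilities are invented).

    from itertools import product

    S = ['a', 'b']
    extended = [x + y for x, y in product(S, repeat=2)]   # ['aa', 'ab', 'ba', 'bb']

    # hypothetical 2nd order probabilities p(next | previous, one-before-previous),
    # keyed here as (one-before-previous, previous)
    p2 = {('a', 'a'): {'a': 0.1, 'b': 0.9}, ('a', 'b'): {'a': 0.6, 'b': 0.4},
          ('b', 'a'): {'a': 0.5, 'b': 0.5}, ('b', 'b'): {'a': 0.8, 'b': 0.2}}

    # equivalent 1st order transitions between extended states: 'xy' -> 'yz' with prob p(z | y, x)
    P1 = {x + y: {y + z: p2[(x, y)][z] for z in S} for x, y in product(S, repeat=2)}
    print(P1['ab'])   # {'ba': 0.6, 'bb': 0.4} -- rows of the extended chain still sum to one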

15
Q

What is a triphone Markov model?

A

A 3rd order phoneme Markov process mapped onto a 1st order process with an extended state space.

16
Q

Why does a triphone model improve the performance of hidden Markov model speech recognition methods?

A

The context of a phoneme can strongly influence the way that it is spoken.