2020 A2 Flashcards

1
Q

Contrast hard and soft allocation EM algorithms with one another and provide an example of
each.

A

Hard allocation - associates each feature vector with exactly one missing label, the most probable one (e.g. Viterbi re-estimation).

Soft allocation - partially/probabilistically associates each feature vector with multiple labels, weighted by their posterior probabilities given one's current probabilistic knowledge of the situation (e.g. Baum-Welch re-estimation).
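
A minimal numerical sketch of the difference, using the E-step of a two-component GMM (NumPy/SciPy assumed; the data and parameter values are made up for illustration). The soft version spreads each point's responsibility across both components, while the hard version sends it entirely to the most probable one:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])

means, stds, weights = np.array([-1.0, 1.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5])

# Per-component weighted likelihoods, shape (N, 2)
lik = weights * np.stack([norm.pdf(x, m, s) for m, s in zip(means, stds)], axis=1)

# Soft allocation (Baum-Welch style): fractional responsibilities
resp_soft = lik / lik.sum(axis=1, keepdims=True)

# Hard allocation (Viterbi style): each point goes entirely to its most probable component
resp_hard = np.zeros_like(resp_soft)
resp_hard[np.arange(len(x)), lik.argmax(axis=1)] = 1.0

# M-step mean update with soft counts; swapping in resp_hard gives the hard-allocation update
for resp in (resp_soft, resp_hard):
    nk = resp.sum(axis=0)
    print("updated means:", (resp.T @ x) / nk)
```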

2
Q

Suppose you have fitted a GMM to data, and someone then points out there should be temporal structure in your data. How can you use your GMM to initialise an HMM?

A
  1. Create an HMM with the number of states equal to the number of mixture components in the GMM.
  2. Assign the parameters found for each Gaussian mixture component to the Gaussian density function associated with the corresponding state of the HMM.
  3. Estimate the state transition probabilities from the responsibilities over the data set.
  4. The HMM is now initialised; run EM to optimise it for the given data set (see the sketch below).
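
A sketch of these four steps, assuming the scikit-learn and hmmlearn libraries; the data X and the use of the mixture weights as initial-state probabilities are illustrative assumptions, not from the source:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from hmmlearn.hmm import GaussianHMM

X = np.random.default_rng(1).normal(size=(500, 2))  # placeholder observation sequence
K = 3

gmm = GaussianMixture(n_components=K, covariance_type="full").fit(X)

# Step 3: soft transition counts from consecutive responsibilities
r = gmm.predict_proba(X)              # responsibilities, shape (T, K)
A = r[:-1].T @ r[1:]                  # expected transitions i -> j
A /= A.sum(axis=1, keepdims=True)     # normalise rows

hmm = GaussianHMM(n_components=K, covariance_type="full", init_params="")
hmm.startprob_ = gmm.weights_         # mixture weights as initial-state probabilities
hmm.transmat_ = A
hmm.means_ = gmm.means_               # step 2: copy the Gaussian parameters
hmm.covars_ = gmm.covariances_

hmm.fit(X)                            # step 4: refine with EM
```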
3
Q

Give a situation where a cyclic topology would be appropriate.

A

Any situation with a repetitive pattern, e.g. modelling a walking gait or heartbeats in an ECG, where the process repeatedly cycles through the same sequence of states.
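
For instance, a cyclic topology can be encoded as a left-to-right transition matrix with a wrap-around edge; a minimal NumPy sketch (the stay probability 0.8 is an arbitrary illustrative value):

```python
import numpy as np

K, stay = 4, 0.8
A = np.zeros((K, K))
for i in range(K):
    A[i, i] = stay                  # remain in the current state
    A[i, (i + 1) % K] = 1.0 - stay  # advance; the wrap-around edge closes the cycle
print(A)
```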

4
Q

Suggest a sensible approach to initialise the training of a cyclic HMM from a data set.

A

The initialisation of the parameter values can have a big impact on the performance of an HMM, so different methods can produce vastly different results. A sensible approach is to split each training sequence uniformly into as many segments as there are states, assign the segments to the states in cyclic order, and estimate each state's initial density from its segments. One can then refine the model with either the EM (Baum-Welch) algorithm or Viterbi re-estimation.
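
A sketch of this uniform-segmentation initialisation, assuming hmmlearn; the segment split, stay probability, and placeholder data are illustrative assumptions:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

X = np.random.default_rng(2).normal(size=(400, 2))  # placeholder single sequence
K = 4

segments = np.array_split(X, K)       # uniform split across the cycle of states
means = np.stack([s.mean(axis=0) for s in segments])
covs = np.stack([np.cov(s.T) + 1e-6 * np.eye(X.shape[1]) for s in segments])

A = np.zeros((K, K))                  # cyclic topology, as in the previous card
for i in range(K):
    A[i, i], A[i, (i + 1) % K] = 0.8, 0.2

hmm = GaussianHMM(n_components=K, covariance_type="full", init_params="")
hmm.startprob_ = np.full(K, 1.0 / K)
hmm.transmat_ = A
hmm.means_, hmm.covars_ = means, covs
hmm.fit(X)                            # refine with EM (Baum-Welch)
```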

5
Q

It is common practice in machine learning to partition data sets into three portions for analysis.
One of these portions, the validation set, is used for selecting hyperparameters. Give the
hyperparameters for the following models:
i) logistic regression
ii) HMM.

A

logistic regression - Regularisation parameter (the weighting of the penalty term).

HMM - Number of states.
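
A minimal sketch of validation-set selection for both hyperparameters, assuming scikit-learn (where the regularisation hyperparameter appears as the inverse strength C) and hmmlearn; the candidate grids and placeholder data are illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(3)
Xtr, ytr = rng.normal(size=(200, 5)), rng.integers(0, 2, 200)    # placeholder train split
Xval, yval = rng.normal(size=(80, 5)), rng.integers(0, 2, 80)    # placeholder validation split
Seq_tr, Seq_val = rng.normal(size=(300, 2)), rng.normal(size=(100, 2))

# Logistic regression: pick the regularisation strength by validation accuracy
best_C = max([0.01, 0.1, 1.0, 10.0],
             key=lambda C: LogisticRegression(C=C, max_iter=1000)
             .fit(Xtr, ytr).score(Xval, yval))

# HMM: pick the number of states by validation log-likelihood
best_K = max([2, 3, 4, 5],
             key=lambda K: GaussianHMM(n_components=K).fit(Seq_tr).score(Seq_val))

print(best_C, best_K)
```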
