18. Markov Models 2 Flashcards

1
Q

What is likelihood?

A

A measure of how well a sample fits a statistical model. It is often confused with probability, but likelihood treats the data as fixed and lets the model parameters vary.
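
As a quick illustration (a minimal sketch with made-up coin-flip data), the likelihood keeps the sample fixed and varies the parameter:

```python
from math import comb

def binomial_likelihood(p, heads, flips):
    """Likelihood of heads-probability p given a fixed sample."""
    return comb(flips, heads) * p**heads * (1 - p)**(flips - heads)

# The data stay fixed (7 heads in 10 flips); the parameter p varies.
l_fair   = binomial_likelihood(0.5, 7, 10)
l_biased = binomial_likelihood(0.7, 7, 10)
assert l_biased > l_fair  # p = 0.7 fits this sample better than p = 0.5
```

Unlike a probability distribution, the likelihood viewed as a function of p need not sum to 1.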

2
Q

What is the Naïve Bayes model?

A

A classification model assuming conditional independence between observed features given the class.
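
A minimal sketch of the posterior computation, using made-up spam-filter numbers (the priors and conditional tables below are assumptions for illustration, not from the cards):

```python
# P(C) and P(O_i | C) for a toy two-class, word-presence model.
priors = {"spam": 0.4, "ham": 0.6}
cond = {
    "spam": {"offer": 0.7, "meeting": 0.1},
    "ham":  {"offer": 0.2, "meeting": 0.6},
}

def posterior(observed_words):
    """P(C | O_1..O_n) proportional to P(C) * product of P(O_i | C),
    by the conditional-independence assumption."""
    scores = {}
    for c in priors:
        score = priors[c]
        for w in observed_words:
            score *= cond[c][w]
        scores[c] = score
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

print(posterior(["offer"]))  # "spam" comes out more probable here
```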

3
Q

In Naïve Bayes, what are the class and evidence variables?

A

The class C is the query variable, and O_1…O_n are the evidence variables.

4
Q

What are the four types of inference in temporal models?

A

Filtering, prediction, smoothing, and most likely explanation.

5
Q

What does filtering compute?

A

P(s_t | o_0:t), the belief state at the current time given observations so far.
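
One update of the filtering recursion, sketched for a two-state model (the transition matrix T, emission matrix E, and initial belief are illustrative assumptions):

```python
T = [[0.7, 0.3],   # T[i][j] = P(s_t = j | s_{t-1} = i)
     [0.4, 0.6]]
E = [[0.9, 0.1],   # E[j][o] = P(o_t = o | s_t = j)
     [0.2, 0.8]]

def filter_step(belief, obs):
    """Predict through T, weight by the emission of obs, then normalize."""
    predicted = [sum(belief[i] * T[i][j] for i in range(2)) for j in range(2)]
    unnorm = [predicted[j] * E[j][obs] for j in range(2)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

belief = [0.5, 0.5]            # prior over the two states
for obs in [0, 0, 1]:          # incorporate observations one at a time
    belief = filter_step(belief, obs)
print(belief)                  # P(s_t | o_0:t), the current belief state
```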

6
Q

What does prediction compute?

A

P(s_t' | o_0:t) where t' > t, predicting future states.
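
A sketch of prediction: start from a filtered belief and push it forward through the transition model alone, since no evidence exists for future steps (T and the starting belief are illustrative assumptions):

```python
T = [[0.7, 0.3],   # T[i][j] = P(s_{t+1} = j | s_t = i)
     [0.4, 0.6]]

def predict(belief, steps):
    """Apply the transition matrix `steps` times with no new evidence."""
    for _ in range(steps):
        belief = [sum(belief[i] * T[i][j] for i in range(2)) for j in range(2)]
    return belief

print(predict([0.9, 0.1], 1))   # one step ahead
print(predict([0.9, 0.1], 50))  # far ahead: approaches the stationary distribution
```

With no observations to sharpen it, the predicted distribution flattens toward the chain's stationary distribution as t' grows.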

7
Q

What does smoothing compute?

A

P(s_t' | o_0:t) where t' < t, inferring past states given all evidence.
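
A forward-backward sketch of smoothing: the smoothed posterior at a past step t' is proportional to alpha[t'] * beta[t'] (all model numbers below are illustrative assumptions):

```python
T = [[0.7, 0.3], [0.4, 0.6]]   # transitions
E = [[0.9, 0.1], [0.2, 0.8]]   # emissions
pi = [0.5, 0.5]                # initial distribution
obs = [0, 1, 0]
N, L = 2, len(obs)

# Forward pass: alpha[t][i] = P(o_0:t, s_t = i)
alpha = [[pi[j] * E[j][obs[0]] for j in range(N)]]
for t in range(1, L):
    alpha.append([sum(alpha[-1][i] * T[i][j] for i in range(N)) * E[j][obs[t]]
                  for j in range(N)])

# Backward pass: beta[t][i] = probability of the remaining observations given s_t = i
beta = [[1.0] * N]
for t in range(L - 2, -1, -1):
    beta.insert(0, [sum(T[i][j] * E[j][obs[t + 1]] * beta[0][j]
                        for j in range(N)) for i in range(N)])

def smooth(tp):
    """P(s_t' | o_0:t) for a past step tp: normalized alpha * beta."""
    unnorm = [alpha[tp][i] * beta[tp][i] for i in range(N)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

print(smooth(1))  # belief about the middle state, using ALL three observations
```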

8
Q

What is the most likely explanation?

A

Finding the single most probable sequence of hidden states given the observations, i.e. the state sequence maximizing P(s_0:t | o_0:t).

9
Q

What is Recursive Bayesian Estimation?

A

An algorithm that updates the belief state over time by alternating a prediction step (using transition probabilities) and an update step (using emission probabilities).

10
Q

What are the three fundamental HMM problems?

A

Likelihood (Forward Algorithm), Decoding (Viterbi Algorithm), and Learning (Baum-Welch Algorithm).

11
Q

What does the Forward Algorithm solve?

A

Computes the likelihood P(O|M) of the observed sequence given the HMM.
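
A sketch of the forward recursion, with dynamic programming over states (the model parameters here are illustrative assumptions):

```python
T = [[0.7, 0.3], [0.4, 0.6]]   # transitions
E = [[0.9, 0.1], [0.2, 0.8]]   # emissions
pi = [0.5, 0.5]                # initial distribution
N = 2

def forward_likelihood(obs):
    """P(O | M): sums over all state paths in O(T N^2) instead of O(N^T)."""
    alpha = [pi[j] * E[j][obs[0]] for j in range(N)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * T[i][j] for i in range(N)) * E[j][o]
                 for j in range(N)]
    return sum(alpha)

print(forward_likelihood([0, 0, 1]))
```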

12
Q

What is the complexity of the Forward Algorithm?

A

O(TN^2), where T is the sequence length and N is the number of states.

13
Q

What does the Viterbi Algorithm solve?

A

Finds the most probable sequence of hidden states that generated the observations.
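
A sketch of Viterbi: the same shape as the forward recursion but with max in place of sum, plus backpointers to recover the path (parameters are illustrative assumptions):

```python
T = [[0.7, 0.3], [0.4, 0.6]]   # transitions
E = [[0.9, 0.1], [0.2, 0.8]]   # emissions
pi = [0.5, 0.5]                # initial distribution
N = 2

def viterbi(obs):
    """Most probable hidden-state sequence for the observation sequence."""
    delta = [pi[j] * E[j][obs[0]] for j in range(N)]
    backptrs = []
    for o in obs[1:]:
        ptr, new = [], []
        for j in range(N):
            best = max(range(N), key=lambda i: delta[i] * T[i][j])
            new.append(delta[best] * T[best][j] * E[j][o])
            ptr.append(best)
        delta = new
        backptrs.append(ptr)
    # Backtrack from the best final state.
    path = [max(range(N), key=lambda j: delta[j])]
    for ptr in reversed(backptrs):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi([0, 0, 1]))
```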

14
Q

What is the complexity of the Viterbi Algorithm?

A

O(TN^2), same as the Forward Algorithm.

15
Q

What does the Baum-Welch Algorithm do?

A

Learns the HMM parameters (transition and emission probabilities) from data.
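
One EM iteration sketched end to end for a toy 2-state, 2-symbol model (all starting numbers are illustrative assumptions): the E-step computes state and transition posteriors via forward-backward, and the M-step turns the expected counts into new parameters.

```python
N, M = 2, 2                     # number of states, observation symbols
pi = [0.5, 0.5]
T = [[0.7, 0.3], [0.4, 0.6]]
E = [[0.9, 0.1], [0.2, 0.8]]
obs = [0, 0, 1, 0, 1, 1]
L = len(obs)

def forward():
    a = [[pi[j] * E[j][obs[0]] for j in range(N)]]
    for t in range(1, L):
        a.append([sum(a[-1][i] * T[i][j] for i in range(N)) * E[j][obs[t]]
                  for j in range(N)])
    return a

def backward():
    b = [[1.0] * N]
    for t in range(L - 2, -1, -1):
        b.insert(0, [sum(T[i][j] * E[j][obs[t + 1]] * b[0][j]
                         for j in range(N)) for i in range(N)])
    return b

a, b = forward(), backward()
Z = sum(a[-1])                                   # P(O | M) under the old model

# E-step: gamma[t][i] = P(s_t = i | O); xi[t][i][j] = P(s_t = i, s_{t+1} = j | O)
gamma = [[a[t][i] * b[t][i] / Z for i in range(N)] for t in range(L)]
xi = [[[a[t][i] * T[i][j] * E[j][obs[t + 1]] * b[t + 1][j] / Z
        for j in range(N)] for i in range(N)] for t in range(L - 1)]

# M-step: re-estimate parameters from expected counts.
T = [[sum(xi[t][i][j] for t in range(L - 1)) /
      sum(gamma[t][i] for t in range(L - 1)) for j in range(N)]
     for i in range(N)]
E = [[sum(gamma[t][i] for t in range(L) if obs[t] == k) /
      sum(gamma[t][i] for t in range(L)) for k in range(M)]
     for i in range(N)]
pi = gamma[0]

print(sum(forward()[-1]))   # P(O | M) under the new model; EM never decreases it
```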

16
Q

What family of algorithms does Baum-Welch belong to?

A

Expectation-Maximization (EM) algorithms.

17
Q

What is a limitation of the Baum-Welch algorithm?

A

It may converge to a local maximum of the likelihood rather than the global one, and it can overfit the training data.