18. Markov Models 2 Flashcards
What is likelihood?
The probability of the observed data under a statistical model, P(O|M), viewed as a function of the model; often confused with probability, which fixes the model and varies the data.
What is the Naïve Bayes model?
A classification model assuming conditional independence between observed features given the class.
In Naïve Bayes, what are the class and evidence variables?
The class C is the query variable, and O1…On are the evidence variables.
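The posterior over the class follows from the independence assumption: P(C | O1…On) ∝ P(C) Π_i P(Oi | C). A minimal sketch of this computation, using a hypothetical spam/ham example with made-up prior and likelihood values:

```python
# Naive Bayes inference sketch. The classes, words, and all
# probability values below are illustrative assumptions.
priors = {"spam": 0.4, "ham": 0.6}              # P(C)
likelihoods = {                                 # P(O_i | C)
    "spam": {"offer": 0.8, "meeting": 0.1},
    "ham":  {"offer": 0.2, "meeting": 0.7},
}

def naive_bayes_posterior(observed_words):
    """Return P(C | evidence) for each class, normalised."""
    scores = {}
    for c, prior in priors.items():
        score = prior
        for w in observed_words:
            score *= likelihoods[c][w]          # conditional independence
        scores[c] = score
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

post = naive_bayes_posterior(["offer"])
```

Observing "offer" shifts belief toward spam because that word is far more likely under the spam class.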
What are the four types of inference in temporal models?
Filtering, prediction, smoothing, and most likely explanation.
What does filtering compute?
P(s_t | o_0:t), the belief state at the current time given observations so far.
What does prediction compute?
P(s_t' | o_0:t) where t' > t, predicting future states.
What does smoothing compute?
P(s_t' | o_0:t) where t' < t, inferring past states given all evidence.
What is the most likely explanation?
Finding argmax over state sequences of P(s_0:t | o_0:t), the single sequence of states that best explains the observations.
What is Recursive Bayesian Estimation?
An algorithm that updates the belief state over time in two steps: predict the next state with the transition probabilities, then update with the emission probability of the new observation.
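The predict-then-update cycle can be sketched for a toy 2-state HMM; all transition and emission values here are illustrative assumptions, not from the flashcards:

```python
# Recursive Bayesian estimation (filtering) sketch for a 2-state HMM.
T = [[0.7, 0.3], [0.3, 0.7]]   # T[i][j] = P(s_t = j | s_{t-1} = i), assumed
E = [[0.9, 0.1], [0.2, 0.8]]   # E[j][o] = P(o | s_t = j), assumed

def filter_step(belief, obs):
    # Predict: push the belief through the transition model.
    predicted = [sum(belief[i] * T[i][j] for i in range(2)) for j in range(2)]
    # Update: weight by the emission probability, then normalise.
    updated = [predicted[j] * E[j][obs] for j in range(2)]
    z = sum(updated)
    return [u / z for u in updated]

belief = [0.5, 0.5]            # uniform initial belief
for o in [0, 0, 1]:
    belief = filter_step(belief, o)
```

After two observations of symbol 0 and one of symbol 1, the belief has shifted toward state 1, since state 1 emits symbol 1 with high probability.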
What are the three fundamental HMM problems?
Likelihood (Forward Algorithm), Decoding (Viterbi Algorithm), and Learning (Baum-Welch Algorithm).
What does the Forward Algorithm solve?
Computes the likelihood P(O|M) of the observed sequence given the HMM.
What is the complexity of the Forward Algorithm?
O(TN^2), where T is the sequence length and N is the number of states.
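The O(TN^2) cost comes from summing over N predecessor states for each of N states at each of T steps. A sketch of the recursion alpha_t(j) = P(o_1..o_t, s_t = j), with all HMM parameters assumed for illustration:

```python
# Forward Algorithm sketch for a 2-state, 2-symbol HMM (toy parameters).
pi = [0.6, 0.4]                # initial state distribution, assumed
T = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities, assumed
E = [[0.5, 0.5], [0.1, 0.9]]   # E[state][obs] emission probabilities, assumed

def forward(observations):
    # Initialise alpha with the first observation.
    alpha = [pi[j] * E[j][observations[0]] for j in range(2)]
    # Each step sums over all predecessor states: O(T N^2) overall.
    for o in observations[1:]:
        alpha = [E[j][o] * sum(alpha[i] * T[i][j] for i in range(2))
                 for j in range(2)]
    return sum(alpha)          # P(O | M)

likelihood = forward([0, 1, 1])
```

Summing (rather than maximising) over predecessors is what distinguishes this from Viterbi.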
What does the Viterbi Algorithm solve?
Finds the most probable sequence of hidden states that generated the observations.
What is the complexity of the Viterbi Algorithm?
O(TN^2), same as the Forward Algorithm.
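Viterbi has the same recurrence shape as the Forward Algorithm, but replaces the sum over predecessors with a max and keeps backpointers to recover the path. A sketch on the same kind of toy HMM (all parameters assumed):

```python
# Viterbi Algorithm sketch for a 2-state, 2-symbol HMM (toy parameters).
pi = [0.6, 0.4]                # initial state distribution, assumed
T = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities, assumed
E = [[0.5, 0.5], [0.1, 0.9]]   # E[state][obs] emission probabilities, assumed

def viterbi(observations):
    delta = [pi[j] * E[j][observations[0]] for j in range(2)]
    back = []
    for o in observations[1:]:
        # Max over predecessors instead of sum; remember the argmax.
        step = [max((delta[i] * T[i][j], i) for i in range(2))
                for j in range(2)]
        back.append([s[1] for s in step])
        delta = [E[j][o] * step[j][0] for j in range(2)]
    # Backtrack from the best final state.
    state = max(range(2), key=lambda j: delta[j])
    path = [state]
    for ptrs in reversed(back):
        state = ptrs[state]
        path.append(state)
    return list(reversed(path))

best_path = viterbi([0, 1, 1])
```

The returned path is the single most probable hidden-state sequence, which need not match the per-step most probable states from filtering.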
What does the Baum-Welch Algorithm do?
Learns the HMM parameters (transition and emission probabilities) from data.
What family of algorithms does Baum-Welch belong to?
Expectation-Maximization (EM) algorithms.
What is a limitation of the Baum-Welch algorithm?
It may converge to local maxima and can overfit data.
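One EM iteration can be sketched concretely: the E-step runs forward and backward passes to get expected state occupancies (gamma) and transitions (xi), and the M-step re-estimates the transition matrix from those expectations. All parameters and the observation sequence below are illustrative assumptions; a real run repeats until convergence.

```python
# Single Baum-Welch (EM) iteration sketch for a 2-state, 2-symbol HMM.
pi = [0.6, 0.4]                # assumed initial distribution
T  = [[0.7, 0.3], [0.4, 0.6]]  # assumed transition probabilities
E  = [[0.5, 0.5], [0.1, 0.9]]  # assumed emission probabilities
obs = [0, 1, 1, 0]             # assumed observation sequence
N, L = 2, len(obs)

# E-step: forward (alpha) and backward (beta) passes.
alpha = [[0.0] * N for _ in range(L)]
beta  = [[1.0] * N for _ in range(L)]
for j in range(N):
    alpha[0][j] = pi[j] * E[j][obs[0]]
for t in range(1, L):
    for j in range(N):
        alpha[t][j] = E[j][obs[t]] * sum(alpha[t-1][i] * T[i][j]
                                         for i in range(N))
for t in range(L - 2, -1, -1):
    for i in range(N):
        beta[t][i] = sum(T[i][j] * E[j][obs[t+1]] * beta[t+1][j]
                         for j in range(N))
Z = sum(alpha[L-1])            # P(O | current model)

# gamma[t][i] = P(s_t = i | O); xi[t][i][j] = P(s_t = i, s_{t+1} = j | O)
gamma = [[alpha[t][i] * beta[t][i] / Z for i in range(N)] for t in range(L)]
xi = [[[alpha[t][i] * T[i][j] * E[j][obs[t+1]] * beta[t+1][j] / Z
        for j in range(N)] for i in range(N)] for t in range(L - 1)]

# M-step: re-estimate transitions from expected counts.
new_T = [[sum(xi[t][i][j] for t in range(L - 1)) /
          sum(gamma[t][i] for t in range(L - 1))
          for j in range(N)] for i in range(N)]
```

Because the M-step only maximises expected log-likelihood under the current model, repeated iterations climb to a local (not necessarily global) maximum, which is the limitation noted above.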