Naive Bayes Flashcards

1
Q

Compare Naive Bayes to other methods (2)

A
  • Training is very fast
  • Naive Bayes computes its parameters in closed form by counting
2
Q

What’s the Naive Bayes assumption?

A

Features x are assumed to be conditionally independent given the label y.
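
In symbols (a standard statement of the assumption, writing x = (x_1, …, x_M) for the feature vector):

    % Naive Bayes assumption: the class-conditional distribution
    % factorizes over the M features.
    p(\mathbf{x} \mid y) = \prod_{m=1}^{M} p(x_m \mid y)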
3
Q

What’s wrong with the Naive Bayes assumption?

A

The features might not be independent!

4
Q

What’s the Naive Bayes recipe for closed-form MLE?

A

  • Write the log-likelihood of the training data under the model
  • Take partial derivatives and set them to zero, respecting the sum-to-one constraints
  • Solve: the parameters come out in closed form as normalized counts (see the worked example below)
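
As a worked example (my illustration, assuming the Bernoulli model with binary features x_m and K classes), the recipe yields:

    % Closed-form MLE for Bernoulli Naive Bayes: setting the gradient of
    % the log-likelihood to zero under sum-to-one constraints gives
    % normalized counts.
    \hat{\pi}_k = \frac{N_k}{N}
    \qquad
    \hat{\theta}_{m,k} = \frac{\sum_{i : y^{(i)} = k} x_m^{(i)}}{N_k}

where N is the number of training examples and N_k is the number with label k. Training really is just counting.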
5
Q

What does maximizing likelihood accomplish?

A
  • There is only a finite amount of probability mass (i.e. the sum-to-one constraint)
  • MLE tries to allocate as much probability mass as possible to the things we have observed…
  • …at the expense of the things we have not observed (concrete illustration below)
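
A concrete illustration (my example, not from the deck), using the MLE for a categorical distribution over K outcomes:

    % MLE from counts: if outcome k never appears in training, then
    % N_k = 0 and the MLE assigns it probability exactly 0, so all the
    % mass goes to observed outcomes.
    \hat{\theta}_k = \frac{N_k}{N}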
6
Q

What are the different Naive Bayes models and what kind of features do they correspond to?

A

  • Bernoulli Naive Bayes: binary features
  • Multinomial Naive Bayes: count features (e.g. word counts)
  • Gaussian Naive Bayes: continuous features
(see the sketch below)
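
A minimal sketch matching each variant to its feature type, using scikit-learn’s implementations (the synthetic data and variable names are mine):

    import numpy as np
    from sklearn.naive_bayes import BernoulliNB, GaussianNB, MultinomialNB

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=100)              # binary labels
    X_binary = rng.integers(0, 2, size=(100, 5))  # binary features
    X_counts = rng.integers(0, 10, size=(100, 5)) # count features
    X_real = rng.normal(size=(100, 5))            # continuous features

    BernoulliNB().fit(X_binary, y)    # Bernoulli model for binary features
    MultinomialNB().fit(X_counts, y)  # Multinomial model for count features
    GaussianNB().fit(X_real, y)       # Gaussian model for continuous features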
7
Q

What does Naive Bayes do?

A

It provides a framework for generative modeling.

8
Q

What’s the overall process for Naive Bayes? (3)

A
  1. Choose p(xm | y) appropriate to the data (e.g. Bernoulli for binary features, Gaussian for continuous features)
  2. Train by MLE or MAP
  3. Classify by maximizing the posterior (see the sketch below)
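
A minimal end-to-end sketch of those three steps for continuous features (Gaussian class-conditionals trained by MLE; all function and variable names are mine):

    import numpy as np

    def fit_gaussian_nb(X, y):
        # Step 2: closed-form MLE: class priors plus per-class,
        # per-feature means and variances.
        classes = np.unique(y)
        priors = np.array([np.mean(y == k) for k in classes])
        means = np.array([X[y == k].mean(axis=0) for k in classes])
        variances = np.array([X[y == k].var(axis=0) + 1e-9 for k in classes])
        return classes, priors, means, variances

    def predict(X, classes, priors, means, variances):
        # Step 3: classify by maximizing the log-posterior
        # log p(y=k) + sum_m log N(x_m; mu_km, var_km).
        log_post = np.log(priors) + np.stack([
            -0.5 * (np.log(2 * np.pi * v) + (X - m) ** 2 / v).sum(axis=1)
            for m, v in zip(means, variances)
        ], axis=1)
        return classes[np.argmax(log_post, axis=1)]

    # Usage: params = fit_gaussian_nb(X_train, y_train)
    #        y_hat = predict(X_test, *params)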
9
Q

What kind of model is logistic regression?

A

Discriminative classifier

10
Q

Describe a discriminative classifier (3)

A

– Example: Logistic Regression
– Directly model the conditional: p(y|x)
– Learning maximizes the conditional likelihood (see the sketch below)
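
A minimal sketch of that contrast (my illustration): logistic regression parameterizes p(y=1 | x) = sigmoid(w·x) directly and fits w by gradient ascent on the conditional log-likelihood.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_logistic_regression(X, y, lr=0.1, steps=1000):
        # Maximize the conditional log-likelihood
        # sum_i log p(y_i | x_i; w) by gradient ascent.
        w = np.zeros(X.shape[1])
        for _ in range(steps):
            p = sigmoid(X @ w)                # p(y=1 | x) per example
            w += lr * X.T @ (y - p) / len(y)  # gradient of the conditional LL
        return w

Note that no p(x) is modeled anywhere; only the conditional p(y | x) is.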

11
Q

Describe generative classifiers (4)

A

– Example: Naive Bayes
– Model the joint distribution: p(x, y)
– Learning maximizes the joint likelihood
– Classify by choosing the y with the highest posterior p(y|x)
12
Q

Compare Naive Bayes and logistic regression in terms of

  • error
  • model efficiency
  • assumptions
  • number of examples needed to estimate the parameters
A
  • If model assumptions are correct: Naive Bayes is a more efficient learner (requires fewer samples) than Logistic Regression
  • If model assumptions are incorrect: Logistic Regression has lower asymptotic error, and does better than Naïve Bayes
  • Naïve Bayes makes stronger assumptions about the data but needs fewer examples to estimate the parameters
13
Q

What do we know about the features in Naive Bayes?

A

Features x are assumed to be conditionally independent given y (i.e. the Naive Bayes assumption).

14
Q

What does the Naive Bayes assumption say, in words?

A

Features x are assumed to be conditionally independent given y.

15
Q

Compare Naive Bayes and logistic regression w.r.t. assumptions made about the features

A

Naive Bayes assumes the features are conditionally independent given y; logistic regression makes no conditional-independence assumption about the features.
16
Q

Compare Naive Bayes and logistic regression w.r.t. learning (MAP estimation of parameters)

A

  • Naive Bayes: put a prior on the parameters (e.g. a Beta/Dirichlet prior on the class-conditional distributions); the MAP estimate is still closed form: counts plus pseudocounts from the prior (i.e. smoothing)
  • Logistic Regression: put a Gaussian prior on the weights (equivalent to L2 regularization); there is no closed form, so the MAP estimate is found by iterative optimization
17
Q

Compare Naive Bayes and logistic regression w.r.t. learning (parameter estimation)

A

  • Naive Bayes: the MLE is available in closed form; just count and normalize, so training is very fast
  • Logistic Regression: no closed-form solution; parameters are estimated by iterative optimization (e.g. gradient descent) of the conditional likelihood