B06 Naive Bayes Flashcards

1
Q

What is Naive Bayes in the context of Machine Learning?

A

A machine learning approach that quantifies the probability of events and how those probabilities change in light of additional information.

2
Q

What is probabilistic learning?

A

Probabilistic learning covers a set of learning methods that use the observed probabilities of the features in labeled data to predict the most likely class for unlabeled data.

3
Q

The Naïve Bayes algorithm is part of a family of approaches known as:

A

Probabilistic or Bayesian learners.

4
Q

The ________ of an event is the probability that it DID NOT happen.

A

Complement. Because events cannot simultaneously happen and not happen, events are mutually exclusive with their complements: P(¬A) = 1 − P(A).
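The complement rule can be sketched numerically (the 0.3 rain probability below is a made-up illustration):

```python
# Complement of an event: P(not A) = 1 - P(A).
p_rain = 0.3            # assumed probability that it rains (illustrative)
p_no_rain = 1 - p_rain  # complement: probability that it does not rain
print(p_no_rain)        # 0.7
```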

5
Q

Joint Probability

A

The joint probability of two events is the probability that both occur, written P(A ∩ B). For independent events it factors into the product of the individual probabilities: P(A ∩ B) = P(A) × P(B).
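Joint probability of independent events can be sketched with a coin and a die (a minimal illustration, not tied to any dataset):

```python
# Joint probability of two independent events:
# P(A and B) = P(A) * P(B).
p_heads = 0.5   # fair coin lands heads
p_six = 1 / 6   # fair die shows a six
p_both = p_heads * p_six  # probability of heads AND a six
print(round(p_both, 4))
```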
6
Q

Conditional Probability (Bayes Theorem)

For dependent events A and B , the probability of event A,
given that event B occurred is described with a conditional
probability:

A

P(A | B) = P(A ∩ B) / P(B), which by Bayes' theorem can be rewritten as P(A | B) = P(B | A) × P(A) / P(B).
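Bayes' theorem can be worked through with made-up spam-filter numbers (all probabilities below are assumptions for illustration):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
# Here A = "message is spam", B = "message contains the word 'free'".
p_spam = 0.2               # prior: P(spam), assumed
p_word_given_spam = 0.6    # P("free" | spam), assumed
p_word_given_ham = 0.05    # P("free" | not spam), assumed

# Total probability of seeing "free" in any message (law of total probability).
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Posterior: probability a message is spam given it contains "free".
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 3))  # 0.75
```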
7
Q

What is NAIVE Bayes?

A

Naïve Bayes is a simple application of Bayes’ theorem to
classification, with the “naïve” assumption that all features
in the dataset are equally important and independent
within a class (Class Conditional independence).

In other words, events are assumed to be independent so
long as they are conditioned on the same class value.
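The class-conditional independence assumption can be sketched as a tiny classifier that multiplies per-feature likelihoods with the class prior (the toy messages below are invented for illustration):

```python
from collections import Counter, defaultdict

# Invented toy dataset: (word set, label) pairs.
data = [
    ({"free", "win"}, "spam"),
    ({"free", "call"}, "spam"),
    ({"meeting", "call"}, "ham"),
    ({"meeting", "notes"}, "ham"),
]
class_counts = Counter(label for _, label in data)
word_counts = defaultdict(Counter)
for feats, label in data:
    word_counts[label].update(feats)

def predict(features):
    scores = {}
    for label, n in class_counts.items():
        score = n / len(data)  # prior P(class)
        for w in features:
            # "Naive" step: multiply P(word | class) as if words were
            # independent within the class; +1 is Laplace smoothing.
            score *= (word_counts[label][w] + 1) / (n + 2)
        scores[label] = score
    return max(scores, key=scores.get)

print(predict({"free", "win"}))  # spam
print(predict({"meeting"}))      # ham
```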
8
Q

What is Additive Smoothing?

A

Given observations from a multinomial distribution over a number of trials, additive or Laplace smoothing adds a pseudocount (alpha) to the number of observed and unobserved cases, adjusting the expected probabilities so as to avoid zero-frequency problems.
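Additive smoothing can be sketched on invented category counts; note the never-observed category still gets a nonzero probability:

```python
# Additive (Laplace) smoothing for a categorical estimate:
# add a pseudocount alpha to every category's count.
counts = {"red": 3, "blue": 1, "green": 0}  # "green" never observed (toy data)
alpha = 1
total = sum(counts.values())
k = len(counts)  # number of categories

smoothed = {c: (n + alpha) / (total + alpha * k) for c, n in counts.items()}
print(smoothed["green"] > 0)  # True: nonzero despite zero observations
```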

9
Q

Strengths of Naive Bayes?

A
  • Simple, fast and effective.
  • Works well with noisy and missing data.
  • Useful for both continuous and discrete features.
  • Works well with both small and large amounts of training data.
  • Estimated probabilities are easy to obtain.
10
Q

Weaknesses of Naive Bayes?

A
  • Assumes that features are equally important and independent.
  • Not suitable for datasets with many continuous features.
  • Correlated features degrade performance.
  • Estimated probabilities may seem less reliable than predicted classes.
11
Q

Applications of Naive Bayes?

A
  • Junk email filtering (spam).
  • Network intrusion or anomaly detection.
  • Medical condition diagnosis.