Evidence and Probabilities Flashcards

1
Q

What is:

Explicit Evidence Combination with Bayesian Probability?

A

A modeling approach that treats each feature value as a piece of evidence for or against membership in a class, and combines those pieces of evidence explicitly, using Bayes' rule, to estimate the probability p(C=c | E) that an example belongs to class c given its evidence E.

2
Q

What is:

Bayes’ Rule For Classification?

A
Bayes' Rule for classification is an equation whose left-hand side is the value we want to calculate: the probability that the example takes on the value of class c given the evidence E, p(C=c | E).
The right-hand side is the likelihood of the evidence given the class, p(E | C=c), multiplied by the prior probability p(C=c), divided by the likelihood of the evidence overall, p(E):

p(C=c | E) = p(E | C=c) · p(C=c) / p(E)

where, by the law of total probability, p(E) = p(E | C=c)·p(C=c) + p(E | C≠c)·p(C≠c).
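Bayes' rule can be checked with a short worked calculation. The numbers below are illustrative, not from the cards; the point is that p(E) comes from the law of total probability:

```python
# Worked example of Bayes' rule for a binary class (all numbers hypothetical).
p_c = 0.3              # prior p(C=c)
p_not_c = 1 - p_c      # p(C != c)
p_e_given_c = 0.8      # likelihood p(E | C=c)
p_e_given_not_c = 0.2  # likelihood p(E | C != c)

# Law of total probability: p(E) = p(E|C=c)p(C=c) + p(E|C!=c)p(C!=c)
p_e = p_e_given_c * p_c + p_e_given_not_c * p_not_c

# Posterior: p(C=c | E) = p(E|C=c) * p(C=c) / p(E)
posterior = p_e_given_c * p_c / p_e
print(round(posterior, 3))
```

Observing evidence E that is four times more likely under class c raises the probability of c from the prior 0.3 to roughly 0.63.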
3
Q

What is:

Naive Bayes?

A

Naive Bayes is a classifier that applies Bayes' rule with the simplifying ("naive") assumption that the features are conditionally independent given the class. The likelihood of the evidence then factors into a product of per-feature likelihoods:

p(E | C=c) = p(e1 | C=c) · p(e2 | C=c) · … · p(ek | C=c)

The classifier predicts the class with the highest resulting probability p(C=c | E).
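As a concrete illustration, here is a minimal sketch of a count-based Naive Bayes classifier in Python. The training data, feature values, and use of Laplace smoothing are all illustrative assumptions, not part of the cards:

```python
from collections import Counter

# Tiny illustrative training set: ((feature values), class label)
train = [
    (("sunny", "hot"), "no"),
    (("sunny", "mild"), "no"),
    (("rainy", "mild"), "yes"),
    (("rainy", "cool"), "yes"),
]

class_counts = Counter(label for _, label in train)
# feature_counts[(i, value, label)] = how often feature i took `value` in class `label`
feature_counts = Counter()
for features, label in train:
    for i, value in enumerate(features):
        feature_counts[(i, value, label)] += 1

def posterior_scores(features):
    """Unnormalized p(C=c) * prod_i p(e_i | C=c) for each class c (naive assumption)."""
    scores = {}
    total = sum(class_counts.values())
    for label, count in class_counts.items():
        score = count / total  # prior p(C=c)
        for i, value in enumerate(features):
            # Laplace smoothing so an unseen value doesn't zero out the product
            score *= (feature_counts[(i, value, label)] + 1) / (count + 2)
        scores[label] = score
    return scores

scores = posterior_scores(("rainy", "mild"))
prediction = max(scores, key=scores.get)
print(prediction)
```

Note that the scores are unnormalized: dividing each by their sum would recover p(C=c | E), but the division by p(E) is the same for every class, so it can be skipped when only the predicted class is needed.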
4
Q

What is:
Conditional Independence?
And why is it important to Naive Bayes?

A

Two events A and B are conditionally independent given C if p(A, B | C) = p(A | C) · p(B | C): once the class is known, knowing one feature tells you nothing further about the other.
Naive Bayes assumes every feature is conditionally independent of the others given the class. This lets the evidence likelihood p(E | C=c) be estimated as a product of simple per-feature probabilities, which requires far fewer parameters than estimating the full joint distribution of the features.
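A quick bit of arithmetic shows why the conditional independence assumption matters in practice: it collapses the number of parameters needed to model p(E | C=c). The counts below assume k binary features (an illustrative setup):

```python
# Parameters needed to model p(E | C=c) for k binary features, per class.

def params_full_joint(k):
    # Full joint: one probability per feature combination, minus one (they sum to 1)
    return 2 ** k - 1

def params_naive(k):
    # Naive assumption: just one p(e_i | C=c) per feature
    return k

for k in (5, 10, 20):
    print(k, params_full_joint(k), params_naive(k))
```

With 20 binary features, the full joint needs over a million probabilities per class, while the naive factorization needs only 20, each easy to estimate by counting.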
5
Q

What are:

Advantages and Disadvantages of Naive Bayes?

A

Very simple classifier

Efficient in terms of storage space and computation time

Performs well in many real-world applications

Inaccurate class-probability estimates: in practice the features are not completely independent, so the calculation double-counts evidence; nevertheless the estimates usually err in the right direction for ranking and classification

Incremental learner: an induction technique that updates its model one instance at a time, so new training data can be folded in without rebuilding the model
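The incremental-learning point above can be sketched in a few lines: because a Naive Bayes model is just counts, folding in one labeled instance is a constant-time update per feature. The feature values and labels here are illustrative:

```python
from collections import Counter

class_counts = Counter()
feature_counts = Counter()  # keyed by (feature_index, value, label)

def update(features, label):
    """Fold one labeled instance into the model -- no retraining pass needed."""
    class_counts[label] += 1
    for i, value in enumerate(features):
        feature_counts[(i, value, label)] += 1

# Instances arrive one at a time; the model is always up to date.
update(("sunny", "hot"), "no")
update(("rainy", "mild"), "yes")
print(class_counts["yes"])
```

Contrast this with learners such as decision trees, which typically must re-run induction over the whole dataset when new examples arrive.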
