Probability Theory Flashcards

1
Q

Conditional Probability

A

The conditional probability of A given B is defined as the joint probability of A and B divided by the unconditional probability of B. Conditioning restricts the sample space to the outcomes in which B occurs.

Formula: P(A|B) = P(A ∩ B) / P(B), defined when P(B) > 0.
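A minimal sketch of the definition by direct counting, using a hypothetical example (two fair dice; the events A and B below are illustrative choices, not from the card):

```python
from fractions import Fraction

# Uniform sample space: all ordered outcomes of rolling two fair dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    # Probability of an event (given as a predicate) under the uniform measure.
    hits = [o for o in outcomes if event(o)]
    return Fraction(len(hits), len(outcomes))

# Hypothetical events: A = "the sum is 8", B = "the first die is even".
A = lambda o: o[0] + o[1] == 8
B = lambda o: o[0] % 2 == 0

p_joint = prob(lambda o: A(o) and B(o))  # P(A and B) = 3/36
p_B = prob(B)                            # P(B) = 18/36
p_A_given_B = p_joint / p_B              # P(A|B) = P(A and B) / P(B)
print(p_A_given_B)                       # 1/6
```

Note how conditioning shrinks the sample space: of the 18 outcomes where the first die is even, exactly 3 sum to 8.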

2
Q

Bayes’ Rule

A

Bayes’ rule expresses a conditional probability P(A|B) in terms of the “opposite” conditional probability P(B|A), together with the marginal probabilities P(A) and P(B). Bayes’ rule is therefore helpful whenever we are interested in P(A|B) but P(B|A) is easier to obtain.

Formula: P(A|B) = P(B|A) · P(A) / P(B), where P(B) = P(B|A) · P(A) + P(B|¬A) · P(¬A) by the law of total probability.
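A worked sketch of Bayes’ rule on the classic diagnostic-test setting; all the numbers below are hypothetical, chosen for illustration only:

```python
from fractions import Fraction

# Hypothetical test characteristics (illustrative numbers, not real data).
p_disease = Fraction(1, 100)            # P(A): prior probability of disease
p_pos_given_disease = Fraction(9, 10)   # P(B|A): sensitivity of the test
p_pos_given_healthy = Fraction(5, 100)  # P(B|not A): false-positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # 2/13, roughly 15%
```

This is exactly the situation the card describes: we know P(positive | disease) from the test's specification, but what we actually want is P(disease | positive), and the low prior keeps the posterior surprisingly small.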

3
Q

Log Likelihood

A

Log likelihood is just the logarithm of the likelihood. It is useful, for example, when calculating the joint likelihood of many independent events, because repeated multiplication of small probabilities suffers from numerical underflow; summing logarithms avoids this. As the logarithm is monotonically increasing, maximizing the likelihood is equivalent to maximizing the log likelihood.

Formula: ℓ(θ) = log L(θ) = Σᵢ log p(xᵢ | θ), for independent observations x₁, …, xₙ.
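A short sketch of the underflow problem the card mentions, using a hypothetical dataset of 2000 biased coin flips (the data and the parameter 0.6 are made up for illustration):

```python
import math

# Hypothetical data: 2000 flips of a coin with p(heads) = 0.6.
p = 0.6
flips = [1] * 1200 + [0] * 800  # 1200 heads, 800 tails

# Naive likelihood: a product of 2000 factors below 1 underflows to 0.0.
likelihood = 1.0
for x in flips:
    likelihood *= p if x == 1 else 1 - p
print(likelihood)       # 0.0 -- the true value is ~1e-585, below float range

# Log likelihood: the sum of logs stays comfortably representable.
log_likelihood = sum(math.log(p if x == 1 else 1 - p) for x in flips)
print(log_likelihood)   # a finite negative number, around -1346
```

Because log is monotone, comparing or maximizing `log_likelihood` over candidate values of `p` gives the same answer as maximizing `likelihood` would, without ever touching underflow-prone products.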
