Probability Theory Flashcards
Conditional Probability
The conditional probability of A given B is defined as the joint probability of A and B divided by the probability of B. Conditioning on B effectively restricts the sample space to the outcomes in B.
Formula: P(A|B) = P(A ∩ B) / P(B)
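A minimal Python sketch of this definition. The fair-die setup, with A = "the roll is a 6" and B = "the roll is even", is an illustrative assumption, not part of the card.

```python
from fractions import Fraction

# Illustrative example (assumed): rolling a fair six-sided die.
# A = "the roll is a 6", B = "the roll is even".
sample_space = {1, 2, 3, 4, 5, 6}
A = {6}
B = {2, 4, 6}

def prob(event, space=sample_space):
    """Probability of an event under a uniform distribution on the sample space."""
    return Fraction(len(event & space), len(space))

# P(A|B) = P(A ∩ B) / P(B); conditioning on B shrinks the sample space to B.
p_a_given_b = prob(A & B) / prob(B)
print(p_a_given_b)  # 1/3
```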
Bayes’ Rule
Bayes’ rule expresses a conditional probability P(A|B) in terms of the “reverse” conditional probability P(B|A) and the marginal probabilities P(A) and P(B). It is therefore helpful whenever we are interested in P(A|B) but only know P(B|A) and the marginals.
Formula: P(A|B) = P(B|A) · P(A) / P(B)
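A short Python sketch of Bayes’ rule. The diagnostic-test scenario and all of the numbers below are assumptions chosen purely for illustration; P(B) is obtained via the law of total probability.

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Assumed illustrative numbers: a diagnostic test.
# A = "person has the condition", B = "test is positive".
p_a = 0.01                  # prior P(A)
p_b_given_a = 0.95          # sensitivity P(B|A)
p_b_given_not_a = 0.05      # false-positive rate P(B|not A)

# P(B) via the law of total probability.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

print(bayes(p_b_given_a, p_a, p_b))  # ≈ 0.161
```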
Log Likelihood
Log likelihood is simply the logarithm of the likelihood. It is useful, for example, when computing the joint likelihood of a set of independent observations: the product of many small probabilities suffers from numerical underflow, whereas the logarithm turns the product into a numerically stable sum. Because the logarithm is monotonically increasing, maximizing the likelihood is equivalent to maximizing the log likelihood.
Formula: ℓ(θ) = log L(θ); for independent observations x_1, …, x_n: log L(θ) = Σ_i log p(x_i | θ)
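A small Python sketch of why the log is used in practice. The setup (1000 independent events, each with probability 0.01) is an assumption for illustration: the direct product underflows to zero, while the sum of logs stays finite.

```python
import math

# Assumed illustrative setup: 1000 independent events, each with probability 0.01.
probs = [0.01] * 1000

# Direct product of small probabilities underflows to 0.0 in double precision.
joint = math.prod(probs)
print(joint)      # 0.0

# Summing log probabilities keeps the computation numerically stable.
log_joint = sum(math.log(p) for p in probs)
print(log_joint)  # ≈ -4605.17 (= 1000 * log(0.01))
```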