7. Advanced Probability Concepts Flashcards
What is joint probability?
The probability of two events occurring together: P(A ∩ B).
What is marginal probability?
The probability of one variable's outcomes irrespective of the others, obtained by summing (or integrating) the joint distribution over the other variables: P(X = x) = Σ_y P(X = x, Y = y).
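For illustration (not part of the original cards), a minimal Python sketch with a made-up 2×3 joint table for hypothetical variables X and Y, showing the joint distribution and the marginals obtained by summing over the other variable:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y) for X in {0, 1}, Y in {0, 1, 2}.
joint = np.array([[0.10, 0.20, 0.10],
                  [0.25, 0.15, 0.20]])

# Marginal of X: sum the joint over Y (axis 1); marginal of Y: sum over X (axis 0).
p_x = joint.sum(axis=1)   # -> [0.40, 0.60]
p_y = joint.sum(axis=0)   # -> [0.35, 0.35, 0.30]
print(p_x, p_y)
```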
What is a moment generating function?
The function M_X(t) = E[e^(tX)]; its nth derivative evaluated at t = 0 equals the nth raw moment of X, from which the mean, variance, and higher moments follow.
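A small sketch of the idea, assuming sympy is available and using the standard normal, whose MGF exp(t²/2) is well known; differentiating at t = 0 recovers the raw moments:

```python
import sympy as sp

t = sp.symbols('t', real=True)
# Known MGF of a standard normal random variable: M(t) = E[e^(tX)] = exp(t^2 / 2).
mgf = sp.exp(t**2 / 2)

mean = sp.diff(mgf, t, 1).subs(t, 0)           # 1st derivative at 0 -> E[X] = 0
second_moment = sp.diff(mgf, t, 2).subs(t, 0)  # 2nd derivative at 0 -> E[X^2] = 1
print(mean, second_moment)
```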
What is the law of total expectation?
E[X] = E[E[X | Y]] (the tower property), valid whenever E[X] exists.
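A quick Monte Carlo check of the tower property under an assumed toy model (Y ~ Bernoulli(0.3), X | Y ~ Normal(2Y, 1)), not taken from the cards:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical two-stage model: Y ~ Bernoulli(0.3), X | Y = y ~ Normal(mean=2*y, sd=1).
y = rng.binomial(1, 0.3, size=n)
x = rng.normal(loc=2 * y, scale=1.0)

# Direct estimate of E[X] vs. the tower rule E[E[X|Y]] = 0.7*0 + 0.3*2 = 0.6.
print(x.mean())               # ~0.6
print(0.7 * 0.0 + 0.3 * 2.0)  # 0.6 exactly
```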
What is Chebyshev’s inequality?
For any k > 0, P(|X − μ| ≥ kσ) ≤ 1/k², i.e. at least 1 − 1/k² of the probability mass lies within k standard deviations of the mean, regardless of the distribution.
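A numerical sanity check of the bound, using an arbitrary skewed (exponential) sample; the constants here are illustrative, not from the cards:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=1_000_000)   # a skewed, non-normal sample

mu, sigma, k = x.mean(), x.std(), 2.0
within = np.mean(np.abs(x - mu) < k * sigma)

# Chebyshev guarantees at least 1 - 1/k^2 = 0.75 of the mass within 2 sigma;
# the empirical fraction should be >= 0.75 (here it is much higher).
print(within, 1 - 1 / k**2)
```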
What is entropy in information theory?
A measure of the uncertainty in a probability distribution: H(X) = −Σ_x p(x) log p(x).
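A minimal sketch of Shannon entropy for a discrete distribution; the helper name `entropy` and the example distributions are just for illustration:

```python
import numpy as np

def entropy(p, base=2):
    """Shannon entropy of a discrete distribution p (in bits by default)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # 0 * log 0 is treated as 0
    return -np.sum(p * np.log(p)) / np.log(base)

print(entropy([0.5, 0.5]))           # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))           # ~0.469 bits: a biased coin is more predictable
```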
What is mutual information?
A measure of the dependence between two random variables: I(X; Y) = H(X) + H(Y) − H(X, Y); it equals zero if and only if X and Y are independent.
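A sketch computing I(X; Y) from an assumed small joint table, using the equivalent form I(X; Y) = Σ p(x, y) log[p(x, y) / (p(x) p(y))]:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y).
joint = np.array([[0.30, 0.10],
                  [0.10, 0.50]])
p_x = joint.sum(axis=1, keepdims=True)
p_y = joint.sum(axis=0, keepdims=True)

mi = np.sum(joint * np.log2(joint / (p_x * p_y)))   # in bits; 0 iff X and Y independent
print(mi)
```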
What is the Kullback-Leibler (KL) divergence?
A measure of how one distribution P diverges from a reference distribution Q: D_KL(P ‖ Q) = Σ_x P(x) log[P(x) / Q(x)]. It is non-negative and not symmetric in P and Q.
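A sketch of discrete KL divergence; the distributions p and q are arbitrary examples, and the two print lines show the asymmetry:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions with q > 0 wherever p > 0 (in nats)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.4, 0.1]
q = [1/3, 1/3, 1/3]
print(kl_divergence(p, q))   # >= 0
print(kl_divergence(q, p))   # generally a different value: KL is not symmetric
```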
What is Jensen’s inequality?
For a convex function g, E[g(X)] ≥ g(E[X]); for a concave function the inequality reverses.
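A quick numerical illustration with the convex function g(x) = x² and a normal sample (the parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=1.0, scale=2.0, size=1_000_000)

# g(x) = x^2 is convex, so E[g(X)] >= g(E[X]): here ~5.0 (= 1 + 4) vs. ~1.0.
print(np.mean(x**2), np.mean(x)**2)
```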
What is covariance?
A measure of how two random variables vary together: Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y].
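A short sketch comparing the definition above with numpy's `np.cov`, on synthetic data where Y is constructed to co-vary with X:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
y = 2.0 * x + rng.normal(size=100_000)   # y co-varies positively with x

# Sample covariance two ways: directly from the definition, and via numpy's covariance matrix.
cov_manual = np.mean((x - x.mean()) * (y - y.mean()))
print(cov_manual, np.cov(x, y)[0, 1])    # both ~2.0
```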