Probability Flashcards
Define a sample space
For an experiment, the sample space Ω is the set of all possible outcomes
Define an event
A subset of the sample space
Define equally likely for a finite sample space
Outcomes are equally likely if P(A) = |A|/|Ω| for every event A ⊆ Ω
Define a permutation
An ordering of distinguishable objects. For n objects, there are n! different permutations.
Define the binomial coefficient
The number of orderings of m objects of one type and n - m of another is the binomial coefficient nCm, which is equal to n!/[(m!)(n-m)!]
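A quick numerical check of the formula (a minimal sketch in Python; math.comb computes the same quantity directly):

```python
import math

n, m = 10, 4

# Binomial coefficient from the factorial formula: n!/(m!(n-m)!)
from_factorials = math.factorial(n) // (math.factorial(m) * math.factorial(n - m))

# math.comb computes the same quantity directly
assert from_factorials == math.comb(n, m) == 210
print(from_factorials)  # 210
```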
Define a probability space
A triple (Ω, F, P), where Ω is the sample space, F is a collection of events (subsets of Ω), and P is a function from F to the real numbers
Give the axioms of F
F1: Ω ∈ F
F2: If A ∈ F, then A complement ∈ F
F3: If Ai ∈ F for all i ≥ 1, then the union of the Ai over all i ≥ 1 is in F
Give the axioms of P
P1: P(A) ≥ 0 for A in F
P2: P(Ω) = 1
P3: If Ai ∈ F for i ≥ 1 and Ai and Aj are disjoint whenever i ≠ j, then the probability of the union of the events equals the sum of the probabilities of the individual events.
Define conditional probability
The conditional probability of A given B is P(A|B) = P(A∩B)/P(B), provided P(B) > 0
Define a partition
{B_1, B_2, …} partitions Ω if Ω = ∪(B_i) and (B_i)∩(B_j) = ∅ whenever i ≠ j.
Give the law of total probability
For a partition B1, B2, … of Ω with P(Bi) > 0 for each i: P(A) = P(A|Bi)·P(Bi), summed over all i.
Give Bayes’ Theorem
For a partition B1, B2, … of Ω with P(Bi) > 0 for each i, and an event A with P(A) > 0:
P(Bk|A) = P(A|Bk)·P(Bk)/P(A)
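A worked example of the two cards above, with hypothetical disease-testing numbers chosen purely for illustration:

```python
# Hypothetical numbers: a test with 99% sensitivity, 95% specificity,
# applied to a population with 1% disease prevalence.
p_disease = 0.01                      # P(B1), with B2 = "no disease"
p_pos_given_disease = 0.99            # P(A|B1)
p_pos_given_healthy = 0.05            # P(A|B2) = 1 - specificity

# Law of total probability: P(A) = sum of P(A|Bi)P(Bi) over the partition
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' theorem: P(B1|A) = P(A|B1)P(B1)/P(A)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ~0.167
```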
Define independence of two events
A and B are independent if P(A∩B) = P(A)P(B)
Define independence of a family of events
A family {Ai, i∈I} of events is independent if P(∩(i∈J) Ai) = the product of P(Ai) over i∈J, for all finite subsets J of I
Define a discrete random variable
A discrete random variable X on a probability space is a function X:Ω->R where:
Im(X) = {X(ω) : ω∈Ω} is a countable set
For each x in R, {ω∈Ω : X(ω)=x} is in F
Define the probability mass function
The function p_X: R -> [0,1] given by p_X(x) = P(X = x)
Define the Bernoulli distribution
X~Ber(p) if P(X=1) = p and P(X=0) = 1 - p
Define the Binomial distribution
X~Bin(n,p) if P(X=k) = nCk.p^k.(1-p)^(n-k) for all k from 0 to n.
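A minimal sketch of this pmf in Python (the helper binomial_pmf is illustrative, not a library function); the probabilities over k = 0, …, n should sum to 1:

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 5, 0.3
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
assert abs(sum(pmf) - 1.0) < 1e-12   # pmf sums to 1
print(pmf)
```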
Define the Geometric distribution
X~Geom(p) if P(X=k) = p(1-p)^(k-1) for k ≥ 1.
Define the Uniform distribution
On a finite set {x1,x2,x3,…,xn}, P(X=xi) = 1/n for i = 1, 2, 3, …, n
Define the Poisson distribution
For λ > 0, X~Poisson(λ) if P(X=k) = (λ^k·e^(−λ))/k! for k = 0, 1, 2, …
Define expectation of a discrete variable
E[X] = x·P(X=x), summed over all x in Im(X), provided the sum converges absolutely
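For a concrete instance, the expectation of a fair six-sided die computed straight from this definition:

```python
# Fair die: Im(X) = {1,...,6}, each with probability 1/6
pmf = {x: 1/6 for x in range(1, 7)}

# E[X] = sum of x * P(X = x) over Im(X)
expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 3.5
```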
Define independence of discrete random variables
Two discrete random variables X and Y are independent if for all x and y in the reals, the events {X=x} and {Y=y} are independent
Define joint distributions
Suppose X and Y are discrete random variables on (Ω,F,P). The joint probability mass function of X and Y is p_(X,Y): R^2 -> [0,1] given by p_(X,Y)(x,y) = P(X = x, Y = y)
Give Stirling’s approximation
n! ∼ (2π)^(1/2)·n^(n + 1/2)·e^(−n) as n → ∞
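A quick sanity check of the approximation; the ratio of n! to the approximation should approach 1 as n grows:

```python
import math

def stirling(n):
    # (2*pi)^(1/2) * n^(n + 1/2) * e^(-n)
    return math.sqrt(2 * math.pi) * n**(n + 0.5) * math.exp(-n)

for n in (5, 10, 20):
    print(n, math.factorial(n) / stirling(n))  # ratio -> 1
```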
Define marginal distribution
For X, Y with joint probability mass function p_(X,Y), the marginal distribution of X is p_X(x) = the sum of p_(X,Y)(x,y) over all values of y
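A minimal sketch with a made-up joint pmf on {0,1} x {0,1}, recovering the marginal of X by summing over y:

```python
# Hypothetical joint pmf p_XY(x, y) on {0,1} x {0,1}
joint = {(0, 0): 0.1, (0, 1): 0.3,
         (1, 0): 0.2, (1, 1): 0.4}

# Marginal of X: sum the joint pmf over all values of y
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0) + p

print(marginal_x)  # {0: 0.4, 1: 0.6} (up to float rounding)
```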
Define the conditional probability mass function
For an event B with P(B) > 0, the conditional probability mass function of X given B is p_(X|B)(x) = P({X=x}∩B)/P(B)
Define conditional expectation
The conditional expectation of X given B is E[X|B] = x·p_(X|B)(x), summed over all x in Im(X)
Give the law of total probability for expectations
For a partition {Bi, i∈I} of Ω with P(Bi) > 0: E[X] = E[X|Bi]·P(Bi), summed over i in I, defined whenever E[X] exists
Give the formula for linearity of expectation
E[X+Y] = E[X] + E[Y]
Define variance
The variance of a discrete random variable X is Var(X) = E[(X-E[X])^2], provided both of these expectations exist.
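As a check, the definitional form E[(X-E[X])^2] and the equivalent computational form E[X^2] - E[X]^2 agree on a fair die (a small Python sketch):

```python
pmf = {x: 1/6 for x in range(1, 7)}  # fair die

mean = sum(x * p for x, p in pmf.items())

# Definitional form: E[(X - E[X])^2]
var_def = sum((x - mean)**2 * p for x, p in pmf.items())

# Computational form: E[X^2] - E[X]^2
var_alt = sum(x**2 * p for x, p in pmf.items()) - mean**2

assert abs(var_def - var_alt) < 1e-12
print(var_def)  # 35/12 ≈ 2.9167
```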
Define standard deviation
The square root of the variance: sd(X) = Var(X)^(1/2)
Define covariance
For discrete random variables X and Y, cov(X,Y) = E[XY] - E[X]E[Y]
Define the probability generating function
For X taking values in {0, 1, 2, …} with p_k = P(X=k): G_X(s) = p_k·s^k, summed for all k from zero to infinity. Equivalently, G_X(s) = E[s^X].
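A sketch building G_X(s) for a fair die and checking the standard facts G_X(1) = 1 and G_X'(1) = E[X] (here the derivative at 1 is computed directly as the sum of k·p_k):

```python
pmf = {k: 1/6 for k in range(1, 7)}  # fair die

def pgf(s):
    # G_X(s) = sum of p_k * s^k over k in Im(X)
    return sum(p * s**k for k, p in pmf.items())

def pgf_derivative_at_1():
    # G_X'(1) = sum of k * p_k, which equals E[X]
    return sum(k * p for k, p in pmf.items())

print(pgf(1))                 # ~1.0
print(pgf_derivative_at_1())  # ~3.5 = E[X]
```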
Define a random variable
A function X: Ω -> R such that {X ≤ x} := {ω ∈ Ω : X(ω) ≤ x} ∈ F for all x ∈ R
Define a cumulative distribution function
The cumulative distribution function of a random variable X is the function F: R -> [0,1] defined by F(x) = P(X ≤ x)
Define a continuous random variable X
A random variable whose cdf can be written as F(x) = the integral of f_X(u)du from -infinity to x, for some non-negative function f_X.
Define a probability density function of X
The function f_X: R -> R appearing as the integrand in the cdf of a continuous random variable
Define a continuous uniform distribution
X ~ Unif([a,b]) if X is a random variable with pdf f_X(x) = 1/(b-a) for a ≤ x ≤ b, and 0 otherwise
Define an exponential distribution
For λ > 0, X ~ Exp(λ) if X is a random variable with pdf f_X(x) = λe^(-λx) for x ≥ 0, and 0 otherwise
Define a normal distribution
For μ in the reals and σ^2 > 0, X ~ N(μ, σ^2) if X is a random variable with pdf f_X(x) = (2πσ^2)^(-1/2)·exp(-(x-μ)^2/(2σ^2))
Define the expectation of a continuous random variable
E[X] = the integral of x·f_X(x)dx from -infinity to infinity, provided the integral converges absolutely.
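A numerical check on Exp(λ), whose mean should be 1/λ (a sketch using a plain Riemann sum with a truncated tail; no external libraries assumed):

```python
import math

lam = 2.0  # rate of Exp(lambda); true mean is 1/lam = 0.5

def pdf(x):
    return lam * math.exp(-lam * x)

# E[X] = integral of x * f_X(x) dx; truncate at x = 20 and Riemann-sum
dx = 1e-4
expectation = sum(i * dx * pdf(i * dx) * dx for i in range(int(20 / dx)))
print(expectation)  # ~0.5
```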
Define the joint cdf of two random variables
F_XY(x,y) = P(X≤x,Y≤y)
Define jointly continuous
X and Y are jointly continuous if F_(X,Y)(x,y) = the double integral of f_(X,Y)(u,v) dv du, with u from -∞ to x and v from -∞ to y, for some non-negative function f_(X,Y)
Define independence of jointly continuous random variables
X and Y are independent if f_(X,Y)(x,y) = f_X(x)·f_Y(y) for all x and y
Define a random sample
A sequence X1, X2, …, Xn of independent, identically distributed random variables
Define the sample mean
The sum of the variables in the sample divided by the sample size: (X1 + X2 + … + Xn)/n
Give the weak law of large numbers
For i.i.d. X1, X2, … with mean μ, and any ε > 0: P(|(1/n)·(X1 + … + Xn) - μ| > ε) converges to 0 as n → ∞
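A simulation illustrating the statement for fair-die rolls, where μ = 3.5 (a sketch using Python's random module):

```python
import random

random.seed(0)
mu = 3.5  # mean of a fair die

for n in (10, 1_000, 100_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    sample_mean = sum(rolls) / n
    print(n, sample_mean, abs(sample_mean - mu))  # deviation shrinks with n
```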
Give Markov’s inequality
For a non-negative random variable X and any t > 0: P(X ≥ t) ≤ E[X]/t
Give Chebyshev’s inequality
For a random variable Z with finite variance and any t > 0: P(|Z - E[Z]| ≥ t) ≤ var(Z)/t^2
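An empirical check of both inequalities on X ~ Bin(20, 0.5), for which E[X] = 10 and Var(X) = 5 (a simulation-based sketch; the bounds are loose but hold):

```python
import random

random.seed(0)
trials = 100_000
# X ~ Bin(20, 0.5): E[X] = 10, Var(X) = 5
xs = [sum(random.random() < 0.5 for _ in range(20)) for _ in range(trials)]

# Markov with t = 14: P(X >= 14) <= E[X]/t = 10/14
print(sum(x >= 14 for x in xs) / trials, "<=", 10 / 14)

# Chebyshev with t = 4: P(|X - 10| >= 4) <= Var(X)/t^2 = 5/16
print(sum(abs(x - 10) >= 4 for x in xs) / trials, "<=", 5 / 16)
```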