Probability & Statistics Flashcards
Definition of a Sample Space
The sample space is the set of all possible outcomes for the experiment. It is denoted by S
Definition of an Event
An event is a subset of the sample space. The event occurs if the actual outcome is an element of this subset.
Definition of a Simple Event
An event is a simple event if it consists of a single element of the sample space S
Meaning of Disjoint events
We say two sets A and B are disjoint if they have no element in common, i.e., A ∩ B = ∅
De Morgan’s laws
(A ∪ B)^c = A^c ∩ B^c
(A ∩ B)^c = A^c ∪ B^c
Kolmogorov’s axioms for probability
a) For every event A we have P(A) ≥ 0,
b) P(S) = 1,
c) If A1, A2, …, An are pairwise disjoint events then
P(A1 ∪ A2 ∪ … ∪ An) = P(A1) + P(A2) + … + P(An)
Complement Rule for Probability
If A is an event then
P(A^c) = 1 − P(A)
Probability of an Empty Set
P(∅) = 0
Probability of an Event Upper Bound
If A is an event then P(A) ≤ 1
Probability of a Subset
If A and B are events and A ⊆ B then
P(A) ≤ P(B)
Probability of a Finite Event
The probability of a finite event is the sum of the probabilities of its simple events.
Inclusion-exclusion for two events
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Inclusion-exclusion for three events
P(A∪B∪C) = P(A)+P(B)+P(C)−P(A∩B)−P(A∩C)−P(B∩C)+P(A∩B∩C)
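As a quick sanity check, both inclusion-exclusion identities can be verified on a small finite sample space with equally likely outcomes (the events A, B, C below are illustrative choices, not from the deck):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die; events are subsets of S.
S = {1, 2, 3, 4, 5, 6}
A, B, C = {1, 2, 3}, {2, 3, 4}, {3, 4, 5}

def P(E):
    # Equally likely outcomes: P(E) = |E| / |S|.
    return Fraction(len(E), len(S))

# Two-event identity.
assert P(A | B) == P(A) + P(B) - P(A & B)

# Three-event identity.
lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(A & C) - P(B & C)
       + P(A & B & C))
assert lhs == rhs
```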
Ordered with replacement (repetition allowed)
n^r
Ordered without replacement (no repetition)
n!/(n − r)!
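Both ordered counts can be checked directly (n = 5, r = 3 is an arbitrary example):

```python
import math

n, r = 5, 3
# Ordered with replacement: n^r sequences.
assert n ** r == 125
# Ordered without replacement: n! / (n - r)! permutations,
# which math.perm computes directly.
assert math.factorial(n) // math.factorial(n - r) == math.perm(n, r) == 60
```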
Conditional Probability
If E1 and E2 are events and P(E1) ≠ 0, then the conditional probability of E2 given E1, usually denoted by P(E2|E1), is
P(E2|E1) = P(E1 ∩ E2) / P(E1)
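A small worked example of the formula, again on a fair die (the events are made-up for illustration):

```python
from fractions import Fraction

S = set(range(1, 7))   # a fair die
E1 = {2, 4, 6}         # the roll is even
E2 = {2}               # the roll is 2

def P(E):
    return Fraction(len(E), len(S))

assert P(E1) != 0
# P(E2 | E1) = P(E1 ∩ E2) / P(E1) = (1/6) / (1/2) = 1/3
cond = P(E1 & E2) / P(E1)
assert cond == Fraction(1, 3)
```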
Unordered without replacement (no repetition)
nCr = n!/(r!(n − r)!)
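The binomial coefficient can be checked the same way (n = 5, r = 3 is an arbitrary example):

```python
import math

n, r = 5, 3
# Unordered without replacement: n choose r = n! / (r! (n - r)!),
# which math.comb computes directly.
assert math.comb(n, r) == math.factorial(n) // (math.factorial(r) * math.factorial(n - r))
assert math.comb(n, r) == 10
```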
Definition of Independence
We say that the events E1 and E2 are (pairwise) independent if
P(E1 ∩ E2) = P(E1)P(E2)
When are three events E1, E2, and E3 called pairwise independent
P(E1 ∩ E2) = P(E1)P(E2),
P(E1 ∩ E3) = P(E1)P(E3),
P(E2 ∩ E3) = P(E2)P(E3).
When are three events E1, E2, and E3 called mutually independent
They are pairwise independent and, in addition,
P(E1 ∩ E2 ∩ E3) = P(E1)P(E2)P(E3)
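The distinction matters: pairwise independence does not imply mutual independence. A standard counterexample is two fair coin flips with E3 = "both flips match" (the setup below is illustrative, not from the deck):

```python
from fractions import Fraction

# Two fair coin flips; each of the four outcomes has probability 1/4.
S = {"HH", "HT", "TH", "TT"}
E1 = {"HH", "HT"}  # first flip is heads
E2 = {"HH", "TH"}  # second flip is heads
E3 = {"HH", "TT"}  # both flips match

def P(E):
    return Fraction(len(E), len(S))

# Every pair factorises, so the events are pairwise independent...
assert P(E1 & E2) == P(E1) * P(E2)
assert P(E1 & E3) == P(E1) * P(E3)
assert P(E2 & E3) == P(E2) * P(E3)
# ...but the triple product fails, so they are not mutually independent.
assert P(E1 & E2 & E3) != P(E1) * P(E2) * P(E3)
```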
When are two events E1 and E2 said to be conditionally independent given an event E3
P(E1 ∩ E2|E3) = P(E1|E3)P(E2|E3)
Definition of Random Variable
A random variable is a function from the sample space S to the real numbers R
Definition of Discrete Random Variables
A random variable X is discrete if the set of values that X takes
is either finite or countably infinite.
Definition of Probability Mass Functions
The probability mass function (p.m.f.) of a discrete random
variable X is the function which given input x has output P(X = x)
Sum of Probabilities for a Discrete Random Variable
The outputs of the p.m.f. must sum to 1 over all values x that X takes.
Definition of Expectation
If X is a discrete random variable which takes values x1, x2, x3, . . ., then the expectation of X (or the expected value of X) is defined by
E(X) = x1P(X = x1) + x2P(X = x2) + x3P(X = x3) + · · · .
Bound on the Expectation of a Random Variable
If every value taken by X lies in [m, M], i.e. m ≤ X ≤ M, then
m ≤ E(X) ≤ M
Expectation of a function
E( f(X) ) = f(x1)P(X = x1) + f(x2)P(X = x2) + f(x3)P(X = x3) + · · ·
Definition of Moments
The nth moment of the random variable X is the expectation E(X^n)
Definition of Variance
Var(X) = [x1 − E(X)]^2 P(X = x1) + [x2 − E(X)]^2 P(X = x2)
+ [x3 − E(X)]^2 P(X = x3) + …
Variance formula
Var(X) = E(X^2) − [E(X)]^2
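The definition and the shortcut formula Var(X) = E(X^2) − [E(X)]^2 can be checked against each other on a fair die (an illustrative p.m.f., computed exactly with fractions):

```python
from fractions import Fraction

# A fair die: values 1..6, each with probability 1/6.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

E = sum(x * p for x, p in pmf.items())              # E(X)
E2 = sum(x**2 * p for x, p in pmf.items())          # second moment E(X^2)
var_def = sum((x - E) ** 2 * p for x, p in pmf.items())  # variance by definition

assert E == Fraction(7, 2)
# The shortcut formula agrees with the definition.
assert var_def == E2 - E**2 == Fraction(35, 12)
```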
Linear function of expectation
E(aX + b) = aE(X) + b
Linear function of variance
Var(aX + b) = a^2Var(X)
What is a Bernoulli(p) distribution:
It is the distribution of a random variable X that takes only the values 0 and 1, with P(X = 1) = p and P(X = 0) = 1 − p
Bernoulli distribution Expectation and Variance
E(X) = p, Var(X) = p(1 − p)
What is Binomial distribution:
A discrete random variable X has the Binomial(n, p) distribution, denoted X ∼ Bin(n, p), if its p.m.f. is :
P(X = k) = nCk × p^k × (1 − p)^(n−k), for k = 0, 1, …, n
Binomial distribution Expectation and Variance
E(X) = np, Var(X) = np(1 − p)
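The binomial p.m.f., mean, and variance can be verified exactly for any particular n and p (n = 10, p = 3/10 below is an arbitrary example):

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(3, 10)
# Binomial(n, p) p.m.f. on k = 0, 1, ..., n.
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

assert sum(pmf.values()) == 1
mean = sum(k * q for k, q in pmf.items())
var = sum(k**2 * q for k, q in pmf.items()) - mean**2
assert mean == n * p            # E(X) = np
assert var == n * p * (1 - p)   # Var(X) = np(1 - p)
```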
What is Geometric distribution:
A discrete random variable X has the Geometric(p) distribution, denoted X ∼ Geom(p), if its p.m.f. is :
P(X = k) = p(1 − p)^(k−1), for k = 1, 2, 3, …
Geometric distribution Expectation and Variance
E(X) = 1/p
Var(X) = (1 − p)/p^2
What is Hypergeometric distribution
The hypergeometric distribution describes the number of successes X in l draws, without replacement, from a population of n items of which m are successes:
P(X = k) = mCk × (n−m)C(l−k) / nCl
Hypergeometric distribution Expectation and Variance
E(X) = l × (m/n)
Var(X) = l × (m/n) × ((n − m)/n) × ((n − l)/(n − 1))
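Using the deck's parameters (population n, successes m, draws l), the p.m.f., mean, and variance can be checked exactly; n = 20, m = 7, l = 5 below is an arbitrary example:

```python
from fractions import Fraction
from math import comb

n, m, l = 20, 7, 5
# Hypergeometric p.m.f.: k successes in l draws without replacement.
pmf = {k: Fraction(comb(m, k) * comb(n - m, l - k), comb(n, l))
       for k in range(0, min(m, l) + 1)}

assert sum(pmf.values()) == 1
mean = sum(k * q for k, q in pmf.items())
var = sum(k**2 * q for k, q in pmf.items()) - mean**2
assert mean == Fraction(l * m, n)                        # E(X) = l(m/n)
assert var == (Fraction(l * m, n) * Fraction(n - m, n)
               * Fraction(n - l, n - 1))                 # Var(X)
```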
What is Negative binomial distribution
The negative binomial distribution models the number of failures in a sequence of independent and identically distributed Bernoulli trials before a specified number of successes occurs
P(X = k) = (k+r−1)C(r−1) × p^r × (1 − p)^k, for k = 0, 1, 2, …
Negative Binomial distribution Expectation and Variance
E(X) = r(1 − p)/p
Var(X) = r(1 − p)/p^2
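As with the geometric distribution, a truncated sum verifies the negative binomial mean and variance numerically (p = 0.5, r = 3 is an arbitrary example):

```python
from math import comb

p, r = 0.5, 3
# Negative binomial p.m.f.: k failures before the r-th success.
# The tail beyond k = 400 is negligible for these parameters.
pmf = {k: comb(k + r - 1, r - 1) * p**r * (1 - p)**k for k in range(400)}

total = sum(pmf.values())
mean = sum(k * q for k, q in pmf.items())
var = sum(k**2 * q for k, q in pmf.items()) - mean**2

assert abs(total - 1) < 1e-12
assert abs(mean - r * (1 - p) / p) < 1e-9       # E(X) = r(1-p)/p
assert abs(var - r * (1 - p) / p**2) < 1e-9     # Var(X) = r(1-p)/p^2
```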
What is Uniform distribution
The uniform distribution is one in which all outcomes are equally likely. For the discrete uniform on {m, m+1, …, n+m}:
P(X = k) = 1/(n + 1) if m ≤ k ≤ n + m
(discrete) Uniform distribution Expectation and Variance
For a uniform distribution on {a, a+1, …, b}, write n = b − a and m = a. Then:
E(X) = m + n/2
Var(X) = n(n + 2)/12
What is Poisson distribution
Poisson distribution expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate.
P(X = k) = (λ^k / k!) × e^(−λ), for k = 0, 1, 2, …
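The Poisson p.m.f. sums to 1 and has E(X) = Var(X) = λ; a truncated numeric check (λ = 4 is an arbitrary example):

```python
import math

lam = 4.0
# Poisson(λ) p.m.f.; the tail beyond k = 100 is negligible for λ = 4.
pmf = {k: lam**k / math.factorial(k) * math.exp(-lam) for k in range(100)}

total = sum(pmf.values())
mean = sum(k * q for k, q in pmf.items())
var = sum(k**2 * q for k, q in pmf.items()) - mean**2

assert abs(total - 1) < 1e-12
assert abs(mean - lam) < 1e-9   # E(X) = λ
assert abs(var - lam) < 1e-9    # Var(X) = λ
```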