Probability Flashcards
Sample space and Events (Def)
The set of all possible outcomes in an experiment is called a SAMPLE SPACE (denoted S or Ω).
An EVENT is any subset of S
Categories of Sample Spaces
There are 3 categories of Sample Spaces:
- FINITE number of elements
- INFINITE COUNTABLE
- INFINITE UNCOUNTABLE
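For example (illustrative cases): the outcomes of a single die roll (finite), the number of coin flips until the first head (infinite countable), a waiting time taking any value in an interval of ℝ (infinite uncountable).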
Independent events (Def)
Two events E1 and E2 are independent if the outcome of E1 does not affect the outcome of E2, and vice versa
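Formally: P(E1 ∩ E2) = P(E1) P(E2)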
Multiplication principle
Suppose we have n independent events E_1, E_2, … , E_n. If event E_k has m_k possible outcomes (for k = 1, 2, … , n), then there are
m_1 * m_2 * … * m_n
possible ways for these events to occur
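For example (illustrative numbers): choosing one of 3 shirts and one of 4 pairs of trousers independently gives 3 * 4 = 12 possible outfits.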
k-Permutations w/o repetition
A way of selecting k objects from a list of n.
- The order of selection matters
- Each object can be selected only once
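Number of such selections:
P(n, k) = n! / (n-k)!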
k-Permutations w/ repetition
A way of selecting k objects from a list of n.
- The order of selection matters
- Each object can be selected more than once
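Number of such selections:
n^k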
Combinations w/o repetition
A way of selecting k objects from a list of n.
- The order of selection does NOT matter
- Each object can be selected only once
Aka n-choose-k
This is also the BINOMIAL COEFFICIENT
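Number of such selections:
C(n, k) = (n choose k) = n! / (k! (n-k)!)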
Combinations w/ repetition
A way of selecting k objects from a list of n.
- The order of selection does NOT matter
- Each object can be selected more than once
It’s aka the MULTISET COEFFICIENT
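Number of such selections:
(n multichoose k) = C(n+k-1, k) = (n+k-1)! / (k! (n-1)!)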
Combinations w/ repetition (ice cream example)
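A sketch of the usual example (the numbers are illustrative, not from the original card): choosing k = 3 scoops from n = 4 flavours, where flavours may repeat and the order of scoops does not matter, gives C(4+3-1, 3) = C(6, 3) = 20 possible ice creams.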
Definition of Probability
For a given experiment with Sample Space S, PROBABILITY is a real-valued function defined on the events of S:
P: events of S -> [0, 1]
For each subset E of S, the function P assigns a number P(E), such that P(E) ∈ [0, 1]
Axioms of Probability
There are 4 axioms:
(1) For any event E subset of S, 0 <= P(E) <= 1
(2) P(S) = 1
(3) For two disjoint events E and F (i.e. E ∩ F = ∅), P(E ∪ F) = P(E) + P(F)
(4) More generally, (3) extends to n mutually exclusive events: P(E_1 ∪ E_2 ∪ … ∪ E_n) = P(E_1) + P(E_2) + … + P(E_n)
Conditional probability formula
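For events A and B with P(B) > 0:
P(A|B) = P(A ∩ B) / P(B)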
Mutual independence (Def)
Events A, B and C are MUTUALLY INDEPENDENT if:
- P(A ∩ B ∩ C) = P(A)P(B)P(C)
and
- A, B and C are pairwise independent: P(A ∩ B) = P(A)P(B), P(A ∩ C) = P(A)P(C), P(B ∩ C) = P(B)P(C)
Law of Total Probability (Formula)
If {E_1, E_2, … , E_k} is a PARTITION of S, then for any event A:
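P(A) = Σ_(i=1..k) P(A ∩ E_i) = Σ_(i=1..k) P(A|E_i) P(E_i)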
Bayes’ rule (Formula)
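For a partition {E_1, … , E_k} of S and an event A with P(A) > 0:
P(E_i|A) = P(A|E_i) P(E_i) / P(A) = P(A|E_i) P(E_i) / Σ_(j=1..k) P(A|E_j) P(E_j)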
Odds (Formula)
Historically, the likelihood of an event B has been expressed as the ratio between the probability of B and the probability of its complement (non-B)
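odds(B) = P(B) / P(B^c) = P(B) / (1 - P(B))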
Odds w/ Bayes’ Theorem
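Posterior odds = likelihood ratio * prior odds:
P(B|A) / P(B^c|A) = [ P(A|B) / P(A|B^c) ] * [ P(B) / P(B^c) ]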
Random Variable (Def)
A RANDOM VARIABLE is a function
X: S -> ℝ
For each element s of S, X(s) is a real number in ℝ
Range space / Support (Def)
The RANGE SPACE (or Support) R_x of a random variable X is the set of all possible realisations of X
( X(s) for every s ∈ S )
Probability Mass Function (Def)
The PMF of a discrete random variable X is a function
f: R_x -> (0, 1]
Such that
f(x) = P(X=x) = p_X(x)
for each x ∈ R_x, with
(1) f(x) > 0 for each x ∈ R_x
(2) Σ_(x∈R_x) f(x) = 1
(3) P(X∈A) = Σ_(x∈A) f(x) for any event A ⊆ R_x
Expected value (Formula)
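For a discrete random variable X with PMF f:
E[X] = Σ_(x∈R_x) x f(x)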
Variance (Formula)
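Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2 = Σ_(x∈R_x) (x - E[X])^2 f(x)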
Cumulative Distribution Function (Formula)
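F(x) = P(X <= x) = Σ_(t∈R_x, t<=x) f(t)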
Bernoulli distribution (Def)
An experiment that can take two values, 1 (success) and 0 (failure), with
P(1) = θ,  P(0) = 1 - θ
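PMF: f(x) = θ^x (1-θ)^(1-x) for x ∈ {0, 1}
E[X] = θ,  Var(X) = θ(1-θ)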
Binomial distribution (Def)
An experiment repeated n times, with each repetition an independent Bernoulli(θ) trial; X counts the number of successes
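PMF: f(x) = C(n, x) θ^x (1-θ)^(n-x) for x = 0, 1, … , n
E[X] = nθ,  Var(X) = nθ(1-θ)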
Poisson distribution (Def)
Describes the number of events occurring within a given interval, with rate λ.
It is commonly used to describe count data
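PMF: f(x) = e^(-λ) λ^x / x! for x = 0, 1, 2, …
E[X] = Var(X) = λ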
Joint PMF (Formula)
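For two discrete random variables X and Y:
f(x, y) = P(X = x, Y = y)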
Joint PMF (Key properties)
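(1) f(x, y) >= 0 for all (x, y)
(2) Σ_x Σ_y f(x, y) = 1
(3) Marginals: f_X(x) = Σ_y f(x, y) and f_Y(y) = Σ_x f(x, y)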