Lec 3 Flashcards
Sample space
Event
Probability
Complement
Sample space: set of all possible outcomes of an experiment
Event: any set of outcomes (from sample space) of interest (Eg: a red card, card of diamonds)
Probability (of an event): the relative frequency of the set of outcomes (comprising the event) over an indefinitely large (infinite) # of repetitions of the experiment (trials)
Eg: In the long run, what proportion of the time will you expect a diamond? 0.25
Complement: of an event A, the set of outcomes in the sample space that are not in the event A (Aka A’, Ā, Ac)
Union of two events
Union of more than 2 events
Intersection of 2 events
Intersection of more than 2 events
Mutually exclusive events
Union of two events: it means A or B (shown as A U B)
Union of more than 2 events: A1 or A2 or A3 … or Ak
Intersection of 2 events: it means A and B, or the outcomes that belong to both A and B
(shown as A ∩ B, AB, A and B)
Intersection of more than 2 events: A1 and A2 and A3 … and Ak
Mutually exclusive events: A and B cannot occur at the same time
Exhaustive
Partition
Independence
Independence formula
Exhaustive: at least one of them must occur (Eg when you roll a die, one of {1,2,3,4,5,6} must occur)
Partition: a set of mutually exclusive and exhaustive events that together cover the whole sample space (image)
Independence: knowing one event happened doesn’t change the probability of the other event
Independence formula: P(AB) = P(A)P(B)
Formula for A U B
Formula for A U B, if A and B are mutually exclusive
Are mutually exclusive events independent?
What happens when we sum up events that are a partition of the sample space?
Formula for complementary
Formula for A U B: P(A U B) = P(A) + P(B) – P(AB)
Formula for A U B, if A and B are mutually exclusive: P(AUB) = P(A) + P(B)
Are mutually exclusive events independent? No (if one occurs, the other cannot, so knowing one changes the probability of the other)
What happens when we sum up events that are a partition of the sample space?
P(S) = P(A1) + P(A2) + … + P(Ak) = 1
Formula for complementary: P(A’) = 1 – P(A)
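A minimal Python sketch of the rules above, using a hypothetical 52-card deck as the sample space and counting outcomes to check the union and complement formulas:

```python
from fractions import Fraction

# A hypothetical 52-card deck, used only to illustrate the rules above.
suits = ["hearts", "diamonds", "clubs", "spades"]
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
deck = [(r, s) for s in suits for r in ranks]

def prob(event):
    # relative frequency of an event (a set of outcomes) in the sample space
    return Fraction(len(event), len(deck))

A = {c for c in deck if c[1] == "diamonds"}   # event: draw a diamond
B = {c for c in deck if c[0] == "K"}          # event: draw a king

# addition rule: P(A U B) = P(A) + P(B) - P(AB)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)   # 16/52
# complement rule: P(A') = 1 - P(A)
assert prob(set(deck) - A) == 1 - prob(A)               # 39/52
```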
Formulas:
The conditional probability of B given A
The conditional probability of B given A if A and B are independent
The conditional probability of A given B
General formula for P(AB) or 2 ways to express P(AB)
Formula to infer independence
Formulas:
The conditional probability of B given A: P(B|A) = P(AB)/P(A)
The conditional probability of B given A if A and B are independent: P(B|A) = P(A) P(B)/ P(A) = P(B)
The conditional probability of A given B: P(A|B) = P(AB)/P(B)
General formula for P(AB) or 2 ways to express P(AB): P(AB) = P(A) P(B|A) = P(A|B) P(B)
Independence: P(AB) = P(A)xP(B)
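A small Python sketch of the conditional-probability and multiplication rules, using two fair dice as an assumed sample space:

```python
from fractions import Fraction
from itertools import product

# Sample space for rolling two fair dice (an illustrative setup, not from the notes).
S = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(len(event), len(S))

A = {o for o in S if o[0] == 6}            # first die is a 6
B = {o for o in S if o[0] + o[1] >= 10}    # total is at least 10

# conditional probability: P(B|A) = P(AB) / P(A)
p_B_given_A = prob(A & B) / prob(A)        # (3/36) / (6/36) = 1/2
# multiplication rule: P(AB) = P(A) P(B|A)
assert prob(A & B) == prob(A) * p_B_given_A
```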
Bayes Theorem
How many formulas
Sensitivity
Specificity
High sensitivity
High specificity
SPIN
SNOUT
Bayes’ theorem: helps us determine P(A|B) when we know the probability of P(B|A), P(A), and P(B)
Bayes’ theorem formula: there’s 3 equivalent forms: P(A|B) = P(AB)/P(B) = P(B|A)P(A)/P(B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A’)P(A’)]
Sensitivity: if you have the disease, what is the prob of +ve test
Specificity: if you do not have the disease, what is the prob of -ve test
high sensitivity: if you have disease, you have high prob of a +ve result
If you have a -ve result on a test with HIGH sensitivity, there’s a high chance you DON’T have the disease (good for ruling out)
High specificity: if you do not have the disease, you have a high prob of a -ve result
So, a +ve result means that you are very likely to have the disease (good for ruling in)
(NOTE: In Epi, we want to know, given the test result, do we have covid/disease? We can determine this with SPIN and SNOUT)
SPIN: specificity = rule in
SNOUT: Sensitivity = rule out
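A short Python sketch of Bayes’ theorem applied to a diagnostic test; the prevalence, sensitivity, and specificity values are made-up numbers for illustration only:

```python
# Hypothetical numbers, chosen only to illustrate Bayes' theorem with a diagnostic test.
prevalence = 0.01     # P(disease)
sensitivity = 0.95    # P(+ test | disease)
specificity = 0.90    # P(- test | no disease)

# P(+ test) by the law of total probability
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes' theorem: P(disease | + test) = P(+ | disease) P(disease) / P(+)
p_disease_given_pos = sensitivity * prevalence / p_pos
print(round(p_disease_given_pos, 3))   # about 0.088 with these assumed numbers
```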
Random variable
Capital letters (X,Y)
Small letters (x,y)
Discrete random variable
Examples
Continuous random variable
Examples
Random variable: a function that assigns a real number to each point (outcome) in the sample space
Capital letters (X,Y) = random variables
Small letters (x,y) = actual values
Discrete random variable: it takes a countable set of values
Eg: # of heads from 4 coin tosses, # of particles from a radioactive source in 1min, # hospital admissions
Continuous random variable: not countable, usually measured (eg height, weight)
Probability mass function
Formula
Variable type
2 conditions for pmf
E
Probability mass function (pmf)
For a discrete random variable “X” in a sample space, the pmf gives the probability that X is EQUAL to a value “x”
(Denoted by small letter f(x))
Formula: f(x) = P(X = x)
Variable type: Only for discrete variables
2 conditions of pmf: pmf (a probability) is a value b/w 0 and 1; summation of all probabilities in the sample space = 1
The cumulative distribution function (CDF)
Formula
Cumulative distribution function: the probability that a random variable “X” has a value LESS THAN or EQUAL to “x”
Denoted by capital letter F(x)
Formula: F(x) = P (X ≤ x)
Variable type: discrete and continuous random variables
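A minimal Python sketch, assuming X = number of heads in two fair coin tosses, that checks the two pmf conditions and evaluates the CDF:

```python
from fractions import Fraction

# pmf of X = number of heads in two fair coin tosses (an assumed illustration)
f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# pmf conditions: every f(x) is between 0 and 1, and the probabilities sum to 1
assert all(0 <= p <= 1 for p in f.values()) and sum(f.values()) == 1

def F(x):
    # CDF: F(x) = P(X <= x), the running total of the pmf
    return sum(p for value, p in f.items() if value <= x)

print(f[1])   # pmf:  P(X = 1)  = 1/2
print(F(1))   # CDF:  P(X <= 1) = 3/4
```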
How to use pmf to find the expected value or mean for random variable X
How to use pmf to find the VARIANCE for random variable X (2 formulas)
Use pmf to find the expected value or mean for random variable X
(Eg: What is the expected value of a roll of a fair die? µ = E(X) = (1)(1/6) + (2)(1/6) + (3)(1/6) + (4)(1/6) + (5)(1/6) + (6)(1/6) = 3.5)
It’s the avg of #s 1 to 6
Use pmf to find the VARIANCE for random variable X (2 formulas): σ² = Var(X) = E[(X − µ)²] = Σ (x − µ)² f(x), or equivalently σ² = E(X²) − µ²
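A small Python sketch of the fair-die example above, computing the mean from the pmf and checking that the two variance formulas agree:

```python
from fractions import Fraction

# pmf of a fair six-sided die: f(x) = P(X = x) = 1/6 for x = 1..6
f = {x: Fraction(1, 6) for x in range(1, 7)}

# both pmf conditions hold: each f(x) is between 0 and 1, and they sum to 1
assert all(0 <= p <= 1 for p in f.values()) and sum(f.values()) == 1

# mean: E(X) = sum of x * f(x)
mu = sum(x * p for x, p in f.items())                      # 7/2 = 3.5

# variance, two equivalent formulas:
var1 = sum((x - mu) ** 2 * p for x, p in f.items())        # E[(X - mu)^2]
var2 = sum(x ** 2 * p for x, p in f.items()) - mu ** 2     # E(X^2) - mu^2
assert var1 == var2                                        # 35/12
```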
Factorial
Combination
formula
Factorial: n! = n × (n−1) × (n−2) × … × 2 × 1
Combination: The number of ways to choose “x” items from a set of n items
(eg how many ways to choose 12 jurors from a pool or set of 20 people)
Formula: nCx = n!/(x!(n−x)!)
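A quick Python check of the jury example, assuming the standard combination formula n!/(x!(n−x)!):

```python
from math import comb, factorial

# number of ways to choose 12 jurors from a pool of 20 people
print(factorial(20) // (factorial(12) * factorial(20 - 12)))  # 125970
print(comb(20, 12))                                           # same result, 125970
```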
Define binomial distribution
Variable type
Notation and explanation for binomial distribution
Formula
pmf for binomial distribution formula
The pattern on graphs - if p increases, what happens to the distribution
mean of binomial distribution
variance of binomial distribution
Binomial distribution: the distribution of the number of “SUCCESS” outcomes in an experiment that is repeated for “n” independent trials, where each trial ends in “SUCCESS” or “FAILURE”
Variables: discrete
Notation: X~ Bin(n,p)
The random variable “X” follows a binomial distribution with “n” experimental trials and probability of success “p” on each trial
pmf for binomial distribution formula: f(x) = P(X = x) = nCx p^x (1 − p)^(n − x), for x = 0, 1, …, n
Graphs: As probability goes up (p = 0.5 -> 0.75), it becomes more skewed to the left (the tail is on the left side)
mean of binomial distribution: E(X) = µ = np
variance of binomial distribution: σ² = np (1 – p)
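A minimal Python sketch of the binomial pmf, mean, and variance; n = 4 and p = 0.5 (heads in 4 fair coin tosses) are assumed example values:

```python
from math import comb

def binom_pmf(x, n, p):
    # f(x) = P(X = x) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example (assumed numbers): X ~ Bin(n=4, p=0.5), e.g. heads in 4 fair coin tosses
n, p = 4, 0.5
print([round(binom_pmf(x, n, p), 4) for x in range(n + 1)])  # [0.0625, 0.25, 0.375, 0.25, 0.0625]
print(n * p, n * p * (1 - p))                                # mean = 2.0, variance = 1.0
```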
Define poisson distribution
Variable type
pmf for random variable X formula
mean for poisson distribution
variance for poisson distribution
Poisson distribution: gives the probability of a given number of events happening over a specified period of time (or space), when events occur at a constant average rate λ
Variables: discrete
pmf for random variable X formula: f(x) = P(X = x) = e^(−λ) λ^x / x!, for x = 0, 1, 2, …
e ≈ 2.718 (Euler’s number)
mean for poisson distribution: E(X) = µ = λ
variance for poisson distribution: σ² = λ
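A minimal Python sketch of the Poisson pmf; the rate λ = 2 is an assumed example value:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # f(x) = P(X = x) = e^(-lambda) * lambda^x / x!
    return exp(-lam) * lam**x / factorial(x)

# Example (assumed rate): lambda = 2 events per interval
lam = 2
print(round(poisson_pmf(0, lam), 4))                           # about 0.1353
print(round(sum(poisson_pmf(x, lam) for x in range(50)), 4))   # approximately 1
# mean and variance are both lambda: E(X) = Var(X) = 2
```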
Probability density function (pdf)
variable type:
pdf formula: what it does
Probability density function (pdf): the area under the curve b/w 2 values (eg a and b) on the horizontal axis is equal to the probability that “X” (the random variable) is b/w those 2 values
Variable type: continuous
pdf formula: P(a ≤ X ≤ b) = ∫ from a to b of f(x) dx, ie integrate f(x) to compute the area under the curve b/w a and b
mean for pdf: µ = E(X) = ∫ x f(x) dx (integrate over the whole range of X)
variance for pdf: σ² = ∫ (x − µ)² f(x) dx = E(X²) − µ²
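A rough numeric sketch for the continuous case, assuming the made-up pdf f(x) = 2x on [0, 1] and approximating the integrals with a midpoint rule:

```python
# A minimal numeric sketch with an assumed pdf f(x) = 2x on [0, 1].
def f(x):
    return 2 * x

def integrate(g, a, b, n=100_000):
    # simple midpoint-rule approximation of the area under g between a and b
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

print(round(integrate(f, 0, 1), 4))                      # total area = 1 (a valid pdf)
print(round(integrate(f, 0.2, 0.5), 4))                  # P(0.2 <= X <= 0.5) = 0.21
mu = integrate(lambda x: x * f(x), 0, 1)                 # mean = 2/3
var = integrate(lambda x: (x - mu) ** 2 * f(x), 0, 1)    # variance = 1/18
print(round(mu, 4), round(var, 4))
```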
Explain X ~ N (µ, σ²)
variable type for normal distribution
Notation for Standard normal distribution
Process to transform NORMAL distribution to STANDARD NORMAL distribution
X ~ N (µ, σ²): “X” (random variable) follows a normal or Gaussian distribution that has mean µ and variance σ²
variable type: continuous
pdf for normal distribution formula: f(x) = (1/(σ√(2π))) e^(−(x − µ)²/(2σ²))
Standard normal distribution: X ~ N(0, 1), ie mean 0 and variance 1
pdf for standard normal distribution formula: f(z) = (1/√(2π)) e^(−z²/2)
Process to transform NORMAL distribution to STANDARD NORMAL distribution: standardize with Z = (X − µ)/σ, so that Z ~ N(0, 1)
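A short Python sketch of the standardization step, with assumed values µ = 100 and σ = 15; the probability from the standardized Z matches the one from the original X:

```python
from statistics import NormalDist

# Assumed example: X ~ N(mu = 100, sigma^2 = 225), so sigma = 15
mu, sigma = 100, 15
x = 130

# standardize: Z = (X - mu) / sigma turns X ~ N(mu, sigma^2) into Z ~ N(0, 1)
z = (x - mu) / sigma                            # 2.0
print(round(NormalDist(0, 1).cdf(z), 4))        # P(Z <= 2)   ~ 0.9772
print(round(NormalDist(mu, sigma).cdf(x), 4))   # P(X <= 130) ~ 0.9772, same probability
```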