Week 3 - Probability Flashcards
what is probability
a probability is a number between 0 and 1 that measures the likelihood that some uncertain event will occur
sample space, event
the set of all possible outcomes
a subset of basic outcomes from the sample space
frequentist interpretation
probability is the proportion of times that an outcome would occur if we observed the random process an infinite number of times
i.e., if I were to roll a die 1 million times, what proportion of the outcomes would be 2?
law of large numbers
states that as more observations are collected, the proportion of occurrences with a particular outcome converges to the probability of that outcome
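A minimal simulation sketch of this idea (my own illustration, assuming Python with NumPy; not part of the original cards): roll a fair die many times and watch the proportion of 2s settle toward 1/6.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Simulate rolling a fair six-sided die n times and record the
# proportion of rolls that come up 2.
for n in (100, 10_000, 1_000_000):
    rolls = rng.integers(1, 7, size=n)        # values 1..6
    proportion_of_twos = np.mean(rolls == 2)
    print(f"n = {n:>9}: proportion of 2s = {proportion_of_twos:.4f}")

# As n grows, the proportion converges to the true probability 1/6 ≈ 0.1667.
```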
basic outcome
a possible outcome of a random process. the probabilities of all basic outcomes together sum to 1
venn diagrams
we use these to illustrate probabilities
disjoint / mutually exclusive
if two events cannot happen at the same time they are mutually exclusive - they have no basic outcomes in common (picture two non-overlapping circles)
P(A and B) = 0
non-disjoint
when two events have overlapping basic outcomes, e.g., a card can be both red and an ace
general addition rule
P ( A or B ) = P(A) + P(B) - P(A and B)
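A worked check of the addition rule, reusing the red/ace deck example from the previous card (a sketch; the counts come from a standard 52-card deck):

```python
from fractions import Fraction

# Standard 52-card deck: 26 red cards, 4 aces, 2 of which are red aces.
p_red = Fraction(26, 52)
p_ace = Fraction(4, 52)
p_red_and_ace = Fraction(2, 52)

# General addition rule: P(A or B) = P(A) + P(B) - P(A and B)
p_red_or_ace = p_red + p_ace - p_red_and_ace
print(p_red_or_ace)   # 7/13, i.e. 28/52
```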
complementary events
two mutually exclusive events whose probabilities sum to 1. the complement of A is written A^c
flipping heads and flipping tails are complementary events
complement rule
the probability of an event not occurring is 1 minus the probability that it does occur
P (not A) = P(A^c) = 1 - P(A)
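A short worked instance of the complement rule, again with the deck example (a sketch, not part of the original cards):

```python
from fractions import Fraction

# Complement rule: P(not an ace) = 1 - P(ace)
p_ace = Fraction(4, 52)
p_not_ace = 1 - p_ace
print(p_not_ace)   # 12/13
```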
independent processes
two processes are independent if knowing the outcome of one doesn’t tell us anything about the other
conditional probability
the probability of an event occurring, given that another event has occurred
e.g., what is the probability that the second flip is heads, given that the first flip is heads?
P(A | B) = P(A and B) / P(B)
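A quick check of the formula using the two-coin-flip example from this card (a sketch assuming fair, independent flips, with the sample space enumerated directly):

```python
from itertools import product

# Enumerate the sample space of two fair coin flips.
outcomes = list(product("HT", repeat=2))          # ('H','H'), ('H','T'), ...

first_heads = [o for o in outcomes if o[0] == "H"]
both_heads  = [o for o in outcomes if o[0] == "H" and o[1] == "H"]

# P(second is H | first is H) = P(both H) / P(first H)
p_first = len(first_heads) / len(outcomes)        # 2/4
p_both  = len(both_heads) / len(outcomes)         # 1/4
print(p_both / p_first)                           # 0.5
```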
marginal probability vs joint probability
marginal = a probability based on a single variable
joint = a probability based on two or more variables
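A small sketch of the distinction (the 2x2 probabilities below are made up for illustration): the joint distribution is the full table, and the marginal probabilities are its row and column sums.

```python
import numpy as np

# Hypothetical joint distribution of two binary variables, as a 2x2 table
# of probabilities (rows: X = 0/1, columns: Y = 0/1). Entries sum to 1.
joint = np.array([[0.10, 0.30],
                  [0.20, 0.40]])

marginal_x = joint.sum(axis=1)   # P(X = x), summing over Y
marginal_y = joint.sum(axis=0)   # P(Y = y), summing over X
print(marginal_x)                # [0.4 0.6]
print(marginal_y)                # [0.3 0.7]
```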
Bayes Theorem
P(A | B) = P(B | A) * P(A) / P(B)
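A minimal numeric sketch of Bayes' theorem (the prevalence and test-accuracy numbers below are hypothetical, chosen only to illustrate the formula):

```python
# Hypothetical disease-testing example.
p_disease = 0.01                       # P(A): prior
p_pos_given_disease = 0.95             # P(B | A)
p_pos_given_healthy = 0.05             # P(B | not A)

# P(B), combining the two ways a positive test can happen.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A | B) = P(B | A) * P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))   # ≈ 0.161
```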
discrete random variable
if a random variable can only take a finite or countably infinite number of distinct values then it is discrete
probability distribution
all of the possible values of the variable as well as the associated probabilities
expected value of a discrete random variable
a probability-weighted sum of all values that the random variable can take, i.e., the mean
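A quick computation of that probability-weighted sum for a fair six-sided die (a sketch; the die is just a stand-in for any discrete distribution):

```python
# Distribution of a fair six-sided die: values 1..6, each with probability 1/6.
values = [1, 2, 3, 4, 5, 6]
probs  = [1 / 6] * 6

# Expected value: E[X] = sum of x * P(X = x)
expected_value = sum(x * p for x, p in zip(values, probs))
print(expected_value)   # 3.5
```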
covariance meaning
it measures the direction and strength of the linear relationship between two variables
independent variables have zero covariance (though zero covariance does not by itself imply independence)
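A sketch with simulated data (assuming NumPy; the data-generating choices are my own): linearly related variables give a clearly positive sample covariance, while independently generated variables give a sample covariance near zero.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000

x = rng.normal(size=n)
y_related = 2 * x + rng.normal(size=n)     # linearly related to x
y_indep   = rng.normal(size=n)             # generated independently of x

# np.cov returns the 2x2 covariance matrix; entry [0, 1] is Cov(x, y).
print(np.cov(x, y_related)[0, 1])   # close to 2
print(np.cov(x, y_indep)[0, 1])     # close to 0
```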
continuous random variable
one that can take on a continuum of possible values.
e.g., the height of a randomly chosen adult
or the time it takes to complete a randomly chosen task