Final Exam [Bulk Review 1] Flashcards
Exam concepts 1 (Axioms of Probability) through 4 (Distributions)
Definition of independent events?
Two events, A and B, are independent if the occurrence of one does not affect the probability of the other occurring.
P(A and B) of independent events?
P(A and B) = P(A) × P(B)
Events A and B are independent if . . .
P(A and B) = P(A) × P(B)
P(A | B) = P(A)
P(B | A) = P(B)
Definition of dependent events?
Two events, A and B, are dependent if the occurrence of one event affects the probability of the other. Commonly, P(B∣A) represents the probability of B given that A has occurred.
P(A and B) of dependent events?
P(A and B) = P(B ∣ A) × P(A)
Definition of mutually exclusive events?
Two events, A and B, are mutually exclusive (or disjoint) if they cannot both occur at the same time.
P(A and B) of mutually exclusive events?
P(A and B) = 0
P(A or B) of mutually exclusive events?
P(A or B) = P(A) + P(B)
Definition of non-mutually exclusive events?
Two events, A and B, are non-mutually exclusive if they can both occur at the same time.
P(A or B) of non-mutually exclusive events?
P(A or B) = P(A) + P(B) − P(A and B)
Definition of conditional probability?
The probability of one event occurring given that another event has already occurred.
P(B given that A)?
P(B ∣ A) = P(A and B) / P(A)
What is Bayes’ Theorem for P(B ∣ A)?
P(B ∣ A) = [P(A ∣ B) × P(B)] / P(A)
P(A given that B)?
P(A ∣ B) = P(A and B) / P(B)
When working conditionally it is sometimes easier to calculate P(A and B) as . . .
P(A|B) ⋅ P(B) OR P(B|A) ⋅ P(A)
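A quick hypothetical example (the two-card draw below is made up for illustration), worked in Python with exact fractions: drawing two cards from a standard 52-card deck without replacement.

    from fractions import Fraction

    # P(first card is an ace) in a 52-card deck
    p_first_ace = Fraction(4, 52)
    # P(second card is an ace | first card was an ace): 3 aces left among 51 cards
    p_second_given_first = Fraction(3, 51)

    # Multiplication rule for dependent events: P(A and B) = P(B | A) × P(A)
    p_both_aces = p_second_given_first * p_first_ace
    print(p_both_aces)         # 1/221
    print(float(p_both_aces))  # ≈ 0.0045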
Can events be independent and mutually exclusive?
No, unless one of the events has probability zero (i.e., that event cannot occur). This is because mutually exclusive events can never occur together, whereas independent events (with nonzero probabilities) can.
When asked for P(A or B) of independent events?
P(A or B) = P(A) + P(B) − P(A and B)
Because independent events are not, in general, mutually exclusive, the P(A and B) term is not zero.
Addition Rule for . . .
Mutually Exclusive Events:
Non-Mutually Exclusive Events:
Mutually Exclusive Events:
P(A or B) = P(A) + P(B)
Non-Mutually Exclusive Events:
P(A or B) = P(A) + P(B) − P(A and B)
Multiplication Rule for . . .
Independent Events:
Dependent Events:
Independent Events:
P(A and B) = P(A) × P(B)
Dependent Events:
P(A and B) = P(A) × P(B ∣ A)
If S is the sample space, then for any event A ⊆ S . . .
A ⋂ Aᶜ = Ø, they are _________
A ⋃ Aᶜ = S, they are _________
disjoint, a partition of S
De Morgan’s Laws
(A ⋃ B)ᶜ = Aᶜ ⋂ Bᶜ
(A ⋂ B)ᶜ = Aᶜ ⋃ Bᶜ
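A minimal sketch in Python that checks both identities on small finite sets (the sample space and events below are made up purely for illustration):

    # Hypothetical finite sample space and events
    S = set(range(1, 11))
    A = {1, 2, 3, 4}
    B = {3, 4, 5, 6}

    Ac, Bc = S - A, S - B            # complements relative to S

    # De Morgan's Laws
    assert S - (A | B) == Ac & Bc    # (A ∪ B)^c = A^c ∩ B^c
    assert S - (A & B) == Ac | Bc    # (A ∩ B)^c = A^c ∪ B^c
    print("Both identities hold for these sets.")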
Inclusion-Exclusion Principle
P(A ⋃ B) = P(A) + P(B) - P(A ⋂ B)
If A and B are disjoint events then,
P(A ⋃ B) = _____
P(A) = _______
P(A ⋂ B) = ____
P(A) + P(B)
1 − P(Aᶜ)
0 (since A ⋂ B = Ø)
Events A and B are independent if . . .
P(A ⋂ B) = P(A) ⋅ P(B)
P(A | B) = P(A)
P(B | A) = P(B)
Conditional Probability
P(A | B) = P(A ⋂ B)/ P(B)
When working conditionally,
P(A ⋂ B) = ________
P(A | B) ⋅ P(B) OR
P(B | A) ⋅ P(A)
Bayes’ Theorem
P(A | B) = [P(B | A) ⋅ P(A)] / P(B)
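As a worked sketch of the formula (the disease-testing numbers below are hypothetical, chosen only to illustrate the arithmetic):

    # Hypothetical numbers: 1% prevalence, 95% sensitivity, 10% false-positive rate
    p_disease = 0.01                 # P(A): person has the disease
    p_pos_given_disease = 0.95       # P(B | A): test positive given disease
    p_pos_given_healthy = 0.10       # P(B | A^c): test positive given no disease

    # Law of total probability gives the denominator P(B)
    p_pos = (p_pos_given_disease * p_disease
             + p_pos_given_healthy * (1 - p_disease))

    # Bayes' Theorem: P(A | B) = P(B | A) * P(A) / P(B)
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print(round(p_disease_given_pos, 4))   # ≈ 0.0876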
For mutually exclusive events A and B,
P(A ⋂ B) = ____
0
For independent events A and B,
P(A ⋃ B) = _________
P(A) + P(B) - P(A and B)
If A and B are not disjoint then,
P(A ⋃ B) = _________
P(A) + P(B) - P(A ⋂ B)
How do you determine if events are mutually exclusive?
P(A ⋂ B) = 0
For mutually exclusive events A and B,
P(A ⋃ B) = _________
P(A) + P(B)
Mutually exclusive is also known as ____
disjoint
The source of the randomness in a random variable is _______________, in which a sample outcome s ∈ S is chosen
according to a ___________
the experiment itself
probability function P
The probability mass function (PMF) of a discrete r.v. X is the function p_X given by _________. Note that this is ________
if x is in the support of X, and _________ otherwise.
p_X(x) = P(X = x)
positive, zero
The cumulative distribution function (CDF) of an r.v. X is the function F given by __________
F(x) = P(X ≤ x)
The expected value of a discrete r.v. X whose distinct possible values are x1, x2, . . ., is defined by . . .
E(X) = Σ xⱼ P(X = xⱼ), summing over all possible values xⱼ
For any r.v.s X, Y and any constant c there are 4 primary manipulations to know for E(X):
1. E(c) = c
2. E(X + Y) = E(X) + E(Y)
3. E(X + c) = E(X) + c
4. E(cX) = c × E(X)
The variance of an r.v. X is ________
Var(X) = E[(X − E(X))²] = E(X²) − (E(X))²
For any r.v.s X, Y and any constant c there are 4 primary manipulations to know for Var(X):
1. Var(X + c) = Var(X)
2. Var(cX) = c² × Var(X)
3. Var(X + Y) = Var(X) + Var(Y), provided X and Y are independent
4. Var(X) ≥ 0 always, with Var(X) = 0 only when X is a constant
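A minimal sketch that checks the expectation and variance rules exactly on a small made-up discrete PMF (the distribution and the constant c are hypothetical, purely for illustration):

    # Hypothetical PMF for a discrete r.v. X: value -> probability
    pmf = {0: 0.2, 1: 0.5, 2: 0.3}

    def expect(g):
        """E[g(X)] computed directly from the PMF."""
        return sum(g(x) * p for x, p in pmf.items())

    def variance(g):
        """Var[g(X)] = E[g(X)^2] - (E[g(X)])^2."""
        return expect(lambda x: g(x) ** 2) - expect(g) ** 2

    c = 3.0
    EX = expect(lambda x: x)
    VX = variance(lambda x: x)

    assert abs(expect(lambda x: x + c) - (EX + c)) < 1e-12      # E(X + c) = E(X) + c
    assert abs(expect(lambda x: c * x) - c * EX) < 1e-12        # E(cX) = c E(X)
    assert abs(variance(lambda x: x + c) - VX) < 1e-12          # Var(X + c) = Var(X)
    assert abs(variance(lambda x: c * x) - c**2 * VX) < 1e-12   # Var(cX) = c^2 Var(X)
    print("E(X) =", EX, " Var(X) =", VX)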
An experiment that can result in either a “success” or a “failure” (but not both) is called a ___________
Bernoulli trial
Suppose that n independent Bernoulli trials are performed, each with the same success probability p. Let X be the number of successes. The distribution of X is called the ________ distribution
Binomial
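A minimal sketch of the Binomial PMF using only the standard library (the n and p values are made-up examples):

    from math import comb

    def binom_pmf(k, n, p):
        """P(X = k) for X ~ Bin(n, p): choose which k of the n trials succeed."""
        return comb(n, k) * p**k * (1 - p)**(n - k)

    n, p = 10, 0.3                                        # hypothetical example values
    print(binom_pmf(3, n, p))                             # P(exactly 3 successes)
    print(sum(binom_pmf(k, n, p) for k in range(n + 1)))  # sums to 1 (up to rounding)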
Consider a sequence of independent Bernoulli trials, each with
the same success probability p ∈ (0, 1), with trials performed until a success occurs. Let X be the number of failures before the first successful trial. Then X has the _____________
Geometric distribution
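A sketch matching this convention, where X counts failures before the first success (the value of p is made up):

    def geom_pmf(k, p):
        """P(X = k): k failures in a row, then one success."""
        return (1 - p)**k * p

    p = 0.25                                          # hypothetical success probability
    print(geom_pmf(0, p), geom_pmf(3, p))
    print(sum(geom_pmf(k, p) for k in range(1000)))   # ≈ 1 over k = 0, 1, 2, ...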
An r.v. X has the __________ with parameter λ if it explains the distribution of . . .
1. the number of successes in a particular region or _________
2. the number of successes in a large number of trials, each with a ___________.
Poisson distribution
interval of time
small probability of success
The Poisson paradigm is also called the law of rare events. The interpretation of “rare” is that the _____ are small, not that _______ is small.
p
λ
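A sketch of this paradigm: a Bin(n, p) PMF with large n and small p is close to a Poisson(λ = np) PMF. The n and p values below are hypothetical, and the standard Poisson PMF e^(−λ) λ^k / k! is used for the comparison:

    from math import comb, exp, factorial

    def binom_pmf(k, n, p):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def poisson_pmf(k, lam):
        return exp(-lam) * lam**k / factorial(k)

    n, p = 1000, 0.002      # many trials, each with a small success probability
    lam = n * p             # λ = np = 2, which is not small
    for k in range(5):
        print(k, round(binom_pmf(k, n, p), 5), round(poisson_pmf(k, lam), 5))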
An r.v. has a continuous distribution if its CDF is ___________.
We also allow there to be endpoints (or finitely many points) where the CDF is continuous but not differentiable, as long as . . .
differentiable
the CDF is differentiable everywhere else
Let X be a continuous r.v. with PDF f. Then the CDF of X is given by . . .
F(x) = ∫ f(t) dt, integrating from −∞ to x
Probabilities for continuous random variables are specified by__________. This leads us to the special rule that P(X = x) = _______ for continuous random variables
the area under a curve
0
For sets of the form [a, b], (a, b], [a, b), (a, b), the probability is derived from the CDF by . . .
P(a ≤ X ≤ b) = F(b) − F(a) = ∫ f(x) dx integrated from a to b (the endpoints don't matter, since P(X = x) = 0)
To get a desired probability for a continuous r.v. you . . .
integrate the PDF over the desired range
The PDF f of a continuous r.v. must satisfy the following two criteria:
1. f(x) ≥ 0 for all x
2. ∫ f(x) dx = 1, integrating over the whole real line
The expected value (also called the expectation or mean) of a continuous r.v. X with PDF f is . . .
E(X) = ∫ x f(x) dx, integrating from −∞ to ∞
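A rough numeric sketch of this integral for a hypothetical PDF (the Expo(λ = 2) density, whose true mean is 1/λ = 0.5), using a plain Riemann sum rather than any particular library:

    from math import exp

    lam = 2.0
    def f(x):
        """PDF of a hypothetical Expo(λ = 2) r.v. (zero for x < 0)."""
        return lam * exp(-lam * x) if x >= 0 else 0.0

    # Riemann-sum approximation of E(X) = ∫ x f(x) dx over [0, 20]
    dx = 1e-3
    approx_mean = sum(i * dx * f(i * dx) * dx for i in range(int(20 / dx)))
    print(round(approx_mean, 3))   # close to 0.5 = 1/λ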
What is the power rule for differentiating a function?
d/dx (xⁿ) = n xⁿ⁻¹
What is the power rule for integrating a function?
∫ xⁿ dx = xⁿ⁺¹ / (n + 1) + C, for n ≠ −1
The CDF of the continuous uniform is . . .
For X ∼ Unif(a, b): F(x) = (x − a) / (b − a) for a ≤ x ≤ b, with F(x) = 0 for x < a and F(x) = 1 for x > b
A continuous r.v. X is said to have the Exponential distribution with parameter λ > 0 if its CDF is . . .
F(x) = 1 − e^(−λx) for x ≥ 0 (and F(x) = 0 for x < 0)
A continuous distribution is said to have the memoryless property if a random variable X from that distribution satisfies . . .
P(X ≥ s + t ∣ X ≥ s) = P(X ≥ t) for all s, t ≥ 0
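A minimal sketch checking the memoryless property numerically for the Exponential distribution (the λ, s, t values are made up):

    from math import exp

    lam = 0.5                            # hypothetical rate parameter

    def survival(x):
        """P(X >= x) for X ~ Expo(λ), i.e. 1 − CDF = e^(−λx) for x >= 0."""
        return exp(-lam * x) if x >= 0 else 1.0

    s, t = 2.0, 3.0                      # made-up waiting times
    # Memoryless property: P(X >= s + t | X >= s) = P(X >= t)
    lhs = survival(s + t) / survival(s)
    rhs = survival(t)
    print(round(lhs, 4), round(rhs, 4))  # both ≈ 0.2231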
The central limit theorem says that, under very weak assumptions, the sum of a large number of i.i.d. random variables has an . . .
approximately Normal distribution, regardless of the distribution of the individual r.v.s.
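A small simulation sketch of this: sums of i.i.d. Uniform(0, 1) draws (a distribution that is not Normal) land within one standard deviation of their mean about 68% of the time, as the Normal approximation predicts. The sample sizes below are arbitrary choices:

    import random

    random.seed(0)
    n = 50                               # terms per sum (arbitrary)
    trials = 20000                       # number of simulated sums (arbitrary)
    mu, var = n * 0.5, n * (1 / 12)      # mean and variance of the sum of n Unif(0,1)
    sd = var ** 0.5

    sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]
    within_1sd = sum(abs(s - mu) < sd for s in sums) / trials
    print(within_1sd)                    # close to 0.68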
What are the three important symmetry properties that can be deduced from the standard Normal PDF and CDF?
1. Symmetry of the PDF: φ(z) = φ(−z)
2. Symmetry of tail areas: Φ(−z) = 1 − Φ(z)
3. Symmetry of Z and −Z: if Z ∼ N(0, 1), then −Z ∼ N(0, 1)
For X ∼ N(μ, σ²) the standardized version of X is given by . . .
Z = (X − μ) / σ, which has the standard Normal distribution N(0, 1)
If X ∼ N(μ, σ²) then . . .
1. P(|X − μ| < σ) ≈
2. P(|X − μ| < 2σ) ≈
3. P(|X − μ| < 3σ) ≈
This is commonly known as _____ rule
- P(|X − μ| < σ) ≈ 0.68
- P(|X − μ| < 2σ) ≈ 0.95
- P(|X − μ| < 3σ) ≈ 0.997
68-95-99.7% rule
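These three numbers can be recovered from the standard Normal CDF; a sketch with Python's standard library (statistics.NormalDist, available since Python 3.8):

    from statistics import NormalDist

    Phi = NormalDist().cdf                  # standard Normal CDF
    for k in (1, 2, 3):
        # P(|X − μ| < kσ) = Φ(k) − Φ(−k) for any Normal r.v.
        print(k, round(Phi(k) - Phi(-k), 4))
    # prints approximately 0.6827, 0.9545, 0.9973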
What is P(c < X < d) for X ∼ Unif(a, b)?
(d − c) / (b − a), assuming a ≤ c < d ≤ b
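A sketch with hypothetical endpoints, cross-checking the formula against integrating the constant PDF 1/(b − a) over (c, d):

    a, b = 2.0, 10.0        # hypothetical Unif(a, b) endpoints
    c, d = 3.0, 7.0         # hypothetical sub-interval, with a <= c < d <= b

    formula = (d - c) / (b - a)

    # Riemann-sum integral of the PDF 1/(b − a) over (c, d)
    dx = 1e-5
    steps = round((d - c) / dx)
    integral = sum(1 / (b - a) * dx for _ in range(steps))
    print(formula, round(integral, 4))   # both 0.5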
How do you find a percentile of a Normal distribution?
x = μ + (z × σ)
where z is the z-score of the desired percentile (equivalently, z = (x − μ) / σ), found from the standard Normal CDF or a z-table
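A sketch of this recipe with the standard library (μ, σ, and the percentile below are made-up values); NormalDist.inv_cdf supplies the z-score:

    from statistics import NormalDist

    mu, sigma = 100.0, 15.0                # hypothetical Normal parameters
    percentile = 0.90                      # hypothetical target percentile

    z = NormalDist().inv_cdf(percentile)   # z-score with Φ(z) = 0.90
    x = mu + z * sigma                     # x = μ + zσ
    print(round(z, 4), round(x, 2))        # ≈ 1.2816 and ≈ 119.22
    # Same answer in one step:
    print(round(NormalDist(mu, sigma).inv_cdf(percentile), 2))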