Chapter 7/8 Flashcards
We say that c is a median of a random variable X if _______________
P(X ≤ c) ≥ 1/2 and P(X ≥ c) ≥ 1/2
For a discrete r.v. X, we say that c is a mode of X if it maximizes the PMF: _____________. For a continuous r.v. X with PDF f, we say that c is a mode if it maximizes the PDF: _______________
P(X = c) ≥ P(X = x) for all x
f(c) ≥ f(x) for all x
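An illustrative example (not part of the original card, using the Exponential distribution that appears later): for X ∼ Expo(λ), the CDF is F(x) = 1 − e^(−λx), so the median c solves 1 − e^(−λc) = 1/2, giving c = log(2)/λ; the PDF f(x) = λe^(−λx) is decreasing on [0, ∞), so the mode is 0.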
Measures of central tendency are . . .
Measures of spread are . . .
Skewness measures . . .
mean, median, mode
standard deviation, variance
asymmetry
Describe the 3 types of skewness
negative/left skew: longer left tail
not skewed: (roughly) symmetric
positive/right skew: longer right tail
Let X be an r.v. with mean μ and variance σ². For any positive integer n, the nth moment of X is __________, the nth central moment is ___________, and the nth standardized moment is ___________
E(X^n)
E((X − μ)^n)
E(((X − μ)/σ)^n)
The skewness of an r.v. X with mean μ and variance σ² is the _________ standardized moment of X:
third: Skew(X) = E(((X − μ)/σ)^3)
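A worked example (using E(X^n) = n!/λ^n for X ∼ Expo(λ), which follows from the Expo MGF covered later): E((X − μ)^3) = E(X^3) − 3μE(X^2) + 2μ^3 = (6 − 6 + 2)/λ^3 = 2/λ^3, and σ^3 = 1/λ^3, so Skew(X) = 2.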
Let X be symmetric about its mean μ. Then for any odd number m, the mth central moment E((X − μ)^m) is ________
0 if it exists
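Sketch of why: symmetry about μ means X − μ and −(X − μ) have the same distribution, so E((X − μ)^m) = E((−(X − μ))^m) = (−1)^m E((X − μ)^m); for odd m this forces E((X − μ)^m) = 0, provided the moment exists.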
As with measuring skew, no single measure can perfectly capture the tail behavior, but there is a widely used summary based on the fourth standardized moment. The ______ of an r.v. X with mean μ and variance σ² is a shifted version of the fourth standardized moment of X:
kurtosis: Kurt(X) = E(((X − μ)/σ)^4) − 3
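For example, the −3 shift makes Kurt(X) = 0 for any Normal distribution, since the fourth standardized moment of a Normal is 3; and for X ∼ Expo(λ), E((X − μ)^4) = 9/λ^4 and σ^4 = 1/λ^4, so Kurt(X) = 9 − 3 = 6.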
The moment generating function (MGF) of an r.v. X is . . . Otherwise we say the MGF of X ___________
M(t) = E(e^tX ), as a function of t, if this is finite on some open interval (−a, a) containing 0
does not exist
For X ∼ Bern(p), e^tX takes on the value e^t with probability p and the value 1 with probability q, so M(t) = ____________. Since
this is finite for all values of t, the MGF is defined _____________
E(e^tX) = pe^t + q
on the entire real line
For X ∼ Geom(p), M(t) = E(e^tX) = _____
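The answer is not filled in above; a sketch, assuming the convention that Geom(p) counts failures before the first success (support 0, 1, 2, . . .) and q = 1 − p: M(t) = Σ_{k=0}^∞ e^(tk) q^k p = p Σ_{k=0}^∞ (qe^t)^k = p/(1 − qe^t), valid for qe^t < 1, i.e., t < log(1/q).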
Let U ∼ Unif(a, b). Then the MGF of U is . . .
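A sketch of the missing answer: M(t) = E(e^tU) = ∫_a^b e^(tu)/(b − a) du = (e^(tb) − e^(ta)) / (t(b − a)) for t ≠ 0, with M(0) = 1.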
The MGF of X ∼ Expo(λ) is . . .
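A sketch of the missing answer: M(t) = ∫_0^∞ e^(tx) λe^(−λx) dx = λ ∫_0^∞ e^(−(λ − t)x) dx = λ/(λ − t) for t < λ (the integral diverges for t ≥ λ).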
Give 3 reasons as to why the MGF is important.
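The answer is not shown here; judging from the cards that follow, the three reasons are: (1) the MGF encodes every moment of X (the nth moment is the nth derivative of the MGF at 0), (2) the MGF determines the distribution, and (3) the MGF of a sum of independent r.v.s is the product of their MGFs.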
Given the MGF of X, we can get the nth moment of X by evaluating the nth ____________ of the MGF at 0:
The Taylor expansion of M(t) about 0 is:
derivative
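The Taylor expansion asked for above is not shown; it is M(t) = Σ_{n=0}^∞ E(X^n) t^n / n!, so the nth moment E(X^n) is the coefficient of t^n/n! in the series, matching the nth-derivative rule above.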
The MGF of a random variable determines its distribution, meaning that . . .
If two r.v.s have the same MGF, they must have the same distribution. In fact, if there is even a tiny interval (−a, a) containing 0 on which the MGFs are equal, then the r.v.s must have the same distribution.
If X and Y are independent, then the MGF of X + Y is the _________ of the individual MGFs
product
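A one-line derivation: M_{X+Y}(t) = E(e^(t(X+Y))) = E(e^(tX) e^(tY)) = E(e^(tX)) E(e^(tY)) = M_X(t) M_Y(t), where the third equality uses the independence of X and Y.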
The MGF of a Bern(p) r.v. is pe^t + q, so the MGF of a Bin(n, p) r.v. is . . .
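A sketch of the missing answer: a Bin(n, p) r.v. is a sum of n i.i.d. Bern(p) r.v.s, so by the product property above its MGF is (pe^t + q)^n.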
Find the MGF of the NBin(r, p) distribution using Geom(p)
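A sketch, assuming the same convention as the Geom card above (counting failures before successes): an NBin(r, p) r.v. is a sum of r i.i.d. Geom(p) r.v.s, so its MGF is the rth power of the Geom(p) MGF, M(t) = (p/(1 − qe^t))^r for qe^t < 1.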
If X has MGF M(t), then the MGF of a + bX is . . .
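A sketch of the missing answer: M_{a+bX}(t) = E(e^(t(a+bX))) = e^(at) E(e^((bt)X)) = e^(at) M(bt).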
Give the MGF of the standard normal and normal distributions
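A sketch of the missing answers: for Z ∼ N(0, 1), completing the square in E(e^(tZ)) = ∫ e^(tz) (1/√(2π)) e^(−z²/2) dz gives M_Z(t) = e^(t²/2); writing X = μ + σZ ∼ N(μ, σ²) and using the location-scale result above, M_X(t) = e^(μt) M_Z(σt) = e^(μt + σ²t²/2).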
Give the MGF of the exponential distribution
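This repeats the earlier Expo card; an alternative route via scaling: if X ∼ Expo(1), then M_X(t) = 1/(1 − t) for t < 1, and Y = X/λ ∼ Expo(λ) gives M_Y(t) = M_X(t/λ) = λ/(λ − t) for t < λ.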
A ______________ on a probability gives a provable guarantee that the probability is in a certain range.
These inequalities will often allow us to narrow down the range of possible values for the exact answer, that is, to . . .
bound
determine an upper bound and/or lower bound
The Cauchy-Schwarz inequality lets us bound E(XY) in terms of the marginal second moments E(X²) and E(Y²), such that for any r.v.s X and Y with finite variances . . .
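The missing statement: |E(XY)| ≤ √(E(X²) E(Y²)).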
Markov’s inequality gives . . .
Let X be a non-negative random variable and a > 0 be a scalar, then . . .
an upper bound on the probability that a non-negative random variable is greater than or equal to some positive constant
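The inequality itself, omitted above: P(X ≥ a) ≤ E(X)/a.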
For Chebyshev's inequality, let X be a random variable with finite variance σ², i.e., σ² < ∞; then for any constant k > 0:
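The missing statement (the card may intend either of the two common forms, with μ = E(X)): P(|X − μ| ≥ kσ) ≤ 1/k², or equivalently P(|X − μ| ≥ a) ≤ σ²/a² for any a > 0.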
Let X, the number of items produced in a factory during a week, be a continuous random variable with E(X) = 50. Give an upper bound for P(X > 75).
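A worked answer using Markov's inequality (assuming X ≥ 0, since it counts items produced): P(X > 75) ≤ P(X ≥ 75) ≤ E(X)/75 = 50/75 = 2/3.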
Explain the weak law of large numbers.
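A sketch of the missing answer: for i.i.d. X_1, X_2, . . . with mean μ, the sample mean X̄_n = (X_1 + · · · + X_n)/n converges to μ in probability, i.e., for every ε > 0, P(|X̄_n − μ| > ε) → 0 as n → ∞; intuitively, the sample mean becomes an arbitrarily good estimate of μ as n grows.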
Explain the central limit theorem and its use in approximation.
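A sketch of the missing answer: for i.i.d. X_1, . . . , X_n with mean μ and variance σ², the standardized sample mean √n (X̄_n − μ)/σ converges in distribution to N(0, 1); in practice, for large n the sum X_1 + · · · + X_n is approximately N(nμ, nσ²), which lets us approximate probabilities about sums and averages using the Normal CDF.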