3. Discrete Random Variables Flashcards
What is a pmf?
For a discrete random variable X, we define the probability mass function (p.m.f. for short) pX to be pX(x) := P(X = x).
- pX(x) ≥ 0 ∀x ∈ R.
- Probability mass functions must ‘sum to 1’: the sum of pX(x) over all x in the range RX of X equals 1.
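A minimal sketch of these two defining properties, using a fair six-sided die as the example distribution (the dict representation is my own choice, not from the cards):

```python
# A pmf for a fair six-sided die, stored as a dict {value: probability}.
pmf = {x: 1/6 for x in range(1, 7)}

# The two defining properties: pX(x) >= 0 for all x, and the masses sum to 1.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1) < 1e-12

print(pmf[3])  # P(X = 3) = 1/6
```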
What is a cdf?
We define the cumulative distribution function, FX of a random variable X (discrete or continuous) to be FX(x) := P(X ≤ x).
What is the cdf in terms of the pmf?
FX(x) := P(X ≤ x) = the sum of pX(a) over all a ∈ RX with a ≤ x.
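This relationship translates directly into code; here is a small sketch (the `cdf` helper name is my own) again using a fair die:

```python
def cdf(pmf, x):
    """F_X(x): sum of pX(a) over all support points a <= x."""
    return sum(p for a, p in pmf.items() if a <= x)

pmf = {x: 1/6 for x in range(1, 7)}
print(cdf(pmf, 3))  # P(X <= 3) = 3/6 = 0.5
```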
What is a quantile and percentile?
For α ∈ [0, 1] the α quantile (or 100 × α percentile) is the
smallest value of x such that FX(x) ≥ α.
The median is the 0.5 quantile.
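The "smallest x with FX(x) ≥ α" rule can be sketched as a scan along the sorted support (the `quantile` helper and its float tolerance are my own assumptions):

```python
def quantile(pmf, alpha):
    """Smallest support point x with F_X(x) >= alpha."""
    total = 0.0
    for x in sorted(pmf):
        total += pmf[x]
        if total >= alpha - 1e-12:  # small tolerance for float rounding
            return x
    return max(pmf)

pmf = {x: 1/6 for x in range(1, 7)}
print(quantile(pmf, 0.5))  # the median of a fair die roll
```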
Define expectation (mean).
The expectation E(X) of a discrete random variable X is
defined as
E(X) := the sum of x pX(x) over all x ∈ RX.
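As a sketch, the defining sum for a fair die (`expectation` is my own helper name):

```python
def expectation(pmf):
    """E(X): sum of x * pX(x) over the support."""
    return sum(x * p for x, p in pmf.items())

pmf = {x: 1/6 for x in range(1, 7)}
print(expectation(pmf))  # (1 + 2 + ... + 6) / 6 = 3.5
```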
Define Variance
Var(X) := E[{X − E(X)}^2] = E{(X − µX)^2}, where µX = E(X).
Var(X) = E(X^2) − E(X)^2
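The shortcut form E(X^2) − E(X)^2 is the one usually computed; a sketch for the fair die (helper names are my own):

```python
def expectation(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """Var(X) via the shortcut form E(X^2) - E(X)^2."""
    mean = expectation(pmf)
    return sum(x * x * p for x, p in pmf.items()) - mean ** 2

pmf = {x: 1/6 for x in range(1, 7)}
print(variance(pmf))  # 91/6 - 3.5^2 = 35/12 ≈ 2.9167
```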
E(aX+b) and Var(aX+b)
E(aX + b) = aE(X) + b
Var(aX+b) = a^2Var(X)
Bernoulli Distribution
A Bernoulli r.v. can take values 0 and 1 only, with pX(1) = p and pX(0) = 1 − p.
E(X) = p, Var(X) = p(1-p)
Binomial Distribution
For X counting the successes in n independent trials, each with success probability p:
pmf: pX(x) = (n! / (x!(n − x)!)) p^x (1 − p)^(n−x), for x = 0, 1, …, n.
E(X) = np
Var(X) = np(1 − p)
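A sketch of the binomial pmf and a numerical check of the mean and variance formulas; setting n = 1 recovers the Bernoulli card above (the helper name is my own):

```python
from math import comb

def binomial_pmf(n, p, x):
    """P(X = x) for X ~ Binomial(n, p): x successes in n independent trials."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 10, 0.3
pmf = {x: binomial_pmf(n, p, x) for x in range(n + 1)}

mean = sum(x * q for x, q in pmf.items())
var = sum((x - mean) ** 2 * q for x, q in pmf.items())
assert abs(mean - n * p) < 1e-9            # E(X) = np
assert abs(var - n * p * (1 - p)) < 1e-9   # Var(X) = np(1 - p)

# n = 1 recovers the Bernoulli pmf: pX(1) = p.
assert binomial_pmf(1, p, 1) == p
```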
Poisson Distribution
Used to represent count data: the number of times something happens within some fixed interval.
pX(x) = e^(−λ) λ^x / x!, for x = 0, 1, 2, …
E(X) = Var(X) = λ.
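A sketch checking E(X) = Var(X) = λ numerically; the Poisson support is infinite, so the sum is truncated where the remaining mass is negligible (helper name and truncation point are my own):

```python
from math import exp, factorial

def poisson_pmf(lam, x):
    """P(X = x) for X ~ Poisson(lam), x = 0, 1, 2, ..."""
    return exp(-lam) * lam ** x / factorial(x)

lam = 2.5
# Truncate the infinite support; the mass beyond x = 60 is negligible here.
pmf = {x: poisson_pmf(lam, x) for x in range(60)}

mean = sum(x * q for x, q in pmf.items())
var = sum((x - mean) ** 2 * q for x, q in pmf.items())
assert abs(mean - lam) < 1e-9  # E(X) = lambda
assert abs(var - lam) < 1e-9   # Var(X) = lambda
```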
Geometric Distribution
For X the trial number of the first success in independent trials with success probability p:
pX(x) = (1 − p)^(x−1) p, for x = 1, 2, …
FX(x) = 1 − (1 − p)^x
E(X) = 1 / p
Var(X) = (1 − p) / p^2.
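A sketch checking the geometric formulas against partial sums of the pmf; as with the Poisson, the infinite support is truncated where the tail mass is negligible (helper names and truncation points are my own):

```python
def geometric_pmf(p, x):
    """P(X = x): first success on trial x, for x = 1, 2, ..."""
    return (1 - p) ** (x - 1) * p

def geometric_cdf(p, x):
    return 1 - (1 - p) ** x

p = 0.2
# The cdf formula agrees with a partial sum of the pmf.
assert abs(sum(geometric_pmf(p, k) for k in range(1, 11)) - geometric_cdf(p, 10)) < 1e-12

# E(X) = 1/p and Var(X) = (1-p)/p^2, checked on a long truncation of the support.
xs = range(1, 400)
mean = sum(x * geometric_pmf(p, x) for x in xs)
var = sum((x - mean) ** 2 * geometric_pmf(p, x) for x in xs)
assert abs(mean - 1 / p) < 1e-6
assert abs(var - (1 - p) / p ** 2) < 1e-6
```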
Joint pmf
pX,Y(x, y) := P{(X = x) ∩ (Y = y)}
Joint cdf
FX,Y(x, y) := P{(X ≤ x) ∩ (Y ≤ y)}
The expectation of the sum of two random variables E(X+Y) is
E(X) + E(Y), whether or not X and Y are independent.
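A sketch of this from a joint pmf, deliberately using a dependent pair (Y is a function of X) to show that no independence assumption is needed; exact `Fraction` arithmetic avoids float noise (the construction is my own):

```python
from fractions import Fraction

# Joint pmf of a dependent pair: X a fair die roll, Y = X mod 2.
joint = {(x, x % 2): Fraction(1, 6) for x in range(1, 7)}

ex  = sum(x * p for (x, _), p in joint.items())
ey  = sum(y * p for (_, y), p in joint.items())
exy = sum((x + y) * p for (x, y), p in joint.items())

# E(X + Y) = E(X) + E(Y), even though Y depends on X.
assert exy == ex + ey
print(ex, ey, exy)  # E(X) = 7/2, E(Y) = 1/2, E(X + Y) = 4
```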
Covariance
The covariance between two random variables X and Y is
Cov(X, Y) = E{(X − µX)(Y − µY)}
or, Cov(X, Y) = E(XY) - E(X)E(Y)
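A sketch computing both covariance formulas from the same dependent joint pmf (X a fair die roll, Y = X mod 2; exact `Fraction` arithmetic lets the two forms match exactly — the construction is my own):

```python
from fractions import Fraction

# Joint pmf of (X, Y): X a fair die roll, Y = X mod 2 (so Y depends on X).
joint = {(x, x % 2): Fraction(1, 6) for x in range(1, 7)}

ex = sum(x * p for (x, _), p in joint.items())
ey = sum(y * p for (_, y), p in joint.items())

# Definition: Cov(X, Y) = E{(X - muX)(Y - muY)}.
cov_def = sum((x - ex) * (y - ey) * p for (x, y), p in joint.items())
# Shortcut: Cov(X, Y) = E(XY) - E(X)E(Y).
cov_short = sum(x * y * p for (x, y), p in joint.items()) - ex * ey

assert cov_def == cov_short
print(cov_def)  # negative here: Y = 1 exactly on the smaller (odd) rolls
```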