Probability Flashcards
What does it mean for X_n to converge to X almost surely?
P(X_n -> X as n -> infinity) = 1
What does it mean for X_n to converge to X in probability?
For all e>0, P(|X_n - X| < e) -> 1 as n -> infinity
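Optional illustration, not part of the card: a minimal Monte Carlo sketch in Python (numpy assumed), using the hypothetical sequence X_n = X + Z/n, for which the estimated P(|X_n - X| < e) should climb towards 1.

```python
import numpy as np

# Hypothetical example: X_n = X + Z/n with X, Z independent standard normals,
# so X_n -> X in probability. Estimate P(|X_n - X| < eps) for growing n.
rng = np.random.default_rng(0)
eps, trials = 0.1, 100_000
for n in [1, 5, 25, 125]:
    X = rng.standard_normal(trials)
    Z = rng.standard_normal(trials)
    X_n = X + Z / n
    print(n, np.mean(np.abs(X_n - X) < eps))  # estimated P(|X_n - X| < eps), should approach 1
```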
What does it mean for X_n to converge to X in distribution?
For every x such that F is continuous at x, F_n(x) -> F(x) as n -> infinity, where F_n and F are the CDFs of X_n and X
Rank convergence in probability, in distribution and almost surely in order from strongest to weakest
- almost surely
- in probability
- in distribution
Prove that almost sure convergence implies convergence in probability
- Fix e>0
- Define the event A_N = {|X_n - X| < e for all n >= N}
- Suppose X_n -> X a.s.
- {X_n -> X} is contained in U_N A_N, so P(U_N A_N) = 1
- The events A_N increase with N, so by continuity of probability, lim N -> infinity P(A_N) = P(U_N A_N) = 1
- A_N implies |X_N - X| < e
- So P(|X_N - X| < e) -> 1 as N -> infinity
Prove that convergence in probability implies convergence in distribution
- Fix x such that F is continuous at x, and fix e>0
- F_n(x) = P(X_n <= x)
  <= P(X <= x+e or |X_n - X| > e)
  <= P(X <= x+e) + P(|X_n - X| > e)
  -> F(x+e) as n -> infinity
- So F_n(x) <= F(x+e) + e for large enough n
- Similarly, working with 1 - F_n(x), F_n(x) >= F(x-e) - e for large enough n
- Since F is continuous at x, letting e -> 0 gives F_n(x) -> F(x)
Prove that if X_n -> c, constant, in distribution, then X_n -> c in probability
- Fix e>0. Both c-e and c+e/2 are continuity points of the CDF of the constant c, so convergence in distribution gives:
- lim n->∞ F_Xn(c-e) = 0
- lim n->∞ F_Xn(c+e/2) = 1
- P(|Xn - c| >= e)
  = P(Xn <= c-e or Xn >= c+e)
  = P(Xn <= c-e) + P(Xn >= c+e)    (disjoint events)
  <= F_Xn(c-e) + P(Xn > c+e/2)
  = F_Xn(c-e) + 1 - F_Xn(c+e/2)
  -> 0 + 1 - 1 = 0 as n -> ∞
- Also P(|Xn - c| >= e) >= 0, so lim n->∞ P(|Xn - c| >= e) = 0, i.e. Xn -> c in probability
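Optional illustration: the quantities in this proof can be checked numerically for an assumed example, X_n ~ N(c, 1/n), which converges in distribution to the constant c (scipy assumed).

```python
import numpy as np
from scipy.stats import norm

# Assumed example: X_n ~ N(c, 1/n) converges in distribution to the constant c.
c, eps = 2.0, 0.1
for n in [10, 100, 1_000, 10_000]:
    F = lambda x: norm.cdf(x, loc=c, scale=1 / np.sqrt(n))   # CDF of X_n
    p_far = F(c - eps) + (1 - F(c + eps))                    # P(|X_n - c| >= eps), X_n continuous
    print(n, F(c - eps), F(c + eps / 2), p_far)              # -> 0, -> 1, -> 0
```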
State the weak law of large numbers
- X_1, X_2, … iid with finite mean µ (the proof below also assumes finite variance σ^2)
- S_n = X_1 + X_2 + … + X_n
- Then S_n / n converges in probability to µ as n -> infinity
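Optional illustration: a Monte Carlo sketch of the statement under an assumed Exponential(1) example (so µ = 1); the estimated P(|S_n/n - µ| >= e) should shrink as n grows (numpy assumed).

```python
import numpy as np

# Assumed example: X_i ~ Exponential(1), so mu = 1.
rng = np.random.default_rng(1)
mu, eps, trials = 1.0, 0.05, 10_000
for n in [10, 100, 1_000]:
    means = rng.exponential(1.0, size=(trials, n)).mean(axis=1)   # samples of S_n / n
    print(n, np.mean(np.abs(means - mu) >= eps))                  # estimated P(|S_n/n - mu| >= eps)
```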
Prove the weak law of large numbers
P(|Sn/n - µ| >= e)
<= Var(Sn/n) / e^2    (Chebyshev's inequality, since E[Sn/n] = µ)
= σ^2 / (n e^2)    (Var(Sn/n) = σ^2/n by independence, where σ^2 = Var(X1))
-> 0 as n -> infinity
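Optional illustration: the Chebyshev bound σ^2/(n e^2) used above can be compared with the estimated probability, again under an assumed Exponential(1) example (µ = σ^2 = 1).

```python
import numpy as np

# Assumed example: X_i ~ Exponential(1), so mu = 1 and sigma^2 = 1.
rng = np.random.default_rng(2)
eps, trials = 0.1, 10_000
for n in [50, 200, 800]:
    means = rng.exponential(1.0, size=(trials, n)).mean(axis=1)
    empirical = np.mean(np.abs(means - 1.0) >= eps)
    bound = 1.0 / (n * eps**2)                       # sigma^2 / (n eps^2)
    print(n, empirical, bound)                       # empirical tail <= Chebyshev bound
```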
State Markov’s inequality
- X a random variable taking non-negative values
- Then for any z > 0, P(X >= z) <= E[X] / z
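Optional illustration: a quick Monte Carlo check of the bound for an assumed non-negative example, X ~ Exponential(1) with E[X] = 1.

```python
import numpy as np

# Assumed example: X ~ Exponential(1), so E[X] = 1 and P(X >= z) = exp(-z).
rng = np.random.default_rng(3)
X = rng.exponential(1.0, size=1_000_000)
for z in [1.0, 2.0, 4.0]:
    print(z, np.mean(X >= z), 1.0 / z)   # empirical tail vs Markov bound E[X]/z
```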
Prove Markov’s inequality
- Fix z > 0 and let X_z = z I{X >= z}
- X_z takes the value 0 whenever X is in [0, z) and the value z whenever X is in [z, ∞), so X >= X_z always
- Then E[X] >= E[X_z] = z E[I{X >= z}] = z P(X >= z)
- Rearrange
State Chebyshev’s inequality
- Y random variable with finite mean and variance
- Then for any e>0,
P(|Y - E[Y]| >= e) <= Var(Y) / e^2
Prove Chebyshev’s inequality
P(|Y - E[Y]| >= e) = P((Y - E[Y])^2 >= e^2)
(by Markov’s) <= E[(Y - E[Y])^2] / e^2
= Var(Y) / e^2
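Optional illustration: a Monte Carlo check of the bound just proved, under an assumed Y ~ Uniform(0, 1) example (E[Y] = 1/2, Var(Y) = 1/12).

```python
import numpy as np

# Assumed example: Y ~ Uniform(0, 1), so E[Y] = 1/2 and Var(Y) = 1/12.
rng = np.random.default_rng(4)
Y = rng.uniform(0.0, 1.0, size=1_000_000)
for eps in [0.2, 0.3, 0.4]:
    print(eps, np.mean(np.abs(Y - 0.5) >= eps), (1 / 12) / eps**2)  # tail vs Chebyshev bound
```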
State the Strong Law of Large Numbers
- X_1, X_2, … iid with finite mean µ
- S_n = X_1 + X_2 + … + X_n
- Then S_n / n converges almost surely to µ as n -> infinity
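Optional illustration: a single long sample path under an assumed Bernoulli(0.3) example; the running mean S_n/n should settle near µ = 0.3.

```python
import numpy as np

# Assumed example: one long iid Bernoulli(0.3) sequence, mu = 0.3.
rng = np.random.default_rng(5)
x = rng.binomial(1, 0.3, size=1_000_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)   # S_n / n along one sample path
for n in [100, 10_000, 1_000_000]:
    print(n, running_mean[n - 1])                        # should approach 0.3
```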
What are the differences between the WLLN and the SLLN?
- The WLLN as stated and proved here assumes finite variance; the SLLN needs only a finite mean
- The WLLN gives convergence in probability; the SLLN gives almost sure convergence, which is stronger
State the Central Limit Theorem
- X_1, X_2, … iid with mean µ and finite variance σ^2 > 0
- S_n = X_1 + X_2 + … + X_n
- Then (S_n - nµ) / (σ sqrt(n)) converges in distribution to N(0,1) as n -> infinity
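Optional illustration: comparing the empirical CDF of (S_n - nµ)/(σ sqrt(n)) with the N(0,1) CDF, under an assumed Exponential(1) example (µ = σ = 1; scipy assumed).

```python
import numpy as np
from scipy.stats import norm

# Assumed example: X_i ~ Exponential(1), so mu = 1 and sigma = 1.
rng = np.random.default_rng(6)
n, trials = 500, 20_000
S = rng.exponential(1.0, size=(trials, n)).sum(axis=1)   # samples of S_n
Z = (S - n * 1.0) / (1.0 * np.sqrt(n))                   # standardised sums
for x in [-1.0, 0.0, 1.0, 2.0]:
    print(x, np.mean(Z <= x), norm.cdf(x))               # empirical vs N(0,1) CDF
```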
State the uniqueness theorem for PGFs
If X and Y have the same PGF, then they have the same distribution.
State the convergence theorem for PGFs
Let X, X_1, X_2, … take values in {0, 1, 2, …}. Then G_Xn(s) → G_X(s) as n → ∞ for all s ∈ [0, 1], if and only if p_Xn(k) → p_X(k) as n → ∞ for all k = 0, 1, 2, …
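Optional illustration under the classical assumed example X_n ~ Binomial(n, λ/n), which converges to Poisson(λ): both G_Xn(s) and p_Xn(k) approach their Poisson limits (scipy assumed for the binomial pmf).

```python
from math import exp, factorial
from scipy.stats import binom

# Assumed example: X_n ~ Binomial(n, lam/n) converges in distribution to Poisson(lam).
lam, s, k = 2.0, 0.7, 3
for n in [10, 100, 1_000, 10_000]:
    G_n = (1 - lam / n + lam * s / n) ** n    # PGF of Binomial(n, lam/n) at s
    p_n = binom.pmf(k, n, lam / n)            # p_Xn(k) = P(X_n = k)
    print(n, G_n, p_n)
print("Poisson limits:", exp(lam * (s - 1)), exp(-lam) * lam**k / factorial(k))
```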
State the uniqueness theorem for MGFs
If X and Y are random variables with the same moment generating function,
which is finite on [−t0, t0] for some t0 > 0, then X and Y have the same distribution.
State the continuity theorem for MGFs
- Suppose M_Y and M_X1, M_X2, … are all finite on [−t0, t0] for some t0 > 0
- If M_Xn(t) → M_Y(t) as n → ∞ for all t ∈ [−t0, t0],
- then X_n → Y in distribution as n → ∞.
State the change of variables formula for changing from X,Y to U,V
- Let D, R ⊂ R^2
- Let T : D → R be a bijection with continuously differentiable inverse S : R → D
- If (X, Y) is a D-valued pair of random variables with joint density function f_X,Y
- Then (U, V) = T(X, Y) is jointly continuous with joint density function
  f_U,V(u, v) = f_X,Y(S(u, v)) |J(u, v)| for (u, v) in R,
  where (x, y) = S(u, v) and J(u, v) = x_u y_v − x_v y_u (partial derivatives of the inverse map S)
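Optional illustration: a symbolic sketch of the formula for an assumed example (sympy assumed): X, Y independent Exponential(1) with (U, V) = T(X, Y) = (X + Y, X/(X + Y)), whose inverse is S(u, v) = (uv, u(1 − v)).

```python
import sympy as sp

# Assumed example: X, Y independent Exponential(1); T(x, y) = (x + y, x/(x + y)),
# with inverse S(u, v) = (u*v, u*(1 - v)) on u > 0, 0 < v < 1.
u, v = sp.symbols('u v', positive=True)
x, y = u * v, u * (1 - v)                                            # (x, y) = S(u, v)
J = sp.diff(x, u) * sp.diff(y, v) - sp.diff(x, v) * sp.diff(y, u)    # x_u y_v - x_v y_u
f_XY = sp.exp(-x) * sp.exp(-y)                                       # joint density of (X, Y)
f_UV = sp.simplify(f_XY * sp.Abs(J))
print(J, f_UV)   # J = -u, f_UV = u*exp(-u): U ~ Gamma(2,1), V ~ Uniform(0,1), independent
```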