10. Asymptotics Flashcards
Convergence in probability
for every epsilon > 0: lim Pr{|Xn - X| < epsilon} = 1 as n -> infinity
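A minimal simulation sketch of this definition, under the illustrative assumption Xn = Z/n with Z standard normal (so Xn -> 0 in probability); the epsilon and the values of n are also arbitrary choices:

```python
import numpy as np

# Sketch: Xn = Z / n with Z ~ N(0, 1), so Xn -> 0 in probability.
# Xn, epsilon, and the values of n are illustrative, not from the cards.
rng = np.random.default_rng(0)
eps = 0.1
for n in (1, 10, 100, 1000):
    z = rng.standard_normal(100_000)
    print(n, np.mean(np.abs(z / n) >= eps))  # estimates Pr{|Xn - 0| >= eps}
```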
Convergence in mean of order k
lim E{|Xn-X|^k}=0
if k = 2, we have convergence in quadratic mean
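Continuing the same illustrative example (Xn = Z/n, Z standard normal), a quick Monte Carlo check that E{|Xn - 0|^k} -> 0; here E{|Z/n|^k} = E{|Z|^k}/n^k exactly:

```python
import numpy as np

# Sketch: for Xn = Z / n, E[|Xn - 0|^k] = E[|Z|^k] / n^k -> 0,
# i.e. Xn -> 0 in k-th mean (quadratic mean for k = 2). Choices illustrative.
rng = np.random.default_rng(0)
k = 2
for n in (1, 10, 100):
    z = rng.standard_normal(100_000)
    print(n, np.mean(np.abs(z / n) ** k))  # Monte Carlo estimate of E[|Xn|^k]
```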
Convergence in distribution
lim Fn(x) = F(x) at every point x where F is continuous
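A sketch of convergence in distribution on a classical example (the choice of example and of the evaluation point x are illustrative, not from the cards): if Mn is the maximum of n i.i.d. Uniform(0,1) draws, then Fn(x) = Pr{n(1 - Mn) <= x} = 1 - (1 - x/n)^n -> 1 - e^{-x}, the Exponential(1) CDF:

```python
import numpy as np

# Sketch: n * (1 - Mn) -> Exponential(1) in distribution, where Mn is the
# max of n i.i.d. Uniform(0,1) draws. Example and x are illustrative.
rng = np.random.default_rng(0)
x = 1.0
for n in (5, 50, 500):
    m = rng.uniform(size=(10_000, n)).max(axis=1)
    print(n, np.mean(n * (1 - m) <= x), 1 - np.exp(-x))  # Fn(x) vs F(x)
```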
Relationship between convergences
- convergence in k-th mean implies convergence in probability, which in turn implies convergence in distribution (the converses fail in general; see the sketch below)
- convergence in distribution to a real number (a constant) implies convergence in probability
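A sketch of why convergence in distribution alone does not upgrade to convergence in probability (Xn = -Z is a standard counterexample, an illustrative choice):

```python
import numpy as np

# Sketch: Xn = -Z and X = Z (Z ~ N(0,1)) have the same law for every n, so
# Xn -> X in distribution trivially, yet |Xn - X| = 2|Z| never shrinks;
# the upgrade to convergence in probability needs a constant limit.
rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)
print(np.mean(np.abs(-z - z) >= 0.1))  # ~Pr{2|Z| >= 0.1}, same for all n
```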
Proof that conv. in k-th mean implies conv. in prob
let Y = Xn - X and fix epsilon > 0, then
E{|Xn - X|^k} = E{|Y|^k} = int |y|^k f_Y(y) dy
= int_{|y| >= epsilon} |y|^k f_Y(y) dy + int_{|y| < epsilon} |y|^k f_Y(y) dy
>= int_{|y| >= epsilon} |y|^k f_Y(y) dy >= epsilon^k int_{|y| >= epsilon} f_Y(y) dy = epsilon^k Pr{|Y| >= epsilon}
hence 0 <= Pr{|Xn - X| >= epsilon} <= (1/epsilon^k) E{|Xn - X|^k} (Markov's inequality), and the right-hand side -> 0 by convergence in k-th mean, so Pr{|Xn - X| >= epsilon} -> 0
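A quick numerical check of the key bound Pr{|Y| >= epsilon} <= E{|Y|^k} / epsilon^k (Markov's inequality applied to |Y|^k); Y ~ N(0,1), epsilon, and k are illustrative choices:

```python
import numpy as np

# Sketch: check Pr{|Y| >= eps} <= E[|Y|^k] / eps^k for Y ~ N(0,1).
rng = np.random.default_rng(0)
y = rng.standard_normal(1_000_000)
eps, k = 1.5, 2
lhs = np.mean(np.abs(y) >= eps)
rhs = np.mean(np.abs(y) ** k) / eps**k
print(lhs, rhs, lhs <= rhs)  # the bound holds (here it is far from tight)
```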
Central Limit th.
(X* - m) / sqrt(v/n) ~ N(0, 1) asymptotically,
where X* is the sample mean of n i.i.d. variables with mean m and finite variance v
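A simulation sketch of the theorem; the Exponential(1) population (m = 1, v = 1) and the sample size are illustrative choices:

```python
import numpy as np

# Sketch: standardized sample means of i.i.d. Exponential(1) draws
# (mean m = 1, variance v = 1) are approximately N(0, 1) for large n.
rng = np.random.default_rng(0)
n = 1_000
xbar = rng.exponential(1.0, size=(10_000, n)).mean(axis=1)
z = (xbar - 1.0) / np.sqrt(1.0 / n)  # (X* - m) / sqrt(v / n)
print(z.mean(), z.std())  # close to 0 and 1
```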
Slutsky th.
Let Xn and Yn be two sequences of random variables, with Xn convergent in distribution to X and Yn convergent in probability to a constant y; then, in distribution:
- Xn +/- Yn -> X +/- y
- Xn Yn -> X y
- Xn / Yn -> X / y (provided y != 0)
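A sketch of the typical use of Slutsky's theorem (replacing the true variance with the sample variance when studentizing); the Exponential(1) population and the sizes are illustrative:

```python
import numpy as np

# Sketch: Zn = (X* - m)/sqrt(v/n) -> N(0,1) in distribution and the sample
# variance Sn^2 -> v in probability, so by Slutsky the studentized ratio
# (X* - m)/sqrt(Sn^2/n) is still approximately N(0, 1).
rng = np.random.default_rng(0)
n = 500
x = rng.exponential(1.0, size=(10_000, n))  # m = 1, v = 1
t = (x.mean(axis=1) - 1.0) / np.sqrt(x.var(axis=1, ddof=1) / n)
print(t.mean(), t.std())  # approximately 0 and 1
```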
Mann-Wald th.
if Xn is such that:
sqrt(n) (Xn - theta) ~ N(0, v(theta)) asymptotically,
and g is a differentiable function with non-vanishing first derivative at theta, then:
sqrt(n) (g(Xn) - g(theta)) ~ N(0, v(theta) * [g'(theta)]^2) asymptotically
It allows one to construct an asymptotically normal estimator from another (this result is also known as the delta method).
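A simulation sketch with the illustrative choices g(x) = x^2, theta = 1, and an Exponential(1) population (so v(theta) = 1 and the predicted limit variance is v(theta) * [g'(theta)]^2 = 4):

```python
import numpy as np

# Sketch: sqrt(n) * (g(Xbar) - g(theta)) should be close to
# N(0, v * g'(theta)^2) with g(x) = x**2, theta = v = 1, g'(theta) = 2.
rng = np.random.default_rng(0)
n = 1_000
xbar = rng.exponential(1.0, size=(10_000, n)).mean(axis=1)  # mean 1, var 1/n
w = np.sqrt(n) * (xbar**2 - 1.0**2)
print(w.std(), 2.0)  # empirical vs predicted standard deviation
```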
Consistency
An estimator Tn of g(theta) is consistent if it converges in probability to g(theta); convergence in quadratic mean is also sufficient, since it implies convergence in probability
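A sketch of consistency for the sample mean as an estimator of g(theta) = theta; the Exponential population with theta = 2 and the epsilon are illustrative choices:

```python
import numpy as np

# Sketch: Tn = sample mean of n Exponential draws with mean theta = 2;
# Pr{|Tn - theta| >= eps} shrinks toward 0 as n grows (consistency).
rng = np.random.default_rng(0)
theta, eps = 2.0, 0.1
for n in (10, 100, 1000, 10_000):
    tn = rng.exponential(theta, size=(1_000, n)).mean(axis=1)
    print(n, np.mean(np.abs(tn - theta) >= eps))
```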
Asymptotically normal estimator
An estimator Tn of theta is asymptotically normal if, for some V > 0, sqrt(n) (Tn - theta) ~ N(0, V) asymptotically
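A sketch with an estimator other than the mean (a standard example, not from the cards): the sample median of N(0,1) data is asymptotically normal with V = pi/2:

```python
import numpy as np

# Sketch: for Tn = sample median of n i.i.d. N(0,1) draws and theta = 0,
# sqrt(n) * (Tn - theta) ~ N(0, V) with V = pi / 2 (a classical result).
rng = np.random.default_rng(0)
n = 1_000
med = np.median(rng.standard_normal((10_000, n)), axis=1)
print((np.sqrt(n) * med).var(), np.pi / 2)  # empirical vs theoretical V
```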