Asymptotic Theory Flashcards
What does it mean for an estimator to be consistent?
The estimator converges in probability to the true parameter as the sample size grows.
What is the Weak Law of Large Numbers (WLLN)?
Given a sequence of i.i.d. random variables with a finite first moment, the sample mean converges in probability to the expectation of the random variable.
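A quick numerical sketch of the WLLN in NumPy (the Exponential distribution and the sample sizes are my own illustrative choices, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0  # true expectation of an Exponential variable with scale 2

# WLLN: the sample mean concentrates around mu as n grows.
errors = []
for n in (100, 10_000, 1_000_000):
    y = rng.exponential(scale=mu, size=n)
    errors.append(abs(y.mean() - mu))

# The absolute error of the sample mean shrinks with n.
print(errors)
```

With a million draws the error is on the order of 1/sqrt(n), i.e. a few thousandths.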
What is Markov’s Inequality?
For a non-negative random variable X and any a > 0, P(X ≥ a) ≤ E[X]/a.
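Markov’s inequality, P(X ≥ a) ≤ E[X]/a for non-negative X and a > 0, can be checked empirically; the Exponential example below is an illustrative choice, not from the source:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=100_000)  # non-negative, E[X] = 1

# Markov: the empirical tail probability never exceeds E[X] / a.
for a in (0.5, 1.0, 3.0, 5.0):
    empirical_tail = (x >= a).mean()
    bound = x.mean() / a
    assert empirical_tail <= bound
    print(a, empirical_tail, bound)
```

For the Exponential the true tail e^(-a) is far below the bound 1/a, so the check is comfortably satisfied.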
What is the WLLN for the vector case?
A sequence of random vectors Zn converges in probability to z if, for every epsilon > 0, the probability that the norm of Zn - z is smaller than epsilon tends to 1 as n goes to infinity. The vector WLLN then states that the vector of sample means converges in probability to the vector of expectations.
What is the Central Limit Theorem?
Given a sequence of i.i.d. random variables with finite mean and variance, the square root of n multiplied by the difference between the sample mean and the true mean converges in distribution to a Normal with mean zero and variance sigma squared.
What is the WLLN fundamentally about?
It is about the convergence of estimators to the parameters they estimate.
What is the CLT fundamentally about?
It links estimators, their parameters, and the rate of convergence to a Normal limiting distribution.
What are the three assumptions of the Lindeberg-Lévy CLT?
That the sequence of random variables is i.i.d., and that its first and second moments exist (are finite).
What are the two assumptions for the WLLN to hold?
That the sequence of random variables be i.i.d. and that its first moment exist, i.e. E|y| < infinity.
What is the Continuous Mapping Theorem?
Given a sequence of random vectors Zn and a function g continuous at z, if Zn converges in probability to z, then g(Zn) converges in probability to g(z).
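A minimal sketch of the CMT in action (the choice of g = log and the Exponential example are my own assumptions): since the sample mean converges in probability to mu, g(sample mean) converges in probability to g(mu).

```python
import numpy as np

rng = np.random.default_rng(3)
mu, n = 2.0, 1_000_000

# Sample mean converges in probability to mu ...
ybar = rng.exponential(scale=mu, size=n).mean()

# ... so by the CMT, log(ybar) converges in probability to log(mu).
gap = abs(np.log(ybar) - np.log(mu))
print(gap)
```

With a million draws the gap is tiny, consistent with g(Zn) tracking g(z).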
State Lyapunov’s Inequality.
(E|A|^r)^(1/r) ≤ (E|A|^p)^(1/p) for 1 ≤ r ≤ p
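The inequality says Lp norms are nondecreasing in p; this holds exactly for any empirical distribution, so a numerical check (Gaussian draws, my own example) passes deterministically:

```python
import numpy as np

rng = np.random.default_rng(4)
a = rng.normal(size=10_000)

# Lyapunov: (E|A|^r)^(1/r) <= (E|A|^p)^(1/p) for 1 <= r <= p.
# Empirical Lp norms for p = 1, 2, 3, 4 should be nondecreasing.
norms = [(np.abs(a) ** p).mean() ** (1 / p) for p in (1, 2, 3, 4)]
assert all(norms[i] <= norms[i + 1] for i in range(3))
print(norms)
```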
State Minkowski’s inequality.
For any random m×n matrices X and Y with p ≥ 1, we have that
(E[|X+Y|^p])^(1/p) ≤ (E[|X|^p])^(1/p) + (E[|Y|^p])^(1/p)
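Minkowski is the triangle inequality for Lp norms, and like Lyapunov it holds exactly for the empirical distribution; a scalar numerical check (Gaussian draws, my own example):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=10_000)
y = rng.normal(size=10_000)

# Minkowski, p >= 1:
# (E|X+Y|^p)^(1/p) <= (E|X|^p)^(1/p) + (E|Y|^p)^(1/p)
for p in (1.0, 2.0, 3.0):
    lhs = (np.abs(x + y) ** p).mean() ** (1 / p)
    rhs = ((np.abs(x) ** p).mean() ** (1 / p)
           + (np.abs(y) ** p).mean() ** (1 / p))
    assert lhs <= rhs
    print(p, lhs, rhs)
```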