Lecture 1 Flashcards
Give 3 definitions of convergence in probability.
Hint: two based only on delta, one based on both delta and epsilon.
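As a hedged sketch of an answer (the lecture's exact wording and notation may differ), three equivalent formulations commonly given are:

```latex
% Sketch: three equivalent definitions of X_n ->p X (notation assumed).
% (i) the basic delta form:
\forall \delta > 0:\quad \lim_{n\to\infty} P\bigl(|X_n - X| \ge \delta\bigr) = 0
% (ii) the complementary delta form:
\forall \delta > 0:\quad \lim_{n\to\infty} P\bigl(|X_n - X| < \delta\bigr) = 1
% (iii) the delta-epsilon form:
\forall \delta, \varepsilon > 0\ \ \exists N:\quad n \ge N \implies P\bigl(|X_n - X| \ge \delta\bigr) < \varepsilon
```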
Give the definition of weak consistency.
Give an intuitive definition of weak consistency.
Hint: for n large enough, the probability that theta_hat_n differs from theta by at least delta is very small.
Give an example of a simple consistent estimator.
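A hedged illustration (not from the lecture): the sample mean of i.i.d. Uniform(0, 1) draws as a weakly consistent estimator of the mean, with the deviation probability estimated by Monte Carlo for growing n.

```python
import numpy as np

# Sketch: weak consistency of the sample mean, checked by simulation.
# All names and parameter choices here are illustrative assumptions.
rng = np.random.default_rng(0)
delta = 0.05
theta = 0.5  # true mean of Uniform(0, 1)

def deviation_prob(n, reps=2000):
    """Estimate P(|theta_hat_n - theta| >= delta) by Monte Carlo."""
    samples = rng.uniform(0.0, 1.0, size=(reps, n))
    theta_hat = samples.mean(axis=1)
    return np.mean(np.abs(theta_hat - theta) >= delta)

probs = [deviation_prob(n) for n in (10, 100, 1000)]
print(probs)  # the deviation probability shrinks as n grows
```

The shrinking probabilities match the intuitive definition on the previous card: for fixed delta, the chance of a deviation of at least delta vanishes as n grows.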
Give 3 definitions of almost sure convergence.
Give an example of a sequence that converges in probability and prove that it converges in probability.
Give an example of a sequence that converges almost surely and prove that it converges almost surely.
Give the definition of uniform convergence.
Give the definition of complete convergence.
Give the relations between the three modes of convergence.
Which is the direction of implication between convergence in probability and almost sure convergence? Prove it.
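A hedged sketch (the lecture's proof and counterexample may differ): almost sure convergence implies convergence in probability, and the "typewriter" sequence is a standard counterexample to the converse.

```latex
% Direction of the implication:
X_n \xrightarrow{a.s.} X \;\implies\; X_n \xrightarrow{p} X .
% Standard counterexample for the converse (typewriter sequence on [0,1]
% with Lebesgue measure): indicators of shrinking intervals sweeping [0,1],
X_n = \mathbf{1}_{\left[\frac{n - 2^k}{2^k},\, \frac{n - 2^k + 1}{2^k}\right]},
\qquad 2^k \le n < 2^{k+1}.
% Then P(|X_n| \ge \delta) = 2^{-k} \to 0, so X_n \to 0 in probability,
% but every omega lies in infinitely many of the intervals, so X_n(omega)
% converges for no omega.
```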
Which is the direction of implication between almost sure convergence and complete convergence? Prove it.
State formally the relation between vector and element-wise convergence.
Clarify the key point.
State Slutzky's theorem (with the two key assumptions).
State and prove Slutzky's theorem.
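A hedged sketch: econometrics lectures often state Slutzky's theorem in the continuous-mapping form below, whose two key assumptions are convergence in probability and continuity of g at the limit; the lecture's exact version may differ.

```latex
% Assumed statement (plim form):
x_n \xrightarrow{p} \alpha \ \text{ and } \ g \text{ continuous at } \alpha
\quad\implies\quad g(x_n) \xrightarrow{p} g(\alpha),
% i.e.
\operatorname{plim}\, g(x_n) = g\bigl(\operatorname{plim}\, x_n\bigr).
```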
State the two corollaries of Slutzky's theorem, and prove the second corollary.
Suppose we have a consistent estimator for M. Show that if we also have a consistent estimator for sigma, we obtain a consistent estimator for the asymptotic variance of beta hat.
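A hedged sketch of the argument, assuming (as is standard, though the lecture's exact form may differ) that the asymptotic variance is \(\sigma^2 M^{-1}\) with M invertible:

```latex
% Plug-in consistency of the asymptotic variance estimator:
\hat{M} \xrightarrow{p} M \ \text{ and } \ \hat{\sigma}^2 \xrightarrow{p} \sigma^2
\quad\implies\quad
\hat{\sigma}^2 \hat{M}^{-1} \xrightarrow{p} \sigma^2 M^{-1},
% since matrix inversion is continuous at any invertible M and products of
% sequences converging in probability converge to the product of the limits
% (both by Slutzky's theorem).
```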
Give the definition of an unbiased estimator.
Give the definition of an asymptotically unbiased estimator.
State the simple conditions for which beta hat is an unbiased estimator for beta.
Give an unbiased estimator for sigma and state the conditions under which it is unbiased.
Hint: u_i i.i.d. and z_i deterministic.
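A hedged simulation sketch: the card says "sigma", but the standard unbiased object is \(s^2 = \mathrm{SSR}/(n-K)\) for \(\sigma^2\); this sketch assumes that reading, with an assumed linear model \(y = Z\beta + u\), deterministic regressors, and i.i.d. errors. All variable names and parameter values are illustrative.

```python
import numpy as np

# Sketch: Monte Carlo check that s^2 = SSR / (n - K) is approximately
# unbiased for sigma^2 under deterministic z_i and i.i.d. u_i (assumed model).
rng = np.random.default_rng(1)
n, K, sigma2 = 50, 2, 4.0
Z = np.column_stack([np.ones(n), np.linspace(0.0, 1.0, n)])  # fixed regressors
beta = np.array([1.0, -2.0])

s2_draws = []
for _ in range(5000):
    u = rng.normal(0.0, np.sqrt(sigma2), size=n)  # i.i.d. errors
    y = Z @ beta + u
    beta_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)  # OLS fit
    resid = y - Z @ beta_hat
    s2_draws.append(resid @ resid / (n - K))  # degrees-of-freedom correction

print(np.mean(s2_draws))  # should be close to sigma2 = 4.0
```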
Give an asymptotically unbiased estimator for sigma and state the conditions under which it is asymptotically unbiased.
State both directions of the relationship between consistency and asymptotic unbiasedness, and prove them.
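A hedged sketch of standard counterexamples showing that, in general, neither property implies the other (the lecture may use different examples or impose extra conditions such as uniform integrability):

```latex
% Consistent but not asymptotically unbiased:
\hat\theta_n =
\begin{cases}
\theta & \text{w.p. } 1 - 1/n \\
n      & \text{w.p. } 1/n
\end{cases}
\ \implies\
\hat\theta_n \xrightarrow{p} \theta, \quad
E[\hat\theta_n] = \theta\,(1 - 1/n) + 1 \to \theta + 1 .
% Asymptotically unbiased (indeed exactly unbiased) but not consistent:
\hat\theta_n = X_1, \ \ X_i \text{ i.i.d. with mean } \theta:\quad
E[\hat\theta_n] = \theta \ \forall n, \ \text{ but } \
\hat\theta_n \not\xrightarrow{p} \theta \ \text{(unless } X_1 \text{ is degenerate)}.
```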