Chapter 2 Asymptotic theory Flashcards
Continuous mapping theorem
If a sequence of random vectors converges (in probability, almost surely, or in distribution) to a limit, then a continuous function applied to the sequence converges in the same sense to that function applied to the limit.
Note that the matrix inverse is a continuous function (at nonsingular matrices).
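A minimal simulation sketch of the CMT (assuming NumPy; the distribution and numbers are illustrative): the sample mean of iid Uniform(0,1) draws converges to 0.5, and since g(x) = 1/x is continuous at 0.5, the CMT says the reciprocal of the sample mean converges to 2.

```python
import numpy as np

rng = np.random.default_rng(0)

# z_i iid Uniform(0, 1): the sample mean converges to E[z_i] = 0.5.
n = 1_000_000
z = rng.uniform(0.0, 1.0, size=n)
zbar = z.mean()

# CMT: g(x) = 1/x is continuous at 0.5, so g(zbar) converges to g(0.5) = 2.
print(zbar, 1.0 / zbar)
```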
Slutsky’s lemma
If x_n converges in distribution to x and y_n converges in probability to a constant c, then x_n + y_n converges in distribution to x + c, and y_n x_n converges in distribution to cx.
How does Slutsky’s lemma apply to a matrix times a rv?
If A_n converges in probability to a constant matrix A and x_n converges in distribution to x, then A_n x_n converges in distribution to Ax.
Kolmogorov (strong) Law of large numbers
The sample average converges almost surely (and hence in probability) to the mean, if z_i is iid with a finite mean.
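A quick sketch of the strong LLN (illustrative setup, assuming NumPy): along one iid sample path, the running averages settle at the population mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Kolmogorov SLLN sketch: z_i iid Exponential with mean 2; the running
# average along a single sample path converges to 2.
z = rng.exponential(scale=2.0, size=200_000)
running_mean = np.cumsum(z) / np.arange(1, z.size + 1)
print(running_mean[999], running_mean[-1])  # late averages hug the mean 2
```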
Stationarity
The joint distribution is invariant over time: it depends only on relative positions, not on the absolute date.
Ergodicity
Two groups of observations become (asymptotically) independent as the time separation between them goes to infinity.
Ergodic theorem
The iid requirement is replaced by stationarity and ergodicity: if z_i is stationary and ergodic with a finite mean, the sample average still converges almost surely to the mean.
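A sketch of the ergodic theorem (illustrative parameters, assuming NumPy): a stationary AR(1) process is serially dependent, so the Kolmogorov LLN does not apply, but it is stationary and ergodic for |phi| < 1, so its time average still converges to the mean.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stationary AR(1): z_t = mu + phi*(z_{t-1} - mu) + u_t, not iid but
# stationary and ergodic for |phi| < 1, so the time average -> mu.
mu, phi, n = 3.0, 0.8, 200_000
u = rng.normal(0.0, 1.0, size=n)
z = np.empty(n)
z[0] = mu
for t in range(1, n):
    z[t] = mu + phi * (z[t - 1] - mu) + u[t]
print(z.mean())  # close to mu = 3 despite the serial dependence
```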
LLN to CLT 1- Classical Lindeberg-Lévy CLT
z_i is iid with finite variance.
Ergodicity restricts dependence, but not enough for the classical LL-CLT, which requires full independence (iid).
LLN to CLT 2- Martingale Difference CLT
A martingale difference sequence (MDS) has conditional mean zero given its own past, which implies no serial correlation. If we have this + SE (stationarity and ergodicity), we have the MD-CLT.
Note: the conditional variance remains unrestricted
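A simulation sketch of the MDS CLT (illustrative construction, assuming NumPy): g_t = u_t * u_{t-1} with u_t iid N(0,1) is an MDS (conditional mean zero given the past) that is serially uncorrelated but not independent, and its conditional variance u_{t-1}^2 is not constant, matching the note above. The scaled sample mean is still approximately standard normal.

```python
import numpy as np

rng = np.random.default_rng(3)

# MDS example: g_t = u_t * u_{t-1}; E[g_t | past] = u_{t-1} * E[u_t] = 0,
# conditional variance u_{t-1}^2 varies, but E[g_t^2] = 1.
n, reps = 1_000, 4_000
u = rng.normal(size=(reps, n + 1))
g = u[:, 1:] * u[:, :-1]
stats = np.sqrt(n) * g.mean(axis=1)  # sqrt(n) * sample mean, per replication
print(stats.mean(), stats.std())     # approximately N(0, 1) across replications
```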
Linear regression assumptions (AT): 1
Linearity
Linear regression assumptions (AT): 2
Stochastic assumption:
{y_i, x_i} is jointly stationary and ergodic.
Linear regression assumptions (AT): 3
Predeterminedness assumption: E[x_i epsilon_i] = 0 (the regressors are orthogonal to the contemporaneous error).
Linear regression assumptions (AT): 4
Sigma_xx = E[x_i x_i'] is nonsingular (asymptotic full-rank condition).
Linear regression assumptions (AT): 5
x_i epsilon_i is an MDS with variance S = E[x_i x_i' epsilon_i^2], where S is a nonsingular variance-covariance matrix.
Consistency in OLS with Asymptotic Theory. Which assumptions were needed?
1 to 4.
Linearity gives the sampling-error formula; the stochastic assumption (stationarity + ergodicity) lets us apply the ergodic theorem to the sample moments; predeterminedness makes the second term vanish; nonsingularity of Sigma_xx (via the continuous mapping theorem) lets us invert the limit.
Thus, with infinite data we would recover beta exactly.
Since we don’t have infinite data, there is sampling uncertainty; the CLT describes convergence with finite data.
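A consistency sketch (illustrative iid design, which is a special case of stationary and ergodic; assuming NumPy): under linearity, E[x_i epsilon_i] = 0, and nonsingular Sigma_xx, the OLS estimate approaches beta as n grows.

```python
import numpy as np

rng = np.random.default_rng(4)

# y_i = x_i'beta + eps_i with E[x_i eps_i] = 0 and nonsingular Sigma_xx.
beta = np.array([1.0, -2.0])
for n in (100, 100_000):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    eps = rng.normal(size=n)
    y = X @ beta + eps
    b = np.linalg.solve(X.T @ X, X.T @ y)  # OLS estimate
    print(n, b)                            # b moves toward beta as n grows
```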
With assumptions 1 to 5, OLS is asymptotically normal
Since we do not know the asymptotic variance, we need an estimator for it. First, build the estimator for S
What is the limiting distribution of b?
What is the Robust SE for b?
What is the limit of the robust t^* test for scalar hypothesis?
We need the CMT for Avar-hat to converge to Avar, and Slutsky’s lemma to combine them.
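A sketch of the heteroskedasticity-robust (sandwich) variance and t-statistic (illustrative model, assuming NumPy): Avar-hat(b) = Sxx_hat^{-1} S_hat Sxx_hat^{-1} with S_hat = (1/n) * sum x_i x_i' e_i^2, and the robust SE of b is sqrt(diag(Avar-hat)/n).

```python
import numpy as np

rng = np.random.default_rng(5)

# Model with conditional heteroskedasticity: Var(eps | x) = 1 + x^2.
n = 50_000
beta = np.array([0.5, 2.0])
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
eps = rng.normal(size=n) * np.sqrt(1.0 + x**2)
y = X @ beta + eps

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
Sxx = X.T @ X / n                          # Sxx_hat
S_hat = (X * e[:, None] ** 2).T @ X / n    # S_hat = (1/n) sum x x' e^2
avar = np.linalg.inv(Sxx) @ S_hat @ np.linalg.inv(Sxx)
rse = np.sqrt(np.diag(avar) / n)           # robust standard errors
t = (b - beta) / rse                       # robust t against the true values
print(b, rse, t)
```

Under the truth, the robust t-statistics should behave like standard normal draws.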
How do we test linear hypothesis? what is the limiting distribution of this test?
What does specification testing tell us?
If x_i epsilon_i is an MDS and x_i includes a constant, then epsilon_i is also an MDS and is therefore serially uncorrelated.
What is autocovariance? And autocorrelation? How do we test MDS (H0)?
Assuming z_i is stationary.
H0: rho_1 = ... = rho_p = 0,
where p is a fixed integer. Failing to reject H0 is consistent with no serial correlation.
What are the estimators for autocovariance and autocorrelation? are they consistent? If MDS, what do they converge to?
Propose a statistic to test serial correlation. What happens to it if there is serial correlation?
note: this test statistic doesn’t always converge to the intended limiting distribution.
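A sketch of a Box-Pierce-type statistic Q = n * sum of squared sample autocorrelations up to lag p (illustrative white-noise data and a hand-rolled `autocorr` helper, assuming NumPy): under H0, Q is approximately chi-squared with p degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(6)

def autocorr(z, j):
    """Sample autocorrelation at lag j (autocovariance / variance)."""
    zc = z - z.mean()
    return (zc[j:] * zc[:-j]).sum() / (zc * zc).sum()

n, p = 10_000, 4
z = rng.normal(size=n)  # H0 true here: white noise has no serial correlation
Q = n * sum(autocorr(z, j) ** 2 for j in range(1, p + 1))
print(Q)  # under H0, approximately chi-squared(p); mean of chi2(4) is 4
```

With genuinely serially correlated data, Q blows up with n, which is why the test has power against serial correlation.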
Since we can’t observe epsilon, how can we apply the Box-Pierce statistic (BPS)?
We can replace rho tilde with rho hat if we can replace gamma tilde with gamma hat.
We substitute e_i for epsilon_i in the second equation and easily find that it converges to gamma tilde in probability. But we still need the difference to vanish fast enough for the limiting distribution to carry over.
Modified BPS
Where phi^{-1} is a standardizing matrix.