Topic 6: Asymptotics Flashcards
What is consistency?
An estimator converges in probability to the correct population parameter as the sample size grows without bound.
How should you start the proof for consistency of the OLS estimator in the simple linear regression model?
Start with the estimator written as beta^1 = beta1 + [sum(xi - xbar)ui] / [sum(xi - xbar)^2]
When you multiply both the numerator and denominator of the second term in the consistency proof by 1/n, what happens?
As n grows without bound, the numerator converges in probability to Cov(x,u) and the denominator to Var(x).
What does Cov(x,u) converge on in probability?
Cov(x,u) = 0
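The consistency argument above can be checked numerically. A minimal NumPy sketch (not part of the card set; the data-generating process and seed are my own assumptions): when Cov(x,u) = 0, the OLS slope estimate gets closer to the true beta1 as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope(x, y):
    # beta^1 = sum((xi - xbar)(yi - ybar)) / sum((xi - xbar)^2)
    xd = x - x.mean()
    return xd @ (y - y.mean()) / (xd @ xd)

beta1 = 2.0
errors = []
for n in [100, 10_000, 1_000_000]:
    x = rng.normal(size=n)
    u = rng.normal(size=n)          # Cov(x, u) = 0, so OLS is consistent
    y = 1.0 + beta1 * x + u
    errors.append(abs(ols_slope(x, y) - beta1))

print(errors)  # estimation error shrinks as n grows
```

The shrinking absolute error is exactly what "converges in probability to the true parameter" predicts.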
What is the constant elasticity in a regression?
It is the slope parameter beta1 in a log-log model: for a one percent change in x, y changes by beta1 percent. Take care if a variable is already measured in percent units; then changes are in percentage points.
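A quick simulated illustration of the constant-elasticity interpretation (my own toy data-generating process, not from the cards): if log y = beta0 + beta1*log x + u with beta1 = 0.8, OLS on the logged variables recovers the elasticity.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Constant-elasticity DGP: log y = 1 + 0.8*log x + u
x = rng.lognormal(size=n)
u = rng.normal(scale=0.1, size=n)
log_y = 1.0 + 0.8 * np.log(x) + u

lx = np.log(x) - np.log(x).mean()
elasticity = lx @ (log_y - log_y.mean()) / (lx @ lx)
print(elasticity)  # close to 0.8: a 1% increase in x raises y by about 0.8%
```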
If x and u are correlated, what is the inconsistency in the estimator equal to?
Cov(x,u)/Var(x)
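The Cov(x,u)/Var(x) formula for the inconsistency can also be verified by simulation. A sketch under assumed numbers of my choosing: with u = 0.5*x + noise, the plim of the slope is beta1 + Cov(x,u)/Var(x) = 2 + 0.5 = 2.5.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
beta1 = 2.0

# Correlated regressor and error: u = 0.5*x + noise, so Cov(x,u) = 0.5*Var(x)
x = rng.normal(size=n)
u = 0.5 * x + rng.normal(size=n)
y = 1.0 + beta1 * x + u

xd = x - x.mean()
slope = xd @ (y - y.mean()) / (xd @ xd)

bias = np.cov(x, u)[0, 1] / np.var(x)  # sample analogue of Cov(x,u)/Var(x)
print(slope, beta1 + bias)             # both close to 2.5
```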
What happens to the distribution of an estimator, beta^j, as the sample size increases?
It becomes more tightly distributed around the parameter, betaj.
Stata: How can you get a t value from a regression output in Stata?
Under the column t, it is the coefficient/std. err.
What is asymptotic normality?
The idea that OLS estimators are approximately normally distributed in large enough sample sizes.
What two properties are needed for statistical inference in multiple regression analysis?
Consistency and asymptotic normality together allow hypothesis testing about the parameters of the MLR model.
Can we use the t stat, (beta^j - betaj)/se(beta^j), for a standard normal distribution?
Yes, as long as n is large enough, even without assuming normally distributed errors.
What is root n convergence?
Standard errors can be expected to shrink at the rate of the inverse square root of the sample size, 1/sqrt(n).
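Root-n convergence is easy to see in simulation. A sketch with assumed parameters of my own: multiplying the sample size by 100 should cut the slope's standard error by about sqrt(100) = 10.

```python
import numpy as np

rng = np.random.default_rng(3)

def slope_se(n):
    # Standard error of the OLS slope on a fresh simulated sample of size n
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + rng.normal(size=n)
    xd = x - x.mean()
    b1 = xd @ (y - y.mean()) / (xd @ xd)
    resid = (y - y.mean()) - b1 * xd
    return np.sqrt(resid @ resid / (n - 2) / (xd @ xd))

se_small, se_big = slope_se(1_000), slope_se(100_000)
print(se_small / se_big)  # close to sqrt(100) = 10, since se is proportional to 1/sqrt(n)
```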
Is the root mean squared error of the regression (RMSE) the standard error of a parameter estimate?
No, it is the standard error of the regression in MLR analysis: an estimate of the standard deviation of the population error term, not of any parameter estimate.
A consistent parameter estimate will converge on what as the random sample size increases to infinity?
The true population parameter.
If an unbiased population parameter estimate is averaged, the expected value will equal what as the number of random samples goes to infinity?
The true population parameter