Lecture 6 Flashcards
what is a general issue with test statistics
they depend on an unknown population variance
what happens to OLS in large samples
as the sample size increases, the t statistic behaves more and more like a normal distribution,
and the variance of the OLS estimator tends to 0
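As a sketch of what this means for a single coefficient (standard notation, not quoted from the slides), the usual t ratio converges in distribution to a standard normal:

```latex
t = \frac{\hat{\beta} - \beta_0}{\mathrm{se}(\hat{\beta})}
\;\xrightarrow{d}\; N(0,1) \qquad \text{as } n \to \infty
```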
what happens to the distribution of OLS when the sample becomes very large
the asymptotic distribution of the OLS estimator is degenerate:
the PDF collapses onto a single value as the sample size
becomes arbitrarily large.
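A minimal illustration for the simple-regression slope (standard formula, assumed to match the lecture's setup): its variance shrinks to zero as n grows, so the PDF piles up on the true value.

```latex
\operatorname{Var}(\hat{\beta}_1) = \frac{\sigma^2}{\sum_{i=1}^{n}(X_i - \bar{X})^2}
\;\longrightarrow\; 0 \quad \text{as } n \to \infty,
\qquad\text{so}\qquad \hat{\beta}_1 \xrightarrow{p} \beta_1 .
```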
How does increasing the sample size affect the OLS estimator PDF on a graph?
it gets thinner and taller around the mean, getting closer and closer to a vertical line.
how can we compare estimators using the asymptotic variance
by transforming the distribution so that it has a non-zero, finite variance (see the rescaling below)
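One standard transformation (assumed here; check it against the lecture's notation) is to scale the sampling error by the square root of n, which has a non-degenerate normal limit:

```latex
\sqrt{n}\,\bigl(\hat{\beta}_1 - \beta_1\bigr) \;\xrightarrow{d}\;
N\!\bigl(0,\ \operatorname{avar}(\hat{\beta}_1)\bigr)
```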
what are OLS regression residuals defined as
difference between the actual values of Y and the fitted
values from the regression equation.
are regression residuals and equation errors the same
No: the regression residuals are not the same thing as the
equation errors, which are unobservable (see the two definitions below).
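In symbols (standard textbook notation, assumed rather than taken from the slides), for the model Y_i = β0 + β1·X_i + u_i:

```latex
\hat{u}_i = Y_i - \hat{Y}_i = Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i \quad \text{(residual, computable from the data)}
\qquad
u_i = Y_i - \beta_0 - \beta_1 X_i \quad \text{(equation error, unobservable)}
```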
what are the important properties of regression residuals
they sum to zero,
and the OLS residuals are uncorrelated with the X variables (verified numerically in the sketch below).
Note: cov(X,u) = 0 for the equation errors may or may not be true; this is a matter of assumption rather than a mathematical property of the model
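A minimal numerical sketch (the simulated data and variable names are my own assumptions, not from the lecture) showing that OLS residuals sum to zero and are uncorrelated with X when an intercept is included:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=n)
u = rng.normal(size=n)           # equation errors (unobservable in practice)
Y = 1.0 + 2.0 * X + u            # simulated data with known coefficients

# OLS with an intercept via least squares on [1, X]
Z = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(Z, Y, rcond=None)
residuals = Y - Z @ beta_hat

print("sum of residuals:", residuals.sum())               # ~0 up to rounding
print("cov(X, residuals):", np.cov(X, residuals)[0, 1])   # ~0 by construction
```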
how do you interpret the slope coefficient
it gives the marginal effect on the endogenous variable of an increase in the exogenous variable.
e.g. I = 0.1809Y
an increase in GDP of £1m
leads to an increase in investment of about £181,000.
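The arithmetic behind that example, using the 0.1809 slope from the card:

```latex
\frac{dI}{dY} = 0.1809
\quad\Rightarrow\quad
\Delta I \approx 0.1809 \times \pounds 1{,}000{,}000 = \pounds 180{,}900 \approx \pounds 181\text{K}
```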
what do log-linear regressions tell you
the slope gives the elasticity of Y with respect to X.
e.g.
ln(I) = 1.3463 ln(Y)
This equation tells us that a 1% increase in GDP will result in
a rise of about 1.35% in investment spending
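Why the slope is an elasticity (standard derivation, notation assumed):

```latex
\ln I = 1.3463\,\ln Y
\quad\Rightarrow\quad
\frac{d\ln I}{d\ln Y} = \frac{dI/I}{dY/Y} = 1.3463
```

so a 1% rise in GDP is associated with roughly a 1.35% rise in investment.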
what is the intercept? Can it be interpreted as the value of Y when X is 0?
Not usually: although this is mathematically true, the zero value of X
often lies well outside the range of the data.
When can you not take logs of a regression?
If even one of the values is negative, as ln is not defined for negative values
what is asymptotic variance
Asymptotic variance refers to the variance of a statistic as the sample size approaches infinity. So when the sample size is "large", there is a theoretical reason to believe that the finite-sample variance can be reasonably approximated by its asymptotic counterpart.
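A sketch of how the approximation is used (generic notation, not from the lecture): if the scaled statistic has limiting variance V, then for large n

```latex
\sqrt{n}\,(\hat{\theta} - \theta) \;\xrightarrow{d}\; N(0, V)
\qquad\Longrightarrow\qquad
\operatorname{Var}(\hat{\theta}) \;\approx\; \frac{V}{n}
```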
when is the prediction error variance smallest
when X equals its mean value; it gets larger the further X is from its mean
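For the simple regression model, the standard prediction-error variance formula (assumed to match the lecture's setup) makes this explicit when predicting Y at a new value X_0:

```latex
\operatorname{Var}(\hat{Y}_0 - Y_0)
= \sigma^2\left[\,1 + \frac{1}{n} + \frac{(X_0 - \bar{X})^2}{\sum_{i=1}^{n}(X_i - \bar{X})^2}\,\right]
```

which is smallest when X_0 equals the sample mean of X and grows with the squared distance of X_0 from that mean.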
in terms of prediction, if the estimator is unbiased what is the expected value of the prediction error?
0