Single Linear Regression - Testing Flashcards
4 steps for testing hypotheses about Beta0 or Beta1
- Compute the OLS estimate and its standard error
- Compute the t-stat
- Compute the p-value
- Reject/fail to reject the null based on the significance level
note: R will compute the t-stat and p-value for each coefficient's test of the null that it equals 0
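The steps above can be sketched by hand on simulated data (the seed, sample size, and true coefficients below are assumptions for illustration); in R, `summary(lm(y ~ x))` reports the same quantities.

```python
import numpy as np
from scipy import stats

# Illustrative data: true Beta0 = 2, true Beta1 = 0.5
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 2.0 + 0.5 * x + rng.normal(size=n)

# Step 1: OLS estimate and standard error
xd = x - x.mean()
beta1 = np.sum(xd * (y - y.mean())) / np.sum(xd ** 2)
beta0 = y.mean() - beta1 * x.mean()
resid = y - beta0 - beta1 * x
se = np.sqrt(np.sum(resid ** 2) / (n - 2) / np.sum(xd ** 2))

# Step 2: t-statistic for H0: Beta1 = 0
t = beta1 / se

# Step 3: two-sided p-value (large-sample normal approximation)
p = 2 * (1 - stats.norm.cdf(abs(t)))
```

With a true Beta1 of 0.5 and this sample size, the p-value comes out far below conventional significance levels, so the null Beta1 = 0 is rejected.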
Critical values from significance level (two-sided, standard normal)
90% = 1.65
95% = 1.96
99% = 2.58
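These critical values come from the standard normal quantile function; a quick check (for a two-sided test at confidence level c, take the (1 - (1-c)/2) quantile):

```python
from scipy import stats

# Two-sided critical values for each confidence level
critical = {c: stats.norm.ppf(1 - (1 - c) / 2) for c in (0.90, 0.95, 0.99)}
```

The exact 90% value is 1.645, which the card rounds to 1.65 (some texts use 1.64).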
What do you do to change the CI to account for a change in X?
You simply multiply the upper and lower bounds of the CI by the change in X (note: if the change is negative, the scaled bounds swap roles)
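As a sketch (the CI for Beta1 and the change in X below are hypothetical numbers):

```python
# Scaling a CI for Beta1 into a CI for the effect of a given change in X
lo, hi = 0.4, 0.6      # hypothetical 95% CI for Beta1
delta_x = 10.0         # hypothetical change in X
effect_ci = (delta_x * lo, delta_x * hi)   # CI for the implied change in Y
# If delta_x were negative, the scaled bounds would swap roles.
```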
Homoskedasticity of the error term
If the variance of the errors is the same no matter what Xi is
Heteroskedasticity of the error term
If the variance of the errors varies with Xi
Why can an assumption of homoskedasticity be an issue?
Because it assumes that the variance of the error term is the same in every case. E.g. with earnings for males and females, homoskedasticity would assume that the variance of earnings is the same for both groups, which we know is untrue. Therefore we generally assume heteroskedasticity; otherwise our standard errors, and hence our tests, will be wrong.
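In practice this means using heteroskedasticity-robust standard errors. A minimal sketch comparing the classic formula with the robust (Eicker-Huber-White, HC1-style) one on simulated heteroskedastic data (the data-generating process is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
u = (1 + np.abs(x)) * rng.normal(size=n)   # error variance grows with |x|: heteroskedastic
y = 1.0 + 2.0 * x + u

xd = x - x.mean()
beta1 = np.sum(xd * (y - y.mean())) / np.sum(xd ** 2)
resid = y - (y.mean() - beta1 * x.mean()) - beta1 * x

# Classic SE: valid only under homoskedasticity
se_classic = np.sqrt(np.sum(resid ** 2) / (n - 2) / np.sum(xd ** 2))
# Robust SE: allows Var(u_i) to depend on x_i
se_robust = np.sqrt(n / (n - 2) * np.sum(xd ** 2 * resid ** 2)) / np.sum(xd ** 2)
```

Here the robust SE comes out larger than the classic one, so using the classic SE would overstate the precision of the Beta1 estimate.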
Gauss-Markov theorem (a key result under homoskedasticity)
Under the 3 OLS assumptions, if the error term ui is homoskedastic, then the OLS estimators of Beta0 and Beta1 are the Best Linear Unbiased Estimators of the true linear regression parameters.
Explain BLUE
Best: OLS estimator is the most efficient estimator (smallest sampling variance and thus smallest standard errors)
Linear: OLS estimators are linear functions
Unbiased: Expected values of the OLS estimators are equal to their true values in the population
Estimator: OLS values are estimators of the population true values.
HOWEVER, this result is virtually never relevant in practice, since homoskedasticity rarely holds
Trick for testing a hypothesised change in Y
Figure out what Beta1 would have to be to yield this change, and test the hypothesis that Beta1 is equal to that value.
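For example, to test the claim that a 5-unit increase in X raises Y by 10 units, the implied null is Beta1 = 10/5 = 2 (the estimate and standard error below are hypothetical):

```python
from scipy import stats

beta1_hat = 1.8        # hypothetical OLS estimate of Beta1
se = 0.25              # hypothetical standard error
null_value = 10 / 5    # Beta1 implied by the hypothesised change

t = (beta1_hat - null_value) / se      # t-stat against a non-zero null
p = 2 * (1 - stats.norm.cdf(abs(t)))   # two-sided p-value
```

Here t = -0.8 and p is about 0.42, so at the 5% level we would fail to reject the hypothesised change.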