Lecture 4: Linear Regression and OLS Flashcards
Write out a standard bivariate regression. What are the predicted value and the residual?
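A reference answer in standard notation (a sketch; symbols follow the usual textbook convention):

```latex
\begin{align*}
Y_i &= \beta_0 + \beta_1 X_i + u_i             && \text{population model}\\
\hat{Y}_i &= \hat{\beta}_0 + \hat{\beta}_1 X_i && \text{predicted (fitted) value}\\
\hat{u}_i &= Y_i - \hat{Y}_i                   && \text{residual}
\end{align*}
```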
Draw a simple bivariate regression on a graph and label the residual and fitted value for one point.
Write out the MSE for a bivariate regression
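A minimal by-hand sketch of the answer: OLS minimizes the MSE, $\frac{1}{n}\sum_i (Y_i - b_0 - b_1 X_i)^2$. The code below fits the bivariate regression from the closed-form formulas and evaluates the MSE (the data are made up so that $Y = 1 + 2X$ exactly).

```python
# OLS for a bivariate regression, computed from the closed-form formulas
# (a sketch, not a library API). Fits Y = b0 + b1*X by minimizing the MSE.

def ols_bivariate(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
         sum((xi - xbar) ** 2 for xi in x)
    b0 = ybar - b1 * xbar
    return b0, b1

x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]          # exactly Y = 1 + 2X, so residuals are zero
b0, b1 = ols_bivariate(x, y)
fitted = [b0 + b1 * xi for xi in x]
residuals = [yi - fi for yi, fi in zip(y, fitted)]
mse = sum(r ** 2 for r in residuals) / len(x)
```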
What are the 3 algebraic properties of the OLS estimator?
What are the 4 assumptions we need to make for an OLS regression?
Under the 4 OLS assumptions, what three properties does the OLS estimator have?
- Unbiased
- Consistent estimator
- Distribution of OLS estimator is well approximated by a Normal distribution
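The unbiasedness property can be illustrated with a small Monte Carlo sketch (an illustration, not a proof; the true model, sample size, and replication count are all made up for this example): averaged over many simulated samples, the OLS slope estimate sits close to the true slope.

```python
import random

# Monte Carlo sketch of OLS unbiasedness. True model: Y = 1 + 2X + u with
# u ~ N(0, 1). Over many simulated samples, the average OLS slope estimate
# should be close to the true slope of 2.

def ols_slope(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return sxy / sxx

random.seed(0)
n, reps = 50, 2000
estimates = []
for _ in range(reps):
    x = [random.uniform(0.0, 10.0) for _ in range(n)]
    y = [1.0 + 2.0 * xi + random.gauss(0.0, 1.0) for xi in x]
    estimates.append(ols_slope(x, y))

mean_estimate = sum(estimates) / reps   # should be close to 2
```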
What are homoskedastic vs. heteroskedastic errors?
If we assume that the error term is homoskedastic then what does the Gauss-Markov Theorem state?
How do we do a hypothesis test on multiple coefficients?
What should we do if we suspect our errors are heteroskedastic?
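A by-hand sketch of the usual answer (use heteroskedasticity-robust standard errors), comparing the classical and White/HC0 standard errors for the bivariate slope. The data here are made up for illustration.

```python
import math

# Sketch: classical vs. heteroskedasticity-robust (White/HC0) standard
# errors for the slope of a bivariate OLS regression, computed by hand.

def ols_fit(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0 = ybar - b1 * xbar
    resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    return b0, b1, resid, xbar, sxx

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [1.1, 2.3, 2.8, 4.5, 4.9, 6.4]
b0, b1, resid, xbar, sxx = ols_fit(x, y)
n = len(x)

# Classical SE assumes homoskedastic errors: Var(b1) = s^2 / Sxx
s2 = sum(u ** 2 for u in resid) / (n - 2)
se_classical = math.sqrt(s2 / sxx)

# HC0 robust SE: Var(b1) = sum((xi - xbar)^2 * ui^2) / Sxx^2
se_robust = math.sqrt(sum((xi - xbar) ** 2 * u ** 2
                          for xi, u in zip(x, resid)) / sxx ** 2)
```

In practice one would use a library routine rather than coding this by hand; for example, statsmodels exposes robust covariance estimators through `fit(cov_type="HC1")`.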
What is the algebra used to set up R^2?
What is the adjusted R-squared? Why do we want it?
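A sketch of both quantities computed from their definitions, $R^2 = 1 - SSR/TSS$ and $\bar{R}^2 = 1 - \frac{SSR/(n-k-1)}{TSS/(n-1)}$, for a bivariate fit (the data are made up for illustration):

```python
# R^2 and adjusted R^2 from their definitions for a bivariate OLS fit
# (k = 1 regressor).

def ols_fit(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
         sum((xi - xbar) ** 2 for xi in x)
    b0 = ybar - b1 * xbar
    return b0, b1

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
b0, b1 = ols_fit(x, y)

ybar = sum(y) / len(y)
ssr = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))  # residual SS
tss = sum((yi - ybar) ** 2 for yi in y)                        # total SS
r2 = 1 - ssr / tss

n, k = len(x), 1
adj_r2 = 1 - (ssr / (n - k - 1)) / (tss / (n - 1))  # penalizes extra regressors
```

The adjustment always pulls `adj_r2` below `r2` (for k ≥ 1), which is why it is preferred when comparing models with different numbers of regressors.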
What is the Dummy Variable Trap?
How do we run a regression with a binary dependent variable?
How do we interpret the linear probability model (LPM)?
What does it mean if a regression is linear in the variables?
What does it mean for a regression to be linear (or not) in the parameters?
When is a function linear? When might a function not be linear?
What is a quadratic term?
Why might I use log transformations?
Why might we use log transformations to alter distributions?
What are the 3 ways I can use log transformations to change the interpretation of coefficients?
Write out the table of the different interpretations of log transformations.
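A reference version of the standard interpretation table (the usual textbook convention; exact wording varies by course):

```latex
\begin{tabular}{lll}
Model & Specification & Interpretation of $\beta_1$ \\
\hline
Level-log & $Y = \beta_0 + \beta_1 \ln(X)$ & a 1\% increase in $X$ $\Rightarrow$ a $\beta_1/100$ unit change in $Y$ \\
Log-level & $\ln(Y) = \beta_0 + \beta_1 X$ & a 1 unit increase in $X$ $\Rightarrow$ a $100\,\beta_1$\% change in $Y$ \\
Log-log   & $\ln(Y) = \beta_0 + \beta_1 \ln(X)$ & a 1\% increase in $X$ $\Rightarrow$ a $\beta_1$\% change in $Y$ (elasticity)
\end{tabular}
```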
How can I use interaction terms with dummies to change the interpretation of dummies?
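A worked sketch of the standard setup: with a dummy $D$, a regressor $X$, and their interaction, the dummy shifts the intercept and the interaction shifts the slope.

```latex
\begin{align*}
Y &= \beta_0 + \beta_1 X + \beta_2 D + \beta_3 (D \times X) + u\\
D = 0:\quad Y &= \beta_0 + \beta_1 X + u\\
D = 1:\quad Y &= (\beta_0 + \beta_2) + (\beta_1 + \beta_3) X + u
\end{align*}
```

So $\beta_2$ is the intercept shift and $\beta_3$ is the slope shift for the $D = 1$ group.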