3: OLS Regressions Flashcards
Yi
dependent variable/outcome of interest
Xi
independent variable/explanatory variable
ui
error term (unobserved determinants of Yi); its estimated counterpart in the fitted regression is the residual
ordinary least squares (OLS)
estimator that chooses the regression coefficients by minimising the sum of squared prediction errors (the residuals)
for a linear relationship, slope estimate = sample covariance of X and Y / sample variance of X
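As a quick illustration of these formulas, a minimal Python sketch (the numpy usage and the example data are mine, not from the flashcards):

```python
import numpy as np

# Illustrative data (made up for the example)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS slope = sample covariance of X and Y / sample variance of X
beta1_hat = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
# Intercept chosen so the fitted line passes through the sample means
beta0_hat = y.mean() - beta1_hat * x.mean()

# Fitted values and residuals (prediction errors)
y_hat = beta0_hat + beta1_hat * x
residuals = y - y_hat

print(beta0_hat, beta1_hat)
```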
multicollinearity
when two regressors are perfectly collinear (one is an exact linear function of the other)
- no variation in X1 conditional on X2, and vice versa
- OLS cannot estimate the slope parameters separately
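A small sketch of why perfect collinearity breaks the estimation, assuming numpy; the data and the factor of 3 are purely illustrative:

```python
import numpy as np

n = 50
rng = np.random.default_rng(0)
x1 = rng.normal(size=n)
x2 = 3 * x1            # X2 is an exact linear function of X1: perfect collinearity
y = 1 + 2 * x1 + rng.normal(size=n)

# Design matrix with intercept, X1 and X2
X = np.column_stack([np.ones(n), x1, x2])

# X'X is singular, so the usual OLS formula (X'X)^(-1) X'y is not defined
print(np.linalg.matrix_rank(X))   # 2, not 3: the two slopes are not separately identified
# np.linalg.inv(X.T @ X)          # would raise LinAlgError: singular matrix
```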
goodness of fit
how much of the variation in Y the regression explains
R^2
percentage of total variation in Y explained by estimated regression
ESS/TSS = 1 - SSR/TSS, where ESS = explained sum of squares, SSR = sum of squared residuals, TSS = total sum of squares
in a regression with a single regressor, R^2 = (sample correlation between X and Y)^2
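A short numerical check of the two R^2 identities above, again a sketch with made-up data:

```python
import numpy as np

# Hypothetical data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.8, 2.9, 4.4, 5.1, 6.3])

# Fit the univariate OLS regression
beta1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
beta0 = y.mean() - beta1 * x.mean()
y_hat = beta0 + beta1 * x

tss = np.sum((y - y.mean()) ** 2)       # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
ssr = np.sum((y - y_hat) ** 2)          # sum of squared residuals

r2 = ess / tss
print(np.isclose(r2, 1 - ssr / tss))                 # True: ESS/TSS = 1 - SSR/TSS
print(np.isclose(r2, np.corrcoef(x, y)[0, 1] ** 2))  # True: R^2 = squared sample correlation
```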
adjusted R^2
R^2 deflated by a factor that penalises extra regressors, so the measure does not mechanically increase whenever another explanatory variable is added (see the formula below)
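A common form of the adjustment, with n = sample size and k = number of regressors (excluding the intercept):

adjusted R^2 = 1 - [(n - 1)/(n - k - 1)] * (SSR/TSS)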