Single Linear Regression - Estimation Flashcards
Interpretation of Beta1
The slope of the population regression line: the expected change in Y associated with a one-unit change in X
OLS Estimators
Beta1-hat = (sample covariance of X and Y) / (sample variance of X)
Beta0-hat = Ybar - Beta1-hat * Xbar, where Ybar and Xbar are the sample means of Y and X
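A minimal NumPy sketch (not part of the original cards; the data values are made up for illustration) showing how these two formulas produce the OLS slope and intercept:

```python
import numpy as np

# Made-up data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Beta1-hat = sample covariance of X and Y / sample variance of X
beta1_hat = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Beta0-hat = Ybar - Beta1-hat * Xbar
beta0_hat = y.mean() - beta1_hat * x.mean()

print(beta0_hat, beta1_hat)  # intercept and slope estimates
```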
R-squared definition
The fraction of the sample variance of Yi that is explained or predicted by Xi
R-squared = ESS/TSS = 1 - SSR/TSS
Bounded between 0 and 1; if R-squared = 1, all of the variation in Y is explained by X
TSS, ESS + SSR relationship
ESS = explained sum of squares
TSS = total sum of squares
SSR = sum of squared residuals
TSS = ESS + SSR
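A short sketch (again with made-up data, not from the cards) that fits the OLS line, checks the identity TSS = ESS + SSR, and computes R-squared both ways:

```python
import numpy as np

# Made-up data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS fit (same formulas as on the estimator card)
beta1_hat = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
beta0_hat = y.mean() - beta1_hat * x.mean()

y_hat = beta0_hat + beta1_hat * x          # fitted values
u_hat = y - y_hat                          # residuals

tss = np.sum((y - y.mean()) ** 2)          # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)      # explained sum of squares
ssr = np.sum(u_hat ** 2)                   # sum of squared residuals

print(np.isclose(tss, ess + ssr))          # True: TSS = ESS + SSR
print(ess / tss, 1 - ssr / tss)            # R-squared, two equivalent ways
```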
Least Squares Assumption #1
Zero conditional mean (exogeneity)
The conditional expectation of the error ui given Xi is zero: E(ui|Xi) = 0. This implies that Xi and ui are uncorrelated, and that E(Yi|Xi) = Beta0 + Beta1*Xi (the error term drops out once we condition on X)
Least Squares Assumption #2
IID
The observations (Xi, Yi), i = 1, ..., n, are independent and identically distributed (i.i.d.) - each observation is independent of the others and drawn from the same probability distribution
Least Squares Assumption #3
No large outliers
Large outliers are unlikely (X and Y have nonzero finite fourth moments) - in econometrics we inspect the data first and remove outliers only when they are data errors rather than legitimate observations
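A small simulation (assumed setup, not from the cards) illustrating why this assumption matters: a single extreme, high-leverage point can pull the OLS slope far from the true value:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=50)   # true slope = 2

def ols_slope(x, y):
    # Beta1-hat = sample covariance of X and Y / sample variance of X
    return np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

print(ols_slope(x, y))            # close to the true slope of 2

# Add one extreme, high-leverage point and re-estimate
x_out = np.append(x, 10.0)
y_out = np.append(y, -50.0)
print(ols_slope(x_out, y_out))    # pulled far away from 2
```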
What do the three least squares assumptions tell us about the OLS estimators (Beta0-hat and Beta1-hat)?
- Unbiased - the expected value of each estimator equals the true parameter value, e.g. E(Beta0-hat) = Beta0 and E(Beta1-hat) = Beta1
- Consistent - as the sample size n gets large, Beta0-hat and Beta1-hat are close to their true values with high probability
- Approximately normal - by the central limit theorem, when n is large the sampling distributions of Beta0-hat and Beta1-hat are approximately normal (illustrated in the sketch below)
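A Monte Carlo sketch (assumed simulation design, not from the cards) illustrating these properties for the slope estimator: the average of the estimates is near the true slope (unbiasedness), their spread shrinks as n grows (consistency), and their distribution is approximately bell-shaped (CLT):

```python
import numpy as np

rng = np.random.default_rng(0)
beta0_true, beta1_true = 1.0, 2.0
n, n_sims = 200, 5000

beta1_hats = np.empty(n_sims)
for s in range(n_sims):
    x = rng.normal(size=n)
    u = rng.normal(size=n)                 # E(u|X) = 0 holds by construction
    y = beta0_true + beta1_true * x + u
    beta1_hats[s] = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

print(beta1_hats.mean())   # unbiasedness: average estimate is close to 2
print(beta1_hats.std())    # this spread shrinks if n is increased (consistency)
# A histogram of beta1_hats would look approximately normal (CLT)
```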