Lesson Plan Flashcards
Building and Estimating a Model (5-step process)
1.) Hypothesize a relation between the X variables and Y
2.) Use sample data to estimate the unknown parameters in the model
3.) Compute the standard errors of the parameters
4.) Statistically evaluate the usefulness of the model
5.) Use the model for prediction
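As an illustration, a minimal sketch of the five steps using statsmodels; the data (hours studied vs. exam score) and all variable names are hypothetical.

```python
# A minimal sketch of the 5 steps; the data are hypothetical.
import numpy as np
import statsmodels.api as sm

# Step 1: hypothesize a linear relation, score = B0 + B1*hours + e
hours = np.array([2, 4, 5, 7, 8, 10], dtype=float)
score = np.array([55, 62, 66, 74, 79, 88], dtype=float)

# Step 2: use sample data to estimate the unknown parameters
X = sm.add_constant(hours)          # adds the intercept column
model = sm.OLS(score, X).fit()

# Step 3: standard errors of the parameters
print(model.bse)

# Step 4: statistically evaluate the model (t-stats, p-values, R^2)
print(model.summary())

# Step 5: use the model for prediction (e.g., 6 hours of study)
print(model.predict([1.0, 6.0]))
```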
When do we say there is heteroskedasticity in the data?
The variance of e changes when X changes (i.e., the variance of e is not constant and does not equal a single σ^2).
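To make this concrete, a short hypothetical simulation in which the spread of e grows with X:

```python
# Hypothetical data where Var(e) grows with X: heteroskedasticity.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1, 10, 200)
e = rng.normal(0, 0.5 * x)    # the std dev of e scales with x
y = 2 + 3 * x + e

# The error variance at large X is much bigger than at small X
print(np.var(e[x < 3]), np.var(e[x > 8]))
```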
What are the assumptions about the error term?
- ) E[e] = 0
- ) the variance of e does not change when X changes
- ) the probability distribution of e is normal. (Generally, it is difficult to know whether this holds.)
- ) The value of e_i is independent of e_j for any two observations i and j. For example, if any two of you in the room work for the same company, your errors in the regression of your income on your parents' income would likely be correlated.
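In symbols, the four assumptions are:

```latex
\begin{align*}
&\text{1. } E[e_i] = 0 \\
&\text{2. } \operatorname{Var}(e_i) = \sigma^2 \text{ for all } i \quad \text{(homoskedasticity)} \\
&\text{3. } e_i \sim N(0, \sigma^2) \\
&\text{4. } \operatorname{Cov}(e_i, e_j) = 0 \text{ for all } i \neq j
\end{align*}
```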
Test whether B1 is “significantly” different from zero
1.) Estimate σ^2
2.) Either:
    a) compute a confidence interval around Bhat1, or
    b) compute the t-statistic for the difference between Bhat1 and 0
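In formulas, with n − 2 degrees of freedom in the bivariate case:

```latex
t = \frac{\hat{\beta}_1 - 0}{SE(\hat{\beta}_1)},
\qquad
\text{CI: } \hat{\beta}_1 \pm t_{\alpha/2,\; n-2} \cdot SE(\hat{\beta}_1)
```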
Total Sum of Squares (TSS)
the sum of squares of the deviations from the mean of the dependent variable
Explained Sum of Squares (ESS)
the sum of squares of the deviations from the mean of the predictions (i.e. predictions of the dependent variable Y in this case)
Residual Sum of Squares (RSS)
the sum of squares of the residuals (i.e. the errors in the predictions)
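In symbols, with Yhat_i the prediction for observation i and Ybar the sample mean:

```latex
TSS = \sum_{i=1}^{n} (Y_i - \bar{Y})^2, \qquad
ESS = \sum_{i=1}^{n} (\hat{Y}_i - \bar{Y})^2, \qquad
RSS = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2
```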
How is the Total Sum of Squares (TSS) partitioned into two parts?
This partitioning is called variance decomposition: TSS = ESS + RSS. We fit a regression line and see how much of the variation can be explained by the fitted relationship (the explained sum of squares, ESS) and how much is left over (the residual sum of squares, RSS).
The smaller the RSS is relative to TSS, the better the regression line appears to fit.
R^2 is defined as the …
fraction of the total variability in Y that is explained by the regression model. R^2 is a measure of "goodness of fit" and is (rarely) called the "coefficient of determination."
R^2 =
Variation in Y that is explained by X / Total variation in Y
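Equivalently, using the sums of squares defined above:

```latex
R^2 = \frac{ESS}{TSS} = 1 - \frac{RSS}{TSS}
```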
OLS (Ordinary Least Squares, also known as minimizing squared errors, which is what we’ve been doing) gives us
Betas that minimize the RSS (they keep the squared residuals as small as possible) and thus give the largest possible R^2.
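In the bivariate case this has the standard closed-form solution:

```latex
\min_{\beta_0,\,\beta_1} \sum_{i=1}^{n} (Y_i - \beta_0 - \beta_1 X_i)^2
\;\Longrightarrow\;
\hat{\beta}_1 = \frac{\sum_i (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_i (X_i - \bar{X})^2},
\qquad
\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}
```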
Adjusted R^2
adjusts for the number of independent variables (i.e. predictors) in the regression.
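The standard formula, with n observations and k predictors:

```latex
\bar{R}^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - k - 1}
```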
When moving to multivariate regressions, the interpretation of R^2 will be…
“what fraction of the variation in Y can be explained by the X variables.”
What are the two ways to use a bivariate prediction equation?
- ) To estimate the mean value of Y for a population with a given X.
- ) To estimate the value of Y for an individual with a given X.
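A sketch of both uses with statsmodels, reusing the hypothetical hours/score model from above; the interval for the mean of Y is narrower than the interval for an individual Y:

```python
# Sketch: both uses of a bivariate prediction equation.
import numpy as np
import statsmodels.api as sm

hours = np.array([2, 4, 5, 7, 8, 10], dtype=float)
score = np.array([55, 62, 66, 74, 79, 88], dtype=float)
model = sm.OLS(score, sm.add_constant(hours)).fit()

pred = model.get_prediction([1.0, 6.0])   # predict at X = 6
frame = pred.summary_frame(alpha=0.05)

# Use 1: estimate the MEAN of Y for the population with X = 6
print(frame[["mean", "mean_ci_lower", "mean_ci_upper"]])

# Use 2: estimate Y for an INDIVIDUAL with X = 6 (wider interval)
print(frame[["obs_ci_lower", "obs_ci_upper"]])
```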
The smaller the RSS is, relative to TSS…
the better the regression line appears to fit