Topic 2: Simple Linear Regression Flashcards
In the SLR model, if xbar = 0, what does beta^_0 equal?
beta^_0 = ybar
Rearrange the fitted SLR line to solve for beta^_0.
beta^_0 = ybar - beta^_1*xbar
Write the residual for observation i, u^_i, in terms of y_i and y^_i.
u^_i = y_i - y^_i
What is the residual for observation i, u^_i, equal to in terms of the estimated SLR model?
u^_i = y_i - beta^_0 - beta^_1*x_i
How do you compute beta^_1 from sums?
beta^_1 = [the sum of (x_i - xbar)(y_i - ybar)] / [the sum of (x_i - xbar)^2], which is the sample covariance between x and y divided by the sample variance of x.
How would you find the coefficients of a model if given a table of the data?
Find ybar and xbar, then use the formula for beta^_1 (sample covariance divided by sample variance of x), then plug ybar, xbar, and beta^_1 into beta^_0 = ybar - beta^_1*xbar to get beta^_0 (see the sketch below).
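A minimal Stata sketch of this recipe, using Stata's built-in auto dataset (price regressed on weight) purely as an illustrative example:

* Compute the OLS slope and intercept by hand, then check against regress.
sysuse auto, clear
quietly summarize weight
scalar xbar = r(mean)
quietly summarize price
scalar ybar = r(mean)
gen num = (weight - xbar)*(price - ybar)   // (x_i - xbar)(y_i - ybar)
gen den = (weight - xbar)^2                // (x_i - xbar)^2
quietly summarize num
scalar sxy = r(sum)
quietly summarize den
scalar sxx = r(sum)
scalar b1 = sxy/sxx                        // sample cov / sample var of x
scalar b0 = ybar - b1*xbar                 // intercept from the rearranged line
display b1
display b0
regress price weight                       // coefficients should match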
What is the formula for fitted values, y^_i, of observation i in terms of a SLR model?
(y^_i) = (beta^_0) + (beta^_1)*(x_i)
What is the sum of squared residuals, SSR?
the sum of (u^_i^2) or the sum of (y_i - y^_i)^2
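A short Stata sketch of fitted values and the SSR, again using the built-in auto data as an arbitrary example:

* Fitted values and the sum of squared residuals after a simple regression.
sysuse auto, clear
quietly regress price weight
predict price_hat, xb             // y^_i = beta^_0 + beta^_1*x_i
gen uhat = price - price_hat      // u^_i = y_i - y^_i
gen uhat_sq = uhat^2
quietly summarize uhat_sq
display r(sum)                    // SSR = sum of u^_i^2
display e(rss)                    // same quantity stored by regress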
What is a simple regression model used for?
Used to study the relationship between two variables as a tool for empirical analysis.
Names for y variable in regressions?
dependent, explained, response, predicted, regressand
Names for x variable in regressions?
independent, explanatory, control, predictor, regressor
Describe the linear relationship in a simple regression model.
If the other factors in u are held fixed (so the change in u is zero), x has a linear effect on y: for every one-unit increase in x, y changes by beta_1.
Besides E(u)=0, how can you ensure a ceteris paribus relationship in your SLR model?
The zero conditional mean assumption of E(u|x) = 0
What is the population regression function (the conditional expectation of y given x)?
E(y|x) = (beta_0) + (beta_1)x, when the zero conditional mean assumption holds
What is the minimizing function for the derivation of the OLS estimators in SLR?
The sum of [(y_i) - (beta_0) - (beta_1)*(x_i)]^2. Ordinary least squares chooses the coefficients beta^_0 and beta^_1 that minimize this sum of squared residuals.
What is the residual of a SLR equal to in terms of parameters?
u = y - (beta_0) - (beta_1)(x)
What are the first order conditions of the minimizing function for OLS estimators in SLR?
Take the partial derivatives of the minimizing function with respect to beta_0 and beta_1 and set them to zero, giving two FOCs (the factor of -2 cancels): the sum of [(y_i) - (beta^_0) - (beta^_1)*(x_i)] = 0, and the sum of (x_i)*[(y_i) - (beta^_0) - (beta^_1)*(x_i)] = 0.
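These two conditions can be checked numerically; a small Stata sketch (auto data as an example) shows the residuals satisfying the FOCs at the OLS estimates:

* The OLS residuals sum to zero and are uncorrelated in sample with x.
sysuse auto, clear
quietly regress price weight
predict uhat, residuals
quietly summarize uhat
display r(sum)                    // sum of u^_i, essentially 0 up to rounding
gen xu = weight*uhat
quietly summarize xu
display r(sum)                    // sum of x_i*u^_i, also essentially 0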
What is another way to write the formula (beta^_1) = [the sum of (x_i - xbar)(y_i - ybar)] / [the sum of (x_i - xbar)^2]?
(beta^_1) = [the sum of (y_i)(x_i - xbar)] / [the sum of (x_i - xbar)^2]
What is homoskedasticity?
The errors in a regression have constant variance conditional on the explanatory variables, thus u has the same variance given any value of x.
Under SLR1-4, do OLS estimates equal the true population parameters?
No, but under SLR1-4 the expected value of each estimator equals the corresponding population parameter, making OLS unbiased.
Would a violation of the zero conditional mean assumption make an estimator biased?
Yes.
What is the expected value of residuals equal to?
0; the sum and the mean of the residuals are both exactly 0 (this follows from the first order condition for beta^_0).
What is the bias in the estimator beta^_1 if E(u_i|x_i) does not equal 0?
Writing beta^_1 = beta_1 + [the sum of (x_i - xbar)(u_i)] / [the sum of (x_i - xbar)^2], the bias is the expected value of the second term, which is generally nonzero when the zero conditional mean assumption fails.
What is the formula for the total sum of squares (SST)?
SST = the sum of (y_i - ybar)^2; it measures the total sample variation in the y_i (how spread out the y_i are around ybar).
What is the explained sum of squares (SSE)?
SSE = the sum of (y^_i - ybar)^2; it measures the sample variation in the fitted values y^_i.
What is the residual sum of squares (SSR)?
SSR = the sum of (u^_i)^2; it measures the sample variation in the residuals u^_i.
What is the formula for the total variation in y (explained plus unexplained)?
SST = SSE + SSR
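A quick Stata check of this decomposition (auto data as an example):

* Verify SST = SSE + SSR numerically.
sysuse auto, clear
quietly regress price weight
predict yhat, xb
predict uhat, residuals
quietly summarize price
scalar ybar = r(mean)
gen sst_i = (price - ybar)^2
gen sse_i = (yhat - ybar)^2
gen ssr_i = uhat^2
quietly summarize sst_i
scalar SST = r(sum)
quietly summarize sse_i
scalar SSE = r(sum)
quietly summarize ssr_i
scalar SSR = r(sum)
display SST
display SSE + SSR                 // equals SST up to rounding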
Stata: how do you get a frequency table of a variable?
tab var
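For example, with the built-in auto data:

sysuse auto, clear
tab foreign        // one-way frequency table of the foreign indicator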
What is Var(beta^_1)?
Var(beta^_1) = sigma^2 / SSTx, where SSTx = the sum of (x_i - xbar)^2 (this holds under SLR1-5, conditional on the sample values of x).
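A Stata sketch comparing this formula with the standard error reported by regress (auto data as an example; the usual estimator sigma^2-hat = SSR/(n-2), which these cards do not define, stands in for the unknown sigma^2):

* Hand-computed se(beta^_1) = sqrt(sigma^2-hat / SSTx) vs. regress output.
sysuse auto, clear
quietly regress price weight
scalar sig2 = e(rss)/(e(N) - 2)   // sigma^2-hat = SSR/(n-2)
quietly summarize weight
gen dx2 = (weight - r(mean))^2
quietly summarize dx2
scalar SSTx = r(sum)
display sqrt(sig2/SSTx)           // should match the next line
display _se[weight]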
What is Var(y|x)?
Under homoskedasticity, Var(y|x) = sigma^2, a constant measuring the variation in y given x.
What is SLR 5?
Homoskedasticity, u has the same variance given any value of x, Var(u|x) = sigma^2 which is the error variance.
What is Var(beta_1), the variance of the population parameter?
0; beta_1 is an unknown constant, not a random variable, so it has no variance.
State the change in a level-level model.
change in y = beta_1*(change in x)
State the change in a level-log model.
change in y = (beta_1/100)*(% change in x)
State the change in a log-level model.
% change in y = (100*beta_1)*(change in x)
State the change in a log-log model.
% change in y = beta_1*(% change in x), so beta_1 is the elasticity of y with respect to x.
How can you remember the interpretation of the functional forms?
A variable in logs is interpreted in % change terms. For level-log, divide beta_1 by 100; for log-level, multiply beta_1 by 100; for log-log, beta_1 is already an elasticity.
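As an illustration of the log-level case, a short Stata sketch (auto data as an arbitrary example):

* Log-level model: 100*beta^_1 approximates the % change in y
* for a one-unit increase in x.
sysuse auto, clear
gen lprice = ln(price)
regress lprice weight
display 100*_b[weight]    // approximate % change in price per extra pound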