Linear Regression Flashcards
Formula
Y (dependent variable) = intercept (b0) + slope (b1) x X + error term
Intercept
Predicted value of Y when X = 0
slope
Change in Y for a one-unit change in X
Slope coefficient
b1 = Cov(X, Y) / Variance of X
Intercept
b0 = mean Y - b1 x mean X
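A minimal Python sketch of the slope and intercept formulas above, using made-up data (the numbers are illustrative only):

```python
# Made-up data for illustration
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope b1 = Cov(X, Y) / Var(X)
cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (n - 1)
var_x = sum((x - mean_x) ** 2 for x in xs) / (n - 1)
b1 = cov_xy / var_x

# Intercept b0 = mean(Y) - b1 * mean(X)
b0 = mean_y - b1 * mean_x
# b1 = 1.96, b0 = 0.14 for this data
```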
assumptions
- Linear relationship between X and Y
- Variance of the error terms is constant (homoskedasticity)
  homo (same) + skedasticity (scatter)
- Error terms are normally distributed, independent, and uncorrelated
SSE (sum of squared errors)
Sum of (actual Y - predicted Y) squared; the unexplained variation
RSS (regression sum of squares)
Sum of (predicted Y - mean Y) squared; the explained variation
SST
RSS + SSE (total variation: sum of (actual Y - mean Y) squared)
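A quick sketch showing the decomposition SST = RSS + SSE on made-up data (the fitted line b0 = 0.14, b1 = 1.96 is the OLS fit for this illustrative data):

```python
# Made-up data and its fitted line
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]
b0, b1 = 0.14, 1.96                # OLS fit for this data
mean_y = sum(ys) / len(ys)
y_hat = [b0 + b1 * x for x in xs]  # predicted values

sse = sum((y - yh) ** 2 for y, yh in zip(ys, y_hat))  # unexplained
rss = sum((yh - mean_y) ** 2 for yh in y_hat)         # explained
sst = sum((y - mean_y) ** 2 for y in ys)              # total
# sst == rss + sse (up to floating-point rounding)
```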
Anova table
DF
Regression (RSS): 1
Residual (SSE): n - 2
Total (SST): n - 1
Mean square regression (MSR)
RSS / DF = RSS / 1
Mean square error (MSE)
SSE / DF = SSE / (n - 2)
F stat - does the single independent variable have explanatory power?
MSR / MSE
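A sketch of the F statistic built from hypothetical sums of squares (the numbers are made up):

```python
# Hypothetical sums of squares from a simple regression with n = 5
n = 5
rss = 38.416  # explained (regression) sum of squares
sse = 0.092   # unexplained (residual) sum of squares

msr = rss / 1        # mean square regression, DF = 1
mse = sse / (n - 2)  # mean square error, DF = n - 2
f_stat = msr / mse   # a large F suggests the slope has explanatory power
```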
R-squared, the coefficient of determination (how much of the variation in Y we have been able to explain with the independent variable and the linear relationship)
RSS/ SST (explained / total)
R2 = pxy squared (in simple regression, R2 equals the squared correlation between X and Y)
Standard error of estimate (SEE) = square root of MSE
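A sketch computing R-squared and the SEE from hypothetical sums of squares:

```python
import math

# Hypothetical sums of squares with n = 5 observations
rss, sse = 38.416, 0.092
sst = rss + sse
n = 5

r_squared = rss / sst           # share of total variation explained
see = math.sqrt(sse / (n - 2))  # standard error of estimate = sqrt(MSE)
```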
t test
t = (slope - hypothesized value) / standard error of the slope
Compare to the critical t with n - 2 DF at the chosen significance level
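A sketch of the slope t-test with hypothetical regression output (the slope and its standard error are made up):

```python
# Hypothetical regression output
b1 = 1.96      # estimated slope
b1_null = 0.0  # hypothesized value under H0
se_b1 = 0.055  # standard error of the slope estimate (made up)
n = 5

t_stat = (b1 - b1_null) / se_b1  # compare to critical t with n - 2 DF
df = n - 2
```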
confidence interval
Predicted value of Y +/- critical t x standard error of the forecast
(critical t uses n - 2 DF for n observations)
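A sketch of the interval with hypothetical numbers (3.182 is the two-tailed 5% critical t for 3 DF; the forecast and its standard error are made up):

```python
# Hypothetical forecast and its standard error
y_pred = 6.02       # predicted value of Y at some X
se_forecast = 0.20  # made-up standard error of the forecast
t_crit = 3.182      # two-tailed 5% critical t with n - 2 = 3 DF

lower = y_pred - t_crit * se_forecast
upper = y_pred + t_crit * se_forecast
```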
functional forms
Used when the plain linear form has no explanatory power
log = relative (percentage) change; lin = absolute change
log-lin (log Y, linear X): a unit change in X gives a relative change in Y
lin-log (linear Y, log X): a relative change in X gives an absolute change in Y
log-log (both logged): the slope is an elasticity (% change in Y per % change in X)
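A sketch illustrating the log-log elasticity interpretation with made-up coefficients:

```python
import math

# In a log-log model ln(Y) = b0 + b1 * ln(X), b1 is an elasticity:
# a 1% change in X produces roughly a b1% change in Y.
b0, b1 = 0.5, 2.0  # made-up coefficients

def y(x):
    return math.exp(b0 + b1 * math.log(x))

pct_change = (y(1.01) - y(1.0)) / y(1.0)  # X up by 1%
# pct_change is about 0.0201, i.e. roughly b1 * 1% = 2%
```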