Chapter 10 Flashcards
Variation in Y
Definition - Simple linear Regression
Explains variation in a DEPENDENT variable in terms of the variation in a SINGLE INDEPENDENT variable.
Definition - Dependent
variable whose variation is explained by the independent variable.
Also called the EXPLAINED variable or the predicted variable.
Definition - Independent
variable used to explain the variation of the dependent variable.
Ordinary Least Squares Line
Error Term
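The two cards above can be summarized with the regression model in standard notation (a sketch, not taken verbatim from the chapter):

```latex
% OLS regression line with error term
Y_i = b_0 + b_1 X_i + \varepsilon_i
% \varepsilon_i is the error term: the part of Y_i
% not explained by the line b_0 + b_1 X_i
```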
Intercept Interpretation for ABC stock excess return of -2.3
An intercept of -2.3 means the predicted excess return of ABC stock is -2.3% when the independent variable equals zero.
Slope interpretation for ABC stock excess return of 0.64%
A slope of 0.64 means that for every 1% increase in the independent variable, the excess return of ABC stock is predicted to increase by 0.64%.
What does it mean when a variable has a hat?
Hat means predicted value
Slope Coefficient
Intercept
Regression line passes through (Xbar, Ybar)
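A minimal sketch (with made-up data, not from the chapter) of how the slope and intercept are computed, and a check that the fitted line passes through (Xbar, Ybar):

```python
def ols_fit(x, y):
    """Return (b0, b1) for the least-squares line Y-hat = b0 + b1*X."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Slope: b1 = Cov(X, Y) / Var(X)
    cov_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    var_x = sum((xi - x_bar) ** 2 for xi in x)
    b1 = cov_xy / var_x
    # Intercept: b0 = Ybar - b1 * Xbar, so the line passes through the means
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Hypothetical observations (illustrative only)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.1, 3.9, 5.2]
b0, b1 = ols_fit(x, y)
# The regression line always passes through (Xbar, Ybar) = (3.0, 3.06)
assert abs((b0 + b1 * 3.0) - 3.06) < 1e-9
```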
Assumption of linear Regression
(1) Linear relationship between the dependent and independent variables
(2) Variance of error terms is constant (homoskedasticity)
(3) Error terms are independently distributed (i.e. uncorrelated with each other)
(4) Error terms are normally distributed.
Violation of (3) is called serial or auto-correlation.
homoskedasticity
case where prediction errors all have the same constant variance
heteroskedasticity
variance of the error terms not being constant.
total sum of squares (SST)
ANOVA definition
Analysis of variance (ANOVA) is a statistical procedure for analyzing the total variability of the dependent variable
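The total variability measured by SST, in standard notation (a sketch, not verbatim from the chapter):

```latex
SST = \sum_{i=1}^{n} (Y_i - \bar{Y})^2
```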
sum of squares regression (SSR)
mean square regression (MSR)
K = 1 for simple regression (number of independent variables)
sum of squared errors (SSE)
Relationship with SST, SSR, and SSE
mean square error (MSE)
K = 1 for simple regression (number of independent variables)
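The sums of squares and mean squares above, in standard notation (a sketch, not verbatim from the chapter):

```latex
SSR = \sum_{i=1}^{n} (\hat{Y}_i - \bar{Y})^2, \qquad MSR = \frac{SSR}{k}
SSE = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2, \qquad MSE = \frac{SSE}{n - k - 1}
% with k = 1 for simple regression: MSR = SSR and MSE = SSE / (n - 2)
SST = SSR + SSE
```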
coefficient of determination (R2)
Definition
Example R2 of 0.63
Relationship with R2 and correlation coeff, r
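A sketch with hypothetical data showing that R² = SSR/SST and that, in simple regression, R² equals the square of the correlation coefficient r (the sign of r matches the sign of the slope):

```python
import math

def r_squared(x, y):
    """R^2 = SSR / SST for a simple linear regression."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
          / sum((xi - x_bar) ** 2 for xi in x))
    b0 = y_bar - b1 * x_bar
    y_hat = [b0 + b1 * xi for xi in x]
    sst = sum((yi - y_bar) ** 2 for yi in y)       # total variation
    ssr = sum((yh - y_bar) ** 2 for yh in y_hat)   # explained variation
    return ssr / sst

# Hypothetical observations (illustrative only)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.1, 3.9, 5.2]
r2 = r_squared(x, y)
# Correlation coefficient r, computed directly
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
cov = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
r = cov / math.sqrt(sum((xi - x_bar) ** 2 for xi in x)
                    * sum((yi - y_bar) ** 2 for yi in y))
# For simple regression, r^2 equals R^2
assert abs(r ** 2 - r2) < 1e-9
```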
Standard Error of Estimate (SEE)
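The SEE in standard notation (a sketch, not verbatim from the chapter); it measures the typical size of the residuals:

```latex
SEE = \sqrt{\frac{SSE}{n - 2}} = \sqrt{MSE}
```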
The F-Statistic
Definition
Tail
Formula
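The F-statistic in standard notation (a sketch, not verbatim from the chapter):

```latex
F = \frac{MSR}{MSE} = \frac{SSR / k}{SSE / (n - k - 1)}
% df_{num} = k = 1 and df_{den} = n - 2 for simple regression
% always a one-tailed (right-tail) test
```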
Hypothesis Test of a Regression Coefficient
Hypothesis Test of a Regression Coefficient (t-statistic for b1)
Standard Error of Slope Coefficient
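The t-statistic and the standard error of the slope, in standard notation (a sketch, not verbatim from the chapter):

```latex
t = \frac{\hat{b}_1 - b_1}{s_{\hat{b}_1}}, \qquad df = n - 2
s_{\hat{b}_1} = \frac{SEE}{\sqrt{\sum_{i=1}^{n} (X_i - \bar{X})^2}}
% reject H_0 if t > +t_{critical} or t < -t_{critical}
```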
t-test for simple linear regression is equivalent to what?
The t-test for a simple linear regression is equivalent to a t-test for the correlation coefficient between X and Y,
where r is the correlation coefficient.
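The equivalent t-test on the correlation coefficient, in standard notation (a sketch, not verbatim from the chapter):

```latex
t = \frac{r \sqrt{n - 2}}{\sqrt{1 - r^2}}, \qquad df = n - 2
```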
For a simple regression, this is the predicted (or forecast) value of Y:
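The predicted value in standard notation (a sketch, not verbatim from the chapter):

```latex
\hat{Y} = \hat{b}_0 + \hat{b}_1 X
```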
Confidence Intervals for Predicted Values
Definition and formula (two parts)
Confidence intervals estimate a prediction interval around a predicted value.
where:
SEE^2 = variance of the residuals = the square of the standard error of estimate
sx^2 = variance of the independent variable
X = value of the independent variable for which the forecast was made
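The prediction interval, in standard notation consistent with the "where" terms above (a sketch, not verbatim from the chapter):

```latex
\hat{Y} \pm t_c \times s_f
s_f^2 = SEE^2 \left[ 1 + \frac{1}{n} + \frac{(X - \bar{X})^2}{(n - 1)\, s_x^2} \right]
% t_c = two-tailed critical t-value with df = n - 2
```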
Functional Forms
When the relationship between X and Y is NOT linear, fitting a linear model gives biased predictions.
If you transform one or both variables by taking their natural log, the relationship between the transformed variables may become linear.
Natural Log Transformation
- Log-Lin
- Lin-Log
- Log-Log
Y is the dependent variable
X is the independent variable
Log-Lin - taking the natural log of the Y variable only. The dependent variable is logarithmic, while the independent variable is linear.
Lin-Log - taking the natural log of the X variable only. The dependent variable is linear, while the independent variable is logarithmic.
Log-Log - taking the natural log of both X and Y. Both the dependent variable and the independent variable are logarithmic.
Log-Lin Model
Lin-Log Model
Log-Log Model
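The three functional forms, in standard notation (a sketch, not verbatim from the chapter):

```latex
\text{Log-Lin:} \quad \ln Y = b_0 + b_1 X
\text{Lin-Log:} \quad Y = b_0 + b_1 \ln X
\text{Log-Log:} \quad \ln Y = b_0 + b_1 \ln X
```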