Lecture 4 - Model Fit Flashcards
b1 estimates what
Parameter for a predictor
- Direction of relationship/effect
- Difference in means
b0 estimates what
The value of the outcome when the predictors = 0 (the intercept)
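As a minimal sketch (with made-up illustrative data), b0 and b1 can be estimated by ordinary least squares; the slope b1 gives the direction/size of the effect, and the intercept b0 is the predicted outcome when the predictor equals 0:

```python
import numpy as np

# Hypothetical data: one predictor x and outcome y (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Ordinary least squares fit for y = b0 + b1*x
# np.polyfit returns coefficients highest order first: [b1, b0]
b1, b0 = np.polyfit(x, y, 1)

print(b1)  # slope: positive here, so y increases with x
print(b0)  # intercept: predicted y when x = 0
```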
Sums of squares represent
Total error
Because sums of squares are totals they can only be compared when
They are based on the same number of scores
df=
Degrees of freedom
N - p
Number of scores - number of parameters
What is SST
Total sum of squares
Total variability (variability between the observed scores and the grand mean)
What is SSR
Residual sum of squares
- Total residual/error variability (variability between the model and the observed data)
- How badly the model fits (in total)
What is SSM
Model sum of squares
- Total model variability (difference in variability between the model and the grand mean)
- How much better the model is at predicting Y than the mean
- How well the model fits (in total)
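The three sums of squares above can be computed directly from the data, the model's predictions, and the grand mean. A sketch with hypothetical data; for an OLS model the identity SST = SSM + SSR holds:

```python
import numpy as np

# Hypothetical data (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1, b0 = np.polyfit(x, y, 1)  # OLS fit: y = b0 + b1*x
y_hat = b0 + b1 * x           # model predictions
y_bar = y.mean()              # grand mean (the baseline "model")

sst = np.sum((y - y_bar) ** 2)      # SST: total variability
ssr = np.sum((y - y_hat) ** 2)      # SSR: model vs observed data
ssm = np.sum((y_hat - y_bar) ** 2)  # SSM: model vs grand mean

print(sst, ssm, ssr)  # SST decomposes as SSM + SSR
```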
Each sum of squares has
Associated degrees of freedom
The df is what in relation to SS
The amount of independent information available to compute SS (Sum of squares)
For each parameter estimated we lose
1 piece of independent information
To get SST we estimate
1 parameter (the grand mean)
dfT=
N - p
Number of pieces of info - number of parameters (here p = 1, so dfT = N - 1)
To get SSR we estimate
2 parameters (b0 and b1), so dfR = N - 2
dfM=
dfT - dfR (= the number of predictors)
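Putting the df rules together for simple regression, a short sketch (assuming N = 5 scores and one predictor, so the model estimates 2 parameters):

```python
N = 5                # number of scores (hypothetical sample size)
df_T = N - 1         # SST estimates 1 parameter (the grand mean)
df_R = N - 2         # SSR estimates 2 parameters (b0 and b1)
df_M = df_T - df_R   # = number of predictors

print(df_T, df_R, df_M)  # 4 3 1
```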