4. Model Fit and Multiple Predictors Flashcards
b1 is an estimate of…
the parameter for a predictor
-> direction/strength of the relationship/effect
-> difference in means
b0 is an estimate of…
the value of the outcome when predictor(s) = 0 (intercept)
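To make these two estimates concrete, here is a minimal sketch (not part of the original cards) that fits a simple linear model with NumPy; the data are invented for illustration:

```python
# Minimal sketch: estimating b0 and b1 with NumPy on made-up data.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # predictor
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # outcome

b1, b0 = np.polyfit(x, y, 1)  # degree-1 fit returns slope, then intercept
print(f"b1 (slope)     = {b1:.3f}")  # direction/strength of the effect
print(f"b0 (intercept) = {b0:.3f}")  # predicted outcome when predictor = 0
```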
What do sums of squares represent?
Total error
Because sums of squares are totals, we can compare them only when…
they are based on the same number of scores
Alternatively, we factor in…
the number of scores
When comparing sums of squares, we can get the average error by…
dividing by a function of the number of scores
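As a hedged illustration of why the division matters (the sample data below are invented), the same underlying variability produces a much larger SS for a larger sample, while the averaged version stays comparable:

```python
# Sketch: sums of squares grow with the number of scores, so dividing
# by a function of N (here N - 1) yields a comparable average error.
import numpy as np

rng = np.random.default_rng(1)
for n in (10, 1000):
    scores = rng.normal(loc=5, scale=2, size=n)
    ss = np.sum((scores - scores.mean()) ** 2)  # total squared error
    ms = ss / (n - 1)                           # average squared error
    print(f"N = {n:4d}  SS = {ss:8.1f}  average = {ms:.2f}")
```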
What is meant by the illusory truth effect (ITE)?
Repetition increases perceived truthfulness.
This is equally true for plausible and implausible statements.
Each total sum of squared errors (SS(T)) has what associated with it?
Degrees of freedom (df)
What are degrees of freedom (df)?
The amount of independent information available to compute an SS
For each parameter (p) estimated, we lose…
1 piece of independent information, so df = N - p
How do we get the degrees of freedom for the residual sum of squared errors (SS(R))?
To begin with, we have N pieces of independent information. To get SS(R) we estimate two parameters (b0 and b1), so df = N - 2.
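Continuing the invented data from the first sketch, a hedged example of SS(R) and its degrees of freedom:

```python
# Sketch: residual sum of squared errors and df = N - 2 for a model
# with two estimated parameters (b0 and b1). Data are made up.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)   # observed minus model-predicted values

ss_r = np.sum(residuals ** 2)   # residual sum of squared errors
df_r = len(y) - 2               # N pieces of information, minus 2 parameters
print(f"SS(R) = {ss_r:.3f}, df = {df_r}")
```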
The model sum of squared errors (SS(M)) is a rotation of the null model. What one piece of information distinguishes the null model from the estimated model?
The slope (b1), so SS(M) has df = 1
(Note: the intercept, b0, co-depends on the slope; it is not an independent piece of information)
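A hedged sketch (same invented data) of SS(M) as the improvement of the fitted model over the null model, i.e. the grand mean:

```python
# Sketch: SS(M) = SS(T) - SS(R); only the slope distinguishes the
# fitted model from the null model, so SS(M) has df = 1.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1, b0 = np.polyfit(x, y, 1)
ss_t = np.sum((y - y.mean()) ** 2)        # errors of the null model (grand mean)
ss_r = np.sum((y - (b0 + b1 * x)) ** 2)   # errors of the fitted model
ss_m = ss_t - ss_r                        # improvement attributable to the slope
print(f"SS(T) = {ss_t:.3f}, SS(R) = {ss_r:.3f}, SS(M) = {ss_m:.3f}, df = 1")
```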
A sum/total of squared errors depends on…
The amount of information used to compute it
The average or mean squared error can be computed by…
Dividing the SS by the amount of information used to compute it (its degrees of freedom)
What is the mean squared error MS(R)?
- Average residual/error variability (variability between the model and the observed data)
- How badly the model fits (on average)
What is the mean squared error MS(M)?
- Average model variability (difference in variability between the model and the grand mean)
- How much better the model is at predicting Y than the mean
- How well the model fits (on average)
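Putting the pieces together, a hedged sketch (same invented data) of both mean squares:

```python
# Sketch: MS(R) = SS(R) / (N - 2) is the average misfit; MS(M) = SS(M) / 1
# is the average improvement of the model over the grand mean.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1, b0 = np.polyfit(x, y, 1)
ss_t = np.sum((y - y.mean()) ** 2)
ss_r = np.sum((y - (b0 + b1 * x)) ** 2)
ss_m = ss_t - ss_r

ms_r = ss_r / (len(y) - 2)   # how badly the model fits, on average
ms_m = ss_m / 1              # how much better the model does than the mean
print(f"MS(R) = {ms_r:.3f}, MS(M) = {ms_m:.3f}")
```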