4. Model Fit and multiple predictors Flashcards
b1 is an estimate of…
parameter for a predictor
-> direction/strength of relationship/effect
-> difference in means
b0 is an estimate of…
the value of the outcome when predictor(s) = 0 (intercept)
What do sums of squares represent?
Total error
Because sums of squares are totals, we can compare them only when…
Alternatively, we factor in…
When they are based on the same number of scores
the number of scores
When comparing sums of squares, we can get the average error by…
divide by a function of the number of scores
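The point of the last two cards can be shown with a tiny sketch (the scores are made up): two samples with identical spread give different raw SS just because N differs, but dividing by a function of N makes them comparable.

```python
# Toy example (hypothetical scores): raw SS grows with N,
# but the average error does not.
scores_a = [1.0, 3.0, 5.0]                  # 3 scores
scores_b = [1.0, 3.0, 5.0, 1.0, 3.0, 5.0]  # same spread, 6 scores

def ss(scores):
    """Sum of squared errors around the mean."""
    mean = sum(scores) / len(scores)
    return sum((x - mean) ** 2 for x in scores)

# SS doubles just because N doubled...
print(ss(scores_a))  # 8.0
print(ss(scores_b))  # 16.0

# ...but dividing by a function of the number of scores
# (here simply N) gives the same average error for both:
print(ss(scores_a) / len(scores_a))  # ~2.67
print(ss(scores_b) / len(scores_b))  # ~2.67
```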
What is meant by the illusory truth effect (ITE)?
Repetition increases perceived truthfulness.
This is equally true for plausible and implausible statements
Each total sum of squared errors (SS(T)) has what associated with it?
Degrees of freedom (df)
What are degrees of freedom?
The amount of independent information available to compute SS
For each parameter (p) estimated we lose…?
1 piece of independent information
How do we get the residual sum of squared errors (SS(R))?
To begin with we have N pieces of independent information.
To get SS(R) we estimate two parameters (b0 and b1), so df = N - 2.
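The df bookkeeping in this card can be sketched in a few lines, assuming a simple regression with one predictor and a hypothetical sample size:

```python
# Degrees-of-freedom bookkeeping for a simple regression
# (one predictor); N = 20 is a made-up sample size.
N = 20
p = 2                  # parameters estimated for SS(R): b0 and b1
df_total = N - 1       # SS(T): one parameter (the grand mean) estimated
df_residual = N - p    # SS(R): b0 and b1 estimated
df_model = df_total - df_residual  # SS(M): the one extra parameter, b1
print(df_total, df_residual, df_model)  # 19 18 1
```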
The model sum of squared errors (SS(M)) is a rotation of the null model. What one piece of info are the null model and the estimated model distinguished by?
The slope (b1)
(Note, the intercept, b0, co-depends on the
slope - it is not an independent piece of information)
A sum/total of squared errors depends on…
The amount of information used to compute it
The average or mean squared error can be computed by…
Dividing the SS by the amount of information used to compute it
What is the mean squared error MS(R)?
- Average residual/error variability (variability between the model and the observed data)
- How badly the model fits (on average)
What is the mean squared error MS(M)?
- Average model variability (difference in variability between the model and the grand mean)
- How much better the model is at predicting Y than the mean
- How well the model fits (on average)
If the model results in better prediction than using the mean, then ____ should be greater than ____
MS(M) should be greater than MS(R)
What is the F statistic?
The ratio of MS(M) to MS(R) (the good to shit ratio)
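The F-ratio cards above can be made concrete with a small worked example. This is only a sketch with made-up x and y values, computing every quantity on the cards by hand:

```python
# Worked F-ratio for a simple regression with made-up data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
N = len(x)
mx = sum(x) / N
my = sum(y) / N

# Least-squares estimates of b1 (slope) and b0 (intercept).
b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
     / sum((xi - mx) ** 2 for xi in x)
b0 = my - b1 * mx
pred = [b0 + b1 * xi for xi in x]

ss_t = sum((yi - my) ** 2 for yi in y)                  # null-model error
ss_r = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))   # model error
ss_m = ss_t - ss_r                                      # improvement due to model

ms_m = ss_m / 1        # df(M) = 1: the slope
ms_r = ss_r / (N - 2)  # df(R) = N - 2: b0 and b1 estimated
F = ms_m / ms_r
print(F)  # ~4.5 for this data: MS(M) > MS(R), model beats the mean
```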
What is R^2?
The proportion of variance accounted for by the model
The Pearson correlation coefficient between observed and predicted scores, squared
What is adjusted R^2?
An estimate of R^2 in the population (adjusted for shrinkage)
What do the following three ways of entering predictors mean?
1. Hierarchical
2. Forced entry
3. Stepwise
1. Hierarchical: the experimenter decides the order in which predictors are entered into the model; best for theory testing
2. Forced entry: all predictors are entered simultaneously
3. Stepwise: predictors are selected using their semi-partial correlation with the outcome; can produce spurious results, so use only for exploratory analysis
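The hierarchical case above can be illustrated by fitting a known predictor first, then adding a new one and checking the change in R^2. A minimal sketch with simulated (made-up) data and two hypothetical predictors:

```python
import numpy as np

# Hierarchical entry sketch: enter x1 (the "known" predictor) first,
# then test whether adding x2 improves R^2. Data are simulated.
rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 3.0 + 1.5 * x1 + 0.8 * x2 + rng.normal(size=n)

def r2(X, y):
    """R^2 = 1 - SS(R)/SS(T) for a least-squares fit of X on y."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    ss_r = np.sum((y - X @ b) ** 2)
    ss_t = np.sum((y - y.mean()) ** 2)
    return 1 - ss_r / ss_t

step1 = r2(np.column_stack([np.ones(n), x1]), y)      # step 1: x1 only
step2 = r2(np.column_stack([np.ones(n), x1, x2]), y)  # step 2: add x2
print(step1, step2)  # step2 > step1: x2 adds explanatory power
```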
We evaluate fit of a general linear model using…?
Sums of Squared Errors (SS)
What do the following stand for?
1. SS(T)
2. SS(R)
3. SS(M)
- SST = the total variance/error in the observed scores (around the grand mean)
- SSR = the total variance/error between the predicted scores and the observed scores
- SSM = the total reduction in variance/error due to the model (SST - SSR)
What do the following mean squared errors (MS) mean?
1. MS(R)
2. MS(M)
- MSR = the average variance/error between the predicted and observed scores
- MSM = the average reduction in variance/error due to the model
What is ‘F’?
the ratio of the average variance accounted for by the model (MS(M)) to the model's average error in prediction (MS(R))
bs are the change in the outcome associated with a unit change in the predictor when…
other predictors are held constant (in red so important lol)
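That "holding other predictors constant" interpretation can be checked numerically. A sketch with simulated data and two hypothetical predictors x1 and x2:

```python
import numpy as np

# In y-hat = b0 + b1*x1 + b2*x2, b1 is the change in the prediction
# for a unit change in x1 with x2 held constant. Data are simulated.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.1, size=n)

X = np.column_stack([np.ones(n), x1, x2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

def predict(v1, v2):
    return b0 + b1 * v1 + b2 * v2

# Raising x1 by one unit while x2 stays fixed changes the
# prediction by exactly b1 (close to the true slope, 2.0):
print(predict(1.0, 0.3) - predict(0.0, 0.3))
```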