Visualising Regression & Hierarchical Regression (W10) ✅ Flashcards
What are the correlation coefficients for multiple regression?
Zero-order²: total variance in y explained by a given predictor on its own (including variance it shares with the other predictors), as a proportion of the total variance in y
Part²: unique variance explained by each predictor (excluding variance shared with the other predictors), as a proportion of the total variance in y
Partial²: unique variance explained by one predictor as a proportion of the variance in y left over after the other predictors’ explained variance is removed (all three are computed in the sketch below)
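A minimal sketch of all three in Python (numpy only, not SPSS output); the simulated data and the variable names x1, x2, y are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)            # x1 shares variance with x2
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

def residuals(a, b):
    """Residuals of a after regressing it on b (with intercept)."""
    X = np.column_stack([np.ones(len(b)), b])
    coef, *_ = np.linalg.lstsq(X, a, rcond=None)
    return a - X @ coef

zero_order = np.corrcoef(x1, y)[0, 1]             # x1 vs y, ignoring x2
part = np.corrcoef(residuals(x1, x2), y)[0, 1]    # x2 removed from x1 only
partial = np.corrcoef(residuals(x1, x2), residuals(y, x2))[0, 1]  # x2 removed from both

print(f"zero-order r^2 = {zero_order**2:.3f}")
print(f"part r^2       = {part**2:.3f}")
print(f"partial r^2    = {partial**2:.3f}")
```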
What is meant by hierarchical regression, and how does it compare to simple/multiple regression?
- Assesses whether adding predictor variables allows you to explain additional variance in the outcome variable
- Hierarchical regression: predictor variables are entered in a specified order of ‘steps’
-> The relative contribution of each ‘step’ can be evaluated in terms of what it adds to the prediction of the outcome variable, WHEREAS multiple regression enters all the predictor variables at the same time
-> Multiple regression therefore only tells you the overall explained variance and each predictor’s separate contribution (contrast the two in the sketch below)
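A minimal sketch of the stepped approach in Python with statsmodels (not SPSS); the simulated data and the names x1, x2, y are invented for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

# Step 1 enters x1 alone; Step 2 adds x2. A standard multiple regression
# would fit only the Step 2 model and skip the step-by-step comparison.
model1 = sm.OLS(y, sm.add_constant(x1)).fit()
model2 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print(f"Model 1 R^2 = {model1.rsquared:.3f}")
print(f"Model 2 R^2 = {model2.rsquared:.3f}")
```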
Why use Hierarchical Regression?
To examine the influence of predictor variable(s) on an outcome variable, after ‘controlling for’ (ruling out) the influence of other variables
How to read the change statistics? (hierarchical regression)
- Compares Model 1 (e.g. containing x1) with Model 2 (containing x1 and x2)
- ΔR²: how much additional variance in y is explained by x2 after the effects of x1 are controlled for
-> e.g. what additional percentage of variance in y did [x2] explain after the effects of [x1] were controlled for?
- ΔF: provides a measure of how much the model has improved the prediction of y (MSM), relative to the level of inaccuracy of the model (MSR), after the predictive power of the Step 1 variables has been partialled out (see the sketch below)
IMPORTANT! The change from the simplest model (intercept only) to Model 1 is simply Model 1’s own values
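A minimal sketch of computing ΔR² and ΔF for two nested models, assuming statsmodels; the data and names are invented for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

step1 = sm.OLS(y, sm.add_constant(x1)).fit()
step2 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

# Delta R^2: extra variance explained once the Step 1 predictor is in
delta_r2 = step2.rsquared - step1.rsquared

# Delta F: the nested-model F test comparing Step 2 against Step 1
f_change, p_value, df_diff = step2.compare_f_test(step1)
print(f"dR^2 = {delta_r2:.3f}, dF = {f_change:.2f}, p = {p_value:.4f}")
```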
How to read the statistics for each model in SPSS (hierarchical regression)
- Model Summary:
Assess the change in variance (including R, adjusted R², change in R², and change in F) for each model
- ANOVA: evaluating each model
-> Assesses whether the overall regression model (with all predictors included at that step) accounts for significantly more variance than the simplest model (b = 0)
- Coefficients: evaluating each predictor within each model
* The intercept (a)
* The slopes (b) for each predictor variable
* Beta: the slopes converted to standardised slopes (in standard-deviation units)
* t-test
=> Assesses whether the slope for that individual predictor differs significantly from zero (b = 0), i.e. whether that predictor accounts for significantly more variance than no predictor at all (see the sketch below)
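A minimal sketch of reading these values in Python with statsmodels (hypothetical data and names); Beta is obtained here by z-scoring all variables and refitting, one standard route to standardised slopes:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

fit = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print(fit.params)   # a (const) plus b for each predictor
print(fit.tvalues)  # t-test per predictor: is this slope different from 0?
print(fit.pvalues)

def z(v):
    return (v - v.mean()) / v.std(ddof=1)

# With z-scored variables the intercept is exactly 0, so the fitted
# slopes are the standardised slopes (Beta)
betas = sm.OLS(z(y), np.column_stack([z(x1), z(x2)])).fit().params
print(betas)
```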