Hierarchical Regression Flashcards
Zero-order^2 correlation
(overall) variance explained by one variable (IV) expressed as a proportion of the total variance in y (outcome variable / DV)
e.g. the same result as a simple regression that includes only age as a predictor
Part^2 correlation
unique variance explained by one variable expressed as a proportion of the total variance in y
(the difference between zero-order^2 and part^2 is the variance also explained by the other variable)
e.g. the variance explained by a variable that isn't explained by anything else
partial^2 correlation
unique variance explained by one variable as a proportion of the variance in y that remains after the variance explained by other predictors has been removed
- if a variable's explained variance overlaps completely with the removed predictor, it has no partial correlation
- if a variable explains a large proportion of the variance that remains once the other predictor is removed, it has a high partial correlation
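These three quantities can be sketched with toy data in Python (NumPy only). The data and variable names (age, naughty) are invented for illustration, not taken from any real output:

```python
import numpy as np

# Toy data: two correlated predictors, so zero-order^2, part^2 and
# partial^2 come out different. Names (age, naughty) are hypothetical.
rng = np.random.default_rng(42)
n = 500
age = rng.normal(size=n)
naughty = 0.6 * age + rng.normal(size=n)            # overlaps with age
y = 1.0 * age + 0.8 * naughty + rng.normal(size=n)

def r2(y, *predictors):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

R2_full = r2(y, age, naughty)            # both predictors (= model R^2)
zero2_age = r2(y, age)                   # zero-order^2: simple regression
part2_age = R2_full - r2(y, naughty)     # part^2: unique variance of age
partial2_age = part2_age / (1 - R2_full + part2_age)  # partial^2
```

With correlated predictors, part^2 is smaller than zero-order^2 (shared variance drops out) and partial^2 is larger than part^2 (the denominator shrinks).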
SPSS output for correlations
SPSS reports correlations, so we must square the zero-order, part and partial values it gives us to get the proportion of variance explained… x100 for a percentage
total explained variance shared by multiple predictors calculation
zero-order^2 - part^2 = shared variance
- should give the same value whichever variable it is computed from
total explained variance calculation
part^2 for BOTH variables + shared variance (counted once, as it is the same for each)
- same value as R^2
value for unexplained variance
1 - R^2
or 100 - the percentage of variance explained
partial^2 calculation
essentially removing all shared explained variance and all unique variance of the other variable from the denominator
= unique variance of the variable (part^2) / (unexplained variance (1 - R^2) + unique variance (part^2))
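The arithmetic on the last few cards can be checked with made-up numbers (illustrative only, not from any real SPSS output):

```python
# Made-up squared correlations in the spirit of the SPSS example;
# the numbers are illustrative only.
R2 = 0.50                        # total variance explained (model R^2)
zero2_a, part2_a = 0.40, 0.22    # predictor A: zero-order^2, part^2
zero2_b, part2_b = 0.28, 0.10    # predictor B: zero-order^2, part^2

shared_a = zero2_a - part2_a     # shared variance via predictor A
shared_b = zero2_b - part2_b     # same value via predictor B
total = part2_a + part2_b + shared_a      # reproduces R^2
unexplained = 1 - R2                      # 1 - R^2
partial2_a = part2_a / (unexplained + part2_a)   # partial^2 for A
```

Both routes to the shared variance give 0.18, the parts plus the shared variance rebuild R^2 = 0.50, and partial^2 for A comes out as 0.22 / 0.72.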
Hierarchical regression
predictor variables are entered in a specified order of 'steps', chosen on theoretical grounds, to assess what each step adds to the prediction of the outcome variable
why use hierarchical regression
when we want to examine the influence of a predictor variable on an outcome variable after ‘controlling for’ (partialling out) the influence of other variables
- step 1 contains what you are partialling out (controlling for): predictors whose influence is already established and that you don't want to affect your investigation of a previously unexplored or additional variable
SPSS output for hierarchical regression - model summary and R squared
Model 1 includes only the controlled predictor (e.g. naughty list rating, entered at step 1)
Model 2 includes both predictor variables (naughty list rating + age)
NOTE:
Model 2 R squared - Model 1 R squared = how much step 2 accounts for variation in the DV (e.g. introducing age at step 2 explained an additional 22% of the variation in the DV)
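A minimal sketch of the two steps and the R^2 change, assuming invented toy data (NumPy; the variable names follow the flashcards' example):

```python
import numpy as np

# Invented data: naughty-list rating is the step-1 (controlled)
# predictor, age is added at step 2, as in the flashcards' example.
rng = np.random.default_rng(1)
n = 300
naughty = rng.normal(size=n)
age = 0.5 * naughty + rng.normal(size=n)
y = 0.7 * naughty + 0.9 * age + rng.normal(size=n)

def r2(y, *predictors):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()

r2_model1 = r2(y, naughty)        # step 1: controlled predictor only
r2_model2 = r2(y, naughty, age)   # step 2: both predictors
delta_r2 = r2_model2 - r2_model1  # extra variance explained at step 2
```

delta_r2 is the "R square change" SPSS reports for Model 2 in the model summary.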
SPSS hierarchical regression ANOVA
- two ANOVA outputs
- model 1 ANOVA as controlled variable only
- model 2 as both variables
- Model 2 ANOVA assesses whether the overall regression model with all predictors included accounts for more variance than the simplest model (p < .05)
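The significance test for the R^2 change can be reproduced by hand with the standard F-change formula; a sketch with made-up numbers (assumes SciPy for the F distribution):

```python
from scipy import stats

# Standard F-change formula for adding k predictors at step 2.
# All numbers below are made up for illustration.
n = 100                  # sample size
p_full = 2               # number of predictors in Model 2
k = 1                    # predictors added at step 2
r2_m1, r2_m2 = 0.30, 0.52

df2 = n - p_full - 1
f_change = ((r2_m2 - r2_m1) / k) / ((1 - r2_m2) / df2)
p_value = stats.f.sf(f_change, k, df2)   # sf = 1 - cdf, upper-tail p
```

With these numbers the change is clearly significant (p < .001), matching what the "Sig. F Change" column in the SPSS model summary would show.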
SPSS Hierarchical regression coefficients table
top rows report Model 1, bottom rows Model 2
use the Model 2 values for the write-up (same as multiple regression)
SPSS hierarchical regression model summary table new statistics
used in the write-up to report whether the change in R^2 at Model 2 is significant
- change statistics compare step 1 (controlled predictors only) with step 2 (both predictors)
- use the bottom row (Model 2): R^2 change, F change and its p value
Hierarchical regression write up
- no design: open with RESULTS heading
- Hierarchical regression was used to investigate whether the DV can be predicted by the secondary IV, after controlling for the primary IV (the one we already know has an influence). Descriptive statistics in Table 1 (same as multiple regression)
primary results write up after table
The hierarchical regression was constructed to include the controlled variable at step 1, and the secondary variable (the one we don't yet know about) at step 2. Preliminary analyses revealed no violation of the assumptions of normality, linearity, multicollinearity or homoscedasticity.
results write up ; models explained
X alone (Model 1) explained/did not explain a significant proportion of variance in the DV, F + p (both from the ANOVA output) + R^2 (model summary table). Introducing X (the second variable) at step 2 explained an additional X% (model summary: Model 2 R^2 - Model 1 R^2) of the variation in the DV, and this change in R^2 was significant, F change + p value (model summary, bottom row).
results write up final model
The final model including both predictors explained X% (Model 2 R^2 from the model summary x100) of the variance in the DV. This was a significant/non-significant result, F + p (Model 2 ANOVA). IF RESULT SIGNIFICANT (if only one predictor is significant, report that one): regression coefficients revealed whether each association was positive or negative, then b + CIs + t + p (coefficients table, BOTTOM TWO ROWS - use the Model 2 values for both variables). Standardized regression coefficients indicate whether one variable is a stronger predictor than the other (state the direction only if significant).
discussion write up final
findings confirmed that the DV increases/decreases with the predictor we were testing, and this holds true even after the effects of the established predictor have been controlled for.