Hierarchical Multiple Regression Flashcards
rules for hierarchical MR/when to use
provides best prediction of criterion variable from predictors
predicted values fall on the regression line passing through the scatterplot
the regression line is fitted to minimise the residuals (differences between actual and predicted scores)
what does the regression model do?
statistically predicts the outcome (criterion) from the correlations between the predictors and the criterion
regression model is a combination of…
a weighted combination of the predictors, with coefficients chosen to maximise the variance explained in the criterion
regression equation
Y = b0 + b1X1 + b2X2 + … + error (Ŷ, the predicted score, is the same equation without the error term)
Y is the criterion/outcome variable
b0 is the intercept (constant)
b1 is the regression coefficient for the first predictor
X1 is the first predictor variable
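A minimal sketch in Python of how the equation turns the intercept and coefficients into a predicted score; the intercept, coefficients and predictor values below are made up purely for illustration.

```python
# Minimal sketch of Y-hat = b0 + b1*X1 + b2*X2 + ...
# All numbers here are hypothetical, purely to show how the terms combine.

def predict(b0, coefficients, predictor_values):
    """Predicted criterion score: intercept plus each coefficient times its predictor."""
    return b0 + sum(b * x for b, x in zip(coefficients, predictor_values))

# e.g. intercept 2.0, two predictors with coefficients 0.5 and 1.2
y_hat = predict(2.0, [0.5, 1.2], [10, 3])  # 2.0 + 0.5*10 + 1.2*3 = 10.6
print(y_hat)
```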
What does multicollinearity do to beta errors and R and what does this mean?
inflates the standard errors of the beta coefficients, limits the size of R, and makes it difficult to determine the importance of individual predictors
addressing issues with multicollinearity
check for errors in data entry and coding
reduce the number of predictor variables
delete predictors highly related to other predictors
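A minimal sketch, assuming pandas and statsmodels are available, of checking multicollinearity outside SPSS: compute each predictor's VIF and its tolerance (tolerance = 1/VIF). The predictor names and data are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical predictors; in practice this would be the real data set.
df = pd.DataFrame({
    "stress":   [3, 5, 2, 4, 6, 1, 5, 3],
    "optimism": [7, 4, 8, 5, 3, 9, 4, 6],
})
X = sm.add_constant(df)  # include the intercept column before computing VIFs

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(X.values, i)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")
```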
what is an outlier
a case whose value is substantially different from the rest of the data
impact of outliers
large impact on results of regression analysis
how are outliers detected?
scatterplot and residual plots
identifying outliers on SPSS tables
Residuals Statistics table (standardised residuals, Cook's distance)
what is a standardised residual
the residual rescaled by dividing it by the standard deviation of the residuals
what do standardised residuals do?
help identify where predicted score is quite different to actual score
identifying outliers from standardised residuals
standardised residuals falling outside +/- 3.3 indicate outliers
Tabachnick and Fidell (2007)
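A minimal sketch on simulated data (statsmodels OLS) of flagging cases whose standardised residuals, the residual divided by the standard deviation of the residuals as defined above, fall outside +/- 3.3.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(100, 2)))      # two simulated predictors + intercept
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=100)

fit = sm.OLS(y, X).fit()
std_resid = fit.resid / np.std(fit.resid, ddof=1)   # residual / SD of residuals
print("cases with |standardised residual| > 3.3:", np.where(np.abs(std_resid) > 3.3)[0])
```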
what does Cook's distance show
the influence of each case: how much the regression results would change if that case were deleted
value for Cook's distance and meaning
values > 1 are considered large (potentially influential cases)
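A minimal sketch, again on simulated data, of pulling Cook's distance from a fitted statsmodels model and flagging any case with a value greater than 1.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
X = sm.add_constant(rng.normal(size=(100, 2)))
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=100)

# Cook's distance: how much the fitted results would shift if each case were deleted.
cooks_d, _ = sm.OLS(y, X).fit().get_influence().cooks_distance
print("largest Cook's distance:", round(cooks_d.max(), 3))
print("cases with Cook's D > 1:", np.where(cooks_d > 1)[0])
```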
brief explanation of how hierarchical MR is done in SPSS
sequential
predictors entered in equation in specified order (blocks)
each block is assessed on what it adds to the prediction over the previous blocks (see the sketch below)
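A rough Python illustration of the same block logic, assuming hypothetical variables: age is entered as the control in block 1, optimism is added in block 2, and the R² change with its F test shows what block 2 adds to the prediction of wellbeing.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({"age": rng.normal(40, 10, 120),
                   "optimism": rng.normal(0, 1, 120)})
df["wellbeing"] = 0.05 * df["age"] + 0.6 * df["optimism"] + rng.normal(0, 1, 120)

block1 = smf.ols("wellbeing ~ age", data=df).fit()             # block 1: control variable only
block2 = smf.ols("wellbeing ~ age + optimism", data=df).fit()  # block 2: add predictor of interest

r2_change = block2.rsquared - block1.rsquared
f_change, p_change, _ = block2.compare_f_test(block1)          # F test on the change
print(f"R2 change = {r2_change:.3f}, F change = {f_change:.2f}, p = {p_change:.4f}")
```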
steps in analysing hierarchical MR
check assumptions are met
assess model overall
evaluate predictor variables
formally report results
check assumptions are met step
sample size - N > 50 + 8m, where m is the number of predictors (see the sketch after this list)
multicollinearity - tolerance > .1, VIF < 10
normality - histogram of residuals
linearity - normal P-P plot
homoscedasticity - equal variance of residuals across predicted values
outliers - Residuals Statistics table: standardised residuals within +/- 3.3, Cook's distance < 1
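A tiny sketch of the N > 50 + 8m sample-size rule of thumb (m = number of predictors); the case and predictor counts are made up.

```python
def sample_size_ok(n_cases: int, n_predictors: int) -> bool:
    """Rule of thumb: need more than 50 + 8m cases for m predictors."""
    return n_cases > 50 + 8 * n_predictors

print(sample_size_ok(120, 2))  # True: 120 > 66
print(sample_size_ok(60, 3))   # False: 60 is not > 74
```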
assess model overall step
Variables Entered/Removed table - see what's included in each model
Model Summary - adjusted R² = % of variance explained
R² change - additional variance explained when the next block of predictors is entered
ANOVA table - F( , ) = , p < 0.001
the F value is taken from the F column; the degrees of freedom in brackets are the Regression and Residual df (see the sketch below)
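A minimal sketch on simulated data of where the reported figures sit on a fitted model: the F value, the Regression and Residual degrees of freedom that go in the brackets, and the p value.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
X = sm.add_constant(rng.normal(size=(100, 2)))
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=100)

fit = sm.OLS(y, X).fit()
# Reported as F(regression df, residual df) = F value, p = significance
print(f"F({int(fit.df_model)}, {int(fit.df_resid)}) = {fit.fvalue:.2f}, p = {fit.f_pvalue:.4f}")
```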
evaluating predictors step
Coefficients table - b values, standardised betas, and significance of each predictor
regression equation built from the b (unstandardised) values
formally reporting the results of hierarchical MR
A hierarchical MR was used to assess the ability of (predictor name) to predict (criterion name) after controlling for (other predictor variables)
Preliminary analyses were conducted to ensure no violation of the assumptions of normality, linearity, multicollinearity, homoscedasticity and sample size (state whether assumptions were met or not).
(name predictor variables) were entered into model 1, explaining (adj. R²)% of the variance in (criterion variable) (adj. R² = ), F( , ) = , p < 0.001 (state which variables were/were not significant predictors of the criterion variable) (p < 0.001).
After entry of (name another predictor variable) in model 2, the model explained (more/less) variance,
F( , ) = , p < 0.001
The total variance in (criterion) explained by the model as a whole was (adj. R²)% (adj. R² = ). (state significant/non-significant predictors in model 2) (p < 0.001)
Then interpret the results - explain what they mean in words.