Hierarchical Regression Flashcards

1
Q

Zero-order^2 correlation

A

(overall) variance explained by one of the variables (IV), expressed as a proportion of the total variance in y (outcome variable / DV)

e.g. the same R^2 as a simple regression that includes only age as a predictor
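
A minimal sketch in Python (made-up data; the variable names are hypothetical) of the point above - the squared zero-order correlation for one predictor equals the R^2 of a simple regression using only that predictor:

    import numpy as np

    rng = np.random.default_rng(42)
    age = rng.normal(40, 10, 200)            # hypothetical predictor (IV)
    y = 2.0 * age + rng.normal(0, 15, 200)   # hypothetical outcome (DV)

    r = np.corrcoef(age, y)[0, 1]            # zero-order correlation
    print(f"zero-order r^2 = {r**2:.3f}")    # proportion of variance in y explained

    # same value as R^2 from the simple regression of y on age
    slope, intercept = np.polyfit(age, y, 1)
    ss_res = np.sum((y - (slope * age + intercept)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    print(f"simple-regression R^2 = {1 - ss_res / ss_tot:.3f}")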

2
Q

Part^2 correlation

A

unique variance explained by one variable, expressed as a proportion of the total variance in y

(the difference between zero-order^2 and part^2 is the variance also explained by the other variable)

i.e. the variance in y explained by one variable that isn't explained by anything else
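
A minimal sketch in Python (made-up data with two correlated predictors; the variable names are hypothetical) of the part (semipartial) correlation - correlate the outcome with what is left of a predictor after the other predictor has been regressed out of it:

    import numpy as np

    rng = np.random.default_rng(1)
    naughty = rng.normal(5, 2, 300)                        # predictor 1
    age = 0.8 * naughty + rng.normal(0, 1.5, 300)          # predictor 2, correlated with 1
    y = 1.5 * naughty + 2.0 * age + rng.normal(0, 4, 300)  # outcome (DV)

    def residuals(x, control):
        # residuals of x after regressing it on the control variable
        X = np.column_stack([np.ones_like(control), control])
        beta, *_ = np.linalg.lstsq(X, x, rcond=None)
        return x - X @ beta

    sr = np.corrcoef(y, residuals(age, naughty))[0, 1]  # part correlation of age
    print(f"part^2 for age = {sr**2:.3f}")              # unique variance in y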

3
Q

partial^2 correlation

A

unique variance explained by one variable, as a proportion of the variance in y that remains after the variance explained by the other predictors has been removed (see the sketch after the bullets below)

  • if the other variable completely overlaps the variable of interest, nothing unique is left once it is removed, so there is no partial correlation
  • if the removed variable takes up a large proportion of the variance in y, the remaining variable can have a high partial correlation, because little unexplained variance is left in the denominator
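
A minimal sketch in Python (same made-up variables as the previous sketch) of the partial correlation - unlike the part correlation, the control variable is regressed out of BOTH the predictor of interest and the outcome:

    import numpy as np

    rng = np.random.default_rng(1)
    naughty = rng.normal(5, 2, 300)
    age = 0.8 * naughty + rng.normal(0, 1.5, 300)
    y = 1.5 * naughty + 2.0 * age + rng.normal(0, 4, 300)

    def residuals(x, control):
        # residuals of x after regressing it on the control variable
        X = np.column_stack([np.ones_like(control), control])
        beta, *_ = np.linalg.lstsq(X, x, rcond=None)
        return x - X @ beta

    pr = np.corrcoef(residuals(y, naughty), residuals(age, naughty))[0, 1]
    print(f"partial^2 for age = {pr**2:.3f}")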
4
Q

SPSS output for correlations

A

SPSS reports correlations, so we must square the outputs it gives us to get the proportion of variance explained; multiply by 100 for a percentage

5
Q

calculating the explained variance shared by multiple predictors

A

zero-order^2 - part^2 = shared variance
  • should be the same whichever predictor you calculate it from (e.g. .40 - .25 = .15 for one predictor, .25 - .10 = .15 for the other)

6
Q

total explained variance calculation

A

part^2 for BOTH predictors + shared variance (counted once, as it is the same value for each)

  • e.g. .25 + .10 + .15 = .50
  • same value as R^2
7
Q

value for unexplained variance

A

1 - R^2 (e.g. 1 - .50 = .50)
or 100 - (percentage of variance explained)

8
Q

partial^2 calculation

A

essentially removes the shared explained variance and the unique variance of the other variable from the comparison

partial^2 = unique variance of the variable (part^2) / (unexplained variance (1 - R^2) + that same unique variance (part^2))
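
A minimal sketch in Python tying together the calculations on the last few cards (shared variance, total R^2, unexplained variance, partial^2), using the same illustrative made-up numbers as above - not real output:

    zero_order_a_sq = 0.40  # zero-order^2 for predictor A (hypothetical value)
    part_a_sq = 0.25        # part^2 for predictor A (unique variance)
    part_b_sq = 0.10        # part^2 for predictor B

    shared = zero_order_a_sq - part_a_sq       # variance shared by the predictors
    r_sq = part_a_sq + part_b_sq + shared      # total explained variance (R^2)
    unexplained = 1 - r_sq                     # 1 - R^2
    partial_a_sq = part_a_sq / (unexplained + part_a_sq)

    print(f"shared = {shared:.2f}, R^2 = {r_sq:.2f}, "
          f"unexplained = {unexplained:.2f}, partial^2(A) = {partial_a_sq:.3f}")
    # shared = 0.15, R^2 = 0.50, unexplained = 0.50, partial^2(A) = 0.333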

9
Q

Hierarchical regression

A

predictor variables are entered in a specified order of 'steps', chosen on theoretical grounds according to what each variable adds to the prediction of the outcome variable

10
Q

why use hierarchical regression

A

when we want to examine the influence of a predictor variable on an outcome variable after 'controlling for' (partialling out) the influence of other variables

  • step 1 contains what you are partialling out / controlling for: variables whose influence is already established and that you don't want to affect your investigation of a previously unexplored or additional variable
11
Q

SPSS output for hierarchical regression - model summary and R squared

A

model 1: only the controlled predictor (e.g. naughty list rating - step 1)

model 2: both predictor variables (naughty list rating + age)

NOTE:
model 2 R squared - model 1 R squared = how much step 2 accounts for variation in the DV (e.g. introducing age at step 2 explained an additional 22% of the variation in the DV)
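
A minimal sketch in Python (made-up data; statsmodels used here as an assumed stand-in for the SPSS output) of the two-step model comparison and the R^2 change:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    naughty = rng.normal(5, 2, 200)                         # step-1 (controlled) predictor
    age = 0.5 * naughty + rng.normal(0, 2, 200)             # step-2 predictor
    dv = 3.0 * naughty + 2.0 * age + rng.normal(0, 5, 200)  # hypothetical outcome

    m1 = sm.OLS(dv, sm.add_constant(naughty)).fit()                          # model 1
    m2 = sm.OLS(dv, sm.add_constant(np.column_stack([naughty, age]))).fit()  # model 2

    delta_r2 = m2.rsquared - m1.rsquared
    print(f"model 1 R^2 = {m1.rsquared:.3f}, model 2 R^2 = {m2.rsquared:.3f}")
    print(f"step 2 explains an additional {delta_r2:.1%} of the variation in the DV")

    # F test for the step (analogous to SPSS's change statistics)
    f_change, p_change, _ = m2.compare_f_test(m1)
    print(f"F change = {f_change:.2f}, p = {p_change:.4f}")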

12
Q

SPSS hierarchical regression ANOVA

A
  • two ANOVA outputs
  • model 1 ANOVA covers the controlled variable only
  • model 2 ANOVA covers both variables
  • the model 2 ANOVA assesses whether the overall regression model, with all predictors included, accounts for significantly more variance than the simplest model (p < .05)
13
Q

SPSS Hierarchical regression coefficients table

A

top rows are for model 1, bottom rows for model 2

use the model 2 rows for the write-up (interpreted the same way as in multiple regression)

14
Q

SPSS hierarchical regression model summary table - change statistics

A

used in the write-up to report whether the R^2 change at model 2 is significant

  • the change statistics describe the move from step 1 to step 2, i.e. from the top row (just the controlled predictors) to the bottom row (both predictors)

use the bottom row

15
Q

Hierarchical regression write up

A
  • no design section: open with a RESULTS heading
  • Hierarchical regression was used to investigate whether the DV can be predicted by the secondary IV, after controlling for the primary IV (the one we already know has an influence). Descriptive statistics are presented in Table 1 (same as for multiple regression)
16
Q

primary results write up after table

A

The hierarchical regression was constructed to include the controlled variable at step 1, and the secondary variable (the one we don't yet know about) at step 2. Preliminary analysis revealed no violation of the assumptions of normality, linearity, multicollinearity or homoscedasticity.

17
Q

results write up; models explained

A

X alone (Model 1) explained / did not explain a significant proportion of variance in the DV, F + p (both from the ANOVA output) + R^2 (model summary table). Introducing X (the second variable) at step 2 explained an additional X% (model summary: model 2 R squared - model 1 R squared) of the variation in the DV, and this change in R^2 was significant, F change + p value (model summary, bottom row).

18
Q

results write up final model

A

The final model including both predictors (state them) explained X% (model 2 R square value from the model summary x 100) of the variance in the DV. This was a significant / not significant result, F + p (ANOVA, model 2). IF THE RESULT IS SIGNIFICANT (if only one predictor is significant, report that one): regression coefficients revealed that the association was positive or negative (state which), then b + CIs + t + p (coefficients table, BOTTOM TWO ROWS - for both variables, use the values from model 2). Standardised regression coefficients indicate that X is a stronger predictor than the other variable (only state the direction if significant).

19
Q

discussion write up final

A

findings confirmed that the DV increases/decreases with the predictor we were testing, and this holds true even after the effects of the predictor we already expected to matter have been controlled for.