An Introduction to Multiple Regression Flashcards

1
Q

How do you calculate a residual?

A

Residual = observed value − predicted value (the model's prediction error for that case).

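A minimal Python sketch of the calculation, using made-up numbers:

```python
import numpy as np

observed = np.array([4.0, 7.0, 5.5])    # actual outcome values (made up)
predicted = np.array([4.5, 6.0, 6.0])   # model's predicted values (made up)

residuals = observed - predicted        # residual = observed - predicted
print(residuals)                        # -> [-0.5  1.  -0.5]
```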
2
Q

What is a Partial Correlation?

A

It is the correlation between two variables while controlling for a third.

3
Q

What is a Semi-Partial Correlation?

A

It is the correlation between two variables while controlling for the effect of a third variable on only one of them (whereas a partial correlation removes the third variable's effect from both).

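A short Python sketch (synthetic data, illustrative variable names) showing both of the last two definitions, computed by correlating residuals:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=100)                  # the third (control) variable
x = z + rng.normal(size=100)
y = z + x + rng.normal(size=100)

def residualize(a, b):
    """Return the part of a that b cannot explain (residuals of a on b)."""
    slope, intercept = np.polyfit(b, a, 1)
    return a - (slope * b + intercept)

# Partial correlation: z's effect removed from BOTH x and y
partial_r = np.corrcoef(residualize(x, z), residualize(y, z))[0, 1]

# Semi-partial correlation: z's effect removed from x ONLY
semi_partial_r = np.corrcoef(residualize(x, z), y)[0, 1]

print(partial_r, semi_partial_r)
```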
4
Q

What are the 3 main things a Multiple Regression model can tell us?

A

How well the model fits and predicts the outcome overall.
How much variance in the outcome the model explains.
The importance of each individual predictor.

5
Q

What are the 3 main types of Multiple Regression?

A

Forced entry (all predictors entered at once).
Hierarchical (the researcher decides the order of entry).
Stepwise (SPSS decides the order of entry, based on statistical criteria).
6
Q

What program should you use to determine the sample size needed (which depends on the effect size)?

A

G*Power.

7
Q

What is R-Squared?

A

It is the proportion of variance accounted for by the model (the amount of variance in the DV that the model explains).

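A quick Python sketch of the calculation from sums of squares, with made-up values:

```python
import numpy as np

observed = np.array([3.0, 5.0, 7.0, 9.0])     # made-up outcome values
predicted = np.array([3.5, 4.5, 7.5, 8.5])    # made-up model predictions

ss_res = np.sum((observed - predicted) ** 2)        # residual sum of squares
ss_tot = np.sum((observed - observed.mean()) ** 2)  # total sum of squares
r_squared = 1 - ss_res / ss_tot                     # -> 0.95 here
```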
8
Q

How do we know if our model generalises well?

A

The closer R-Squared is to the Adjusted R-Squared, the better the model is likely to generalise to other samples.

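A sketch of the adjustment, using the standard formula with illustrative values for n (cases) and k (predictors):

```python
def adjusted_r_squared(r_squared, n, k):
    """Adjusted R-Squared for n cases and k predictors."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# e.g. R-Squared of 0.95 from 50 cases and 3 predictors:
print(adjusted_r_squared(0.95, n=50, k=3))  # ~0.947, close to 0.95
```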
9
Q

Why is R not useful?

A

Because in Multiple Regression there are several predictors, R (the multiple correlation) is hard to interpret on its own; R-Squared is more informative because it gives the proportion of variance explained by the whole model.

10
Q

Why is the Standardised Coefficients Beta important?

A

It allows us to compare predictors to decide which are the most important: the larger the absolute value of beta, the more important the variable is as a predictor.

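A Python sketch of the idea, assuming statsmodels is available; the data and coefficients are made up. Standardised betas are just the slopes after z-scoring every variable:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))                             # two predictors
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=100)

# z-score outcome and predictors; the slopes of this model are the betas
zX = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)

betas = sm.OLS(zy, zX).fit().params
print(betas)   # larger |beta| = more important predictor
```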
11
Q

When reporting the regression equation what are the coefficients also known as in SPSS?

A

Unstandardised B.

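A sketch of pulling the unstandardised B values out of a fitted model to write the equation; assumes statsmodels, with made-up data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 2))
y = 3.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=100)

b = sm.OLS(y, sm.add_constant(X)).fit().params  # [constant, B1, B2]
print(f"outcome = {b[0]:.2f} {b[1]:+.2f}*x1 {b[2]:+.2f}*x2")
```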
12
Q

What are the three assumptions of Multiple Regression pre-experiment?

A

The outcome variable should be continuous.
The predictor variables should be continuous or dichotomous.
There should be reasonable theoretical grounds for including each variable.

13
Q

What are the four assumptions of Multiple Regression post-experiment?

A

Linearity.
Homoscedasticity.
Normal distribution of residuals.
No multicollinearity.

14
Q

What is meant by linearity?

A

There should be a linear relationship between each predictor and the outcome. Partial plots should be checked for this.

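A sketch of this check with statsmodels' partial regression plot helper (`plot_partregress_grid`), on made-up data:

```python
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))
y = X @ np.array([1.0, 2.0]) + rng.normal(size=100)

res = sm.OLS(y, sm.add_constant(X)).fit()
sm.graphics.plot_partregress_grid(res)  # one panel per predictor
plt.show()                              # each panel should look linear
```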
15
Q

What is meant by homoscedasticity?

A

The variance of the residuals should be constant across all levels of the predicted values.

16
Q

What shape indicates heteroscedasticity?

A

Funnel/cone shape.

17
Q

What graph should be looked at when checking for homoscedasticity?

A

Graph of standardised residuals by standardised predicted values (ZRESID by ZPRED).
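
A sketch of this plot, assuming statsmodels, scipy, and matplotlib, with made-up data:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 2))
y = X @ np.array([1.0, -0.5]) + rng.normal(size=100)
model = sm.OLS(y, sm.add_constant(X)).fit()

zpred = stats.zscore(model.fittedvalues)  # ZPRED
zresid = stats.zscore(model.resid)        # ZRESID

plt.scatter(zpred, zresid)
plt.axhline(0, linestyle="--")
plt.xlabel("ZPRED")
plt.ylabel("ZRESID")
plt.show()  # random scatter = fine; funnel/cone = heteroscedasticity
```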

18
Q

What two graphs should be looked at when checking for normal distribution of residuals?

A

Histogram (should be bell shaped) + normal probability plot (points should be close to the diagonal).
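
A sketch of both checks, assuming statsmodels and matplotlib, with made-up data:

```python
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 2))
y = X @ np.array([1.0, 0.5]) + rng.normal(size=100)
resid = sm.OLS(y, sm.add_constant(X)).fit().resid

plt.hist(resid, bins=20)      # should look roughly bell-shaped
plt.show()
sm.qqplot(resid, line="s")    # points should lie close to the diagonal
plt.show()
```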

19
Q

What two statistics should you look at to check for no multicollinearity?

A

Tolerance + VIF statistic.

20
Q

What cut-offs must the tolerance + VIF statistics meet for there to be no multicollinearity?

A

VIF value should not be larger than 10.

Tolerance value should not be less than 0.1 (although 0.2 is already a concern).
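
A sketch of both statistics with statsmodels' `variance_inflation_factor`; the made-up data are deliberately collinear to trip the check:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(6)
x1 = rng.normal(size=100)
x2 = x1 + 0.1 * rng.normal(size=100)    # deliberately collinear with x1
X = sm.add_constant(np.column_stack([x1, x2]))

for i in (1, 2):                        # column 0 is the constant
    vif = variance_inflation_factor(X, i)
    print(f"predictor {i}: VIF = {vif:.1f}, tolerance = {1/vif:.3f}")
```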

21
Q

Why is multicollinearity an issue?

A

A good predictor might be rejected.

It makes the estimates of the regression coefficients unstable (inflated standard errors).

22
Q

What are two possible solutions for multicollinearity?

A

Combine predictors.

Remove one of the variables.

23
Q

What is an alternative indication of multicollinearity (not including the VIF + tolerance statistics)?

A

A high R-Squared with non-significant beta coefficients.

24
Q

Why must all the assumptions in Multiple Regression be met?

A

Because violations of the assumptions could affect the fit and generalisability of the model.