Multiple Regression Flashcards

1
Q

What is collinearity?

A

Collinearity occurs when two or more predictors are highly similar: they correlate strongly with each other, so they may produce the same or very similar results.

2
Q

What is covariance?

A

Covariance is when change in one variable is associated with change in another: as one variable changes, the other tends to change with it.
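
As a concrete illustration (not part of the original cards), here is a tiny Python sketch that computes a covariance with numpy; the two variables and their values are invented.

```python
# Invented example data: hours studied and exam score for five students.
import numpy as np

hours_studied = np.array([2, 4, 6, 8, 10])
exam_score = np.array([55, 60, 68, 74, 82])

# np.cov returns a 2x2 covariance matrix; the off-diagonal entry is the
# covariance between the two variables. A positive value means they tend
# to increase together.
print(np.cov(hours_studied, exam_score)[0, 1])
```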

3
Q

Why is multiple regression based on correlational data?

A

Because there is no direct manipulation of any predictor variables

4
Q

What data does multiple regression use?

A

Scale data: ratio, interval, or ordinal data

5
Q

Why do we base building a multiple regression model off previous research?

A

We need a reason for including the predictors we are choosing; do we think they will have a direct influence on our criterion variable?

6
Q

What are the assumptions of multiple regression? (Hint: there are four and they revolve around the data)

A

1) No outliers
2) Normality of data
3) Linearity of data
4) Reliability of data

7
Q

How is normality of data assessed in MR?

A

By looking at skewness and kurtosis: values between -2 and +2 indicate an approximately normal distribution.
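
Outside SPSS, the same check can be sketched in Python; the simulated data and the use of scipy here are illustrative assumptions, not part of the cards.

```python
# Simulated predictor scores, used only to demonstrate the skew/kurtosis check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
predictor = rng.normal(loc=50, scale=10, size=200)

skewness = stats.skew(predictor)
excess_kurtosis = stats.kurtosis(predictor)  # Fisher definition: 0 for a normal distribution

print(f"skew = {skewness:.2f}, kurtosis = {excess_kurtosis:.2f}")
# Values between roughly -2 and +2 are usually treated as acceptably normal.
```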

8
Q

What does MR do if there is not a linear relationship between the criterion variable and the predictors?

A

It will underestimate the relationship, which increases the risk of a Type II error.

9
Q

Why does having lots of predictor variables potentially violate an assumption of MR?

A

Having lots of variables can make the relationship between criterion and predictors non-linear

10
Q

How do we measure reliability in MR?

A

By using Cronbach's alpha

11
Q

What is Cronbach's alpha? What does it measure?

A

It is a measure of internal consistency: how closely related a group of items are as a set. It is used as a measure of reliability.
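
A minimal Python sketch of the idea, computing Cronbach's alpha from its standard formula on simulated questionnaire items; the item names and data are invented for illustration.

```python
import numpy as np
import pandas as pd

# Five simulated questionnaire items that all tap the same underlying trait.
rng = np.random.default_rng(1)
true_score = rng.normal(size=100)
items = pd.DataFrame(
    {f"item{i}": true_score + rng.normal(scale=0.8, size=100) for i in range(1, 6)}
)

k = items.shape[1]                          # number of items
item_vars = items.var(axis=0, ddof=1)       # variance of each item
total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```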

12
Q

What is homoscedasticity? What is it called if you do not have it?

A

Where the variance is the same across all levels of the predictor variables.

Heteroscedasticity is what occurs if you do not have homoscedasticity.

13
Q

Why is heteroscedasticity bad for MR results? How do we test for heteroscedasticity?

A

It is bad because it can distort our results, which can lead to a Type I error.

We check for it by looking at the residuals.

14
Q

What will the correlation be if multicollinearity is at a moderate level?

A

Between 0.3 and 0.8

15
Q

How many variables (predictors) do you need to be able to do a multiple regression analysis?

A

Two or more independent variables

16
Q

How can you check for linearity between criterion and predictor variables in SPSS?

A

You can check this by looking at scatter plots.
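
If you are working outside SPSS, a quick scatter plot does the same job; this matplotlib sketch uses simulated data, which is an assumption rather than the course dataset.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(6)
predictor = rng.normal(size=120)
criterion = 1.5 * predictor + rng.normal(scale=0.8, size=120)

plt.scatter(predictor, criterion, s=12)
plt.xlabel("predictor")
plt.ylabel("criterion")
plt.title("A roughly straight cloud of points supports the linearity assumption")
plt.show()
```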

17
Q

What table in SPSS output helps us see how well a regression model fits the data?

A

The “model summary” table

18
Q

How do residuals help us determine linearity in multiple regression?

A

They let us plot the standardised residuals against the model's predictions and see whether they are randomly scattered or follow a pattern.

19
Q

How do you check for homoscedasticity in multiple regression?

A

By looking at the residuals: is their spread roughly the same across all predicted values?
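
As a sketch of the residual checks described in the last two cards, here is a standardised-residuals-versus-predicted-values plot in Python; statsmodels, matplotlib, and the simulated data are assumptions, not the course's own procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import matplotlib.pyplot as plt

# Simulated criterion and predictors, used only to produce a residual plot.
rng = np.random.default_rng(7)
n = 150
X = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
y = 1.2 * X["x1"] + 0.8 * X["x2"] + rng.normal(size=n)

fit = sm.OLS(y, sm.add_constant(X)).fit()
std_resid = fit.resid / fit.resid.std(ddof=1)   # simple standardisation of the residuals

plt.scatter(fit.fittedvalues, std_resid, s=12)
plt.axhline(0, linestyle="--")
plt.xlabel("predicted values")
plt.ylabel("standardised residuals")
plt.title("A random, even band around zero suggests linearity and homoscedasticity")
plt.show()
```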

20
Q

When you have run your MR and you are looking at the SPSS output tables, which table do you look at to assess multicollinearity? What value suggests multicollinearity?

A

You look at the correlations table. Correlations greater than 0.8 suggest collinearity
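
Outside SPSS, the equivalent check is a predictor correlation matrix; this pandas sketch uses deliberately collinear simulated data, and all variable names are invented.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
x1 = rng.normal(size=150)
x2 = x1 * 0.9 + rng.normal(scale=0.3, size=150)   # deliberately collinear with x1
x3 = rng.normal(size=150)
predictors = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

print(predictors.corr().round(2))
# Any off-diagonal correlation above 0.8 flags a possible multicollinearity problem.
```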

21
Q

What is tolerance in MR?

A

How much of the variance in a predictor is not explained by the other predictors in the model; it reflects that predictor's unique contribution.

22
Q

Why do we want a high tolerance value?

A

A high value suggests minimal multicollinearity

23
Q

What is VIF (variance inflation factor) in MR?

A

How much the variance of a regression coefficient is inflated by multicollinearity.
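
A hedged Python sketch of computing VIF, and tolerance as 1/VIF, with statsmodels; the simulated data and variable names are illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
x1 = rng.normal(size=200)
x2 = x1 * 0.7 + rng.normal(scale=0.5, size=200)   # partly overlaps with x1
x3 = rng.normal(size=200)
X = sm.add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(X.values, i)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")
```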

24
Q

How can VIF be interpreted to overcome issues of multicollinearity?

A

It can be used as the factor by which to increase the sample size to overcome multicollinearity.

A VIF of 2.5 would mean 2.5 times as many participants are needed

25
Q

How can you check for outliers?

A

By looking at the line of best fit: outliers are points that fall far from it

26
Q

What value indicates a high level of consistency with Cronbach's alpha?

A

0.8

27
Q

How do you find the critical value (the minimum sample size) for MR?

A

50 + 8x, where x is the number of predictors
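
A tiny sketch of the arithmetic, assuming the usual 50 + 8 × (number of predictors) rule of thumb; the value used for x here is made up.

```python
x = 3                        # number of predictors in the model (illustrative)
critical_value = 50 + 8 * x  # rule-of-thumb minimum sample size
print(critical_value)        # 74 -> the sample should be at least this large
```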

28
Q

When might you use R² to look at how much of the variability in the criterion variable is explained by our model?

A

If the sample size is greater than or equal to the critical value

29
Q

When might you use the adjusted R² value?

A

When looking to see how much of the variability in the criterion variable is explained by our model and the sample size is less than the critical value
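
A short Python sketch of where both values live on a fitted model; statsmodels and the simulated data are assumptions, not part of the cards.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 60
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
df["y"] = 2 * df["x1"] - 1.5 * df["x2"] + rng.normal(size=n)

model = sm.OLS(df["y"], sm.add_constant(df[["x1", "x2"]])).fit()
print(f"R-squared          = {model.rsquared:.3f}")
print(f"Adjusted R-squared = {model.rsquared_adj:.3f}")
```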

30
Q

What is a beta value in multiple regression?

A

How strongly each predictor variable affects and influences the criterion variable

31
Q

What is a beta value measured in? (What unit)

A

It is measured in standard deviations

32
Q

What happens to the criterion variable if the beta coefficient for a predictor is 1 and it is statistically significant?

A

For every one standard deviation that the predictor variable increases, while the other predictors are held constant, the criterion variable increases by one standard deviation
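
A hedged sketch of how standardised betas can be obtained in Python by z-scoring the criterion and predictors before fitting; the variable names and data are illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
df["y"] = 0.6 * df["x1"] + 0.3 * df["x2"] + rng.normal(scale=0.7, size=n)

z = (df - df.mean()) / df.std()                        # z-score every column
betas = sm.OLS(z["y"], z[["x1", "x2"]]).fit().params   # no constant needed: columns are centred
print(betas.round(2))
# A beta of 1 would mean a one-SD increase in that predictor (others held
# constant) predicts a one-SD increase in the criterion.
```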