Week 9 Flashcards

1
Q

What are the assumptions of regression models?

A
  • use the correct level of measurement for the technique (for linear regression, we want interval-level data, with equal distances between points on the scale)
  • independence of data (and error terms)
  • adequate sample size and normality (of variables and residuals)
  • linearity
  • homoscedasticity (of residuals)
2
Q

In regression models, the ____ the sample size the better.

A

Larger

3
Q

You can check for non-linearity using a:

A

scatterplot. Select statistics for correlation values.
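Beyond eyeballing a scatterplot, one rough numeric check is to see whether adding a curved term improves the fit. A minimal numpy sketch (the data and the 0.01 threshold are made up for illustration; this is not the course software's procedure):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2 + 0.5 * x**2 + rng.normal(0, 1, 200)   # deliberately curved data

def r_squared(X, y):
    """R-squared of an OLS fit of y on the columns of X (plus intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

r2_linear = r_squared(x, y)
r2_quad = r_squared(np.column_stack([x, x**2]), y)
# A large jump in R-squared when x**2 is added suggests the relationship is curved.
print(r2_linear, r2_quad)
```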

4
Q

What is another way of describing residuals?

A

Pretty much a measure of how crappy your data is for the model: each residual is the gap between an observed score and the score the model predicted.

5
Q

What does checking the pattern of model mis-fit check in linear regression?

A

Scedasticity (whether the error variance is constant across the range of the predictor).

6
Q

Under assumption checks, what do you tick if you want to check whether your model is good at all values in a linear regression?

A

Tick residual plots. If the residuals are equally dispersed and show no real pattern, the model is good at predicting scores across the full spectrum of scores of the predictor variable.

7
Q

What is homoscedasticity?

A

It means that the error variance should be the same at each level of the predictor variable.
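To make the idea concrete, here is a hand-rolled numpy sketch in the spirit of the Breusch-Pagan test (an assumed example, not necessarily the test the course software runs): regress the squared residuals on the predictor and compare the LM statistic n·R² against the χ²(1) critical value of 3.84 at α = .05.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 500)
y = 1 + 2 * x + rng.normal(0, 0.5 + 0.4 * x, 500)  # error SD grows with x

# Fit y on x and get the residuals
X = np.column_stack([np.ones(len(x)), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Breusch-Pagan idea: regress squared residuals on the predictor;
# LM statistic = n * R-squared of that auxiliary regression, ~ chi2(1) under H0.
u = resid**2
gamma, *_ = np.linalg.lstsq(X, u, rcond=None)
fit_u = X @ gamma
r2_aux = 1 - ((u - fit_u) @ (u - fit_u)) / ((u - u.mean()) @ (u - u.mean()))
lm = len(x) * r2_aux
print(lm > 3.84)  # True means significant -> homoscedasticity violated
```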

8
Q

Rather than eyeballing plots, how else can we test for heteroscedasticity?

A

With a statistical test under assumption checks. If we have a significant value, this is BAD: we have violated the assumption of homoscedasticity.

9
Q

In what type of design can even a small number of outliers have a huge effect?

A

Linear regression models

10
Q

Can we ever remove unusual scores (outliers)?

A

It depends; people have strong opinions about this.

11
Q

How can we test whether we have any unusual outliers? (2 ways)

A
  1. Using Cook's distance (you can save this to the data set)
  2. Manually flagging any residual value more than 3 SDs from the mean
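Both checks above can be sketched by hand in numpy (an illustrative toy example with one planted outlier, not the course software's output): Cook's distance combines a case's standardized residual with its leverage, and standardized residuals beyond ±3 flag unusual scores.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0, 1, 100)
y = 3 + 2 * x + rng.normal(0, 1, 100)
x[0], y[0] = 4.0, -10.0      # plant one influential outlier

X = np.column_stack([np.ones(len(x)), x])
H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat (leverage) matrix
h = np.diag(H)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
p = X.shape[1]
mse = resid @ resid / (len(y) - p)

# Standardized residuals: flag anything more than ~3 SDs from 0
std_resid = resid / np.sqrt(mse * (1 - h))
# Cook's distance: how much each case influences the fitted coefficients
cooks = std_resid**2 * h / (p * (1 - h))

print(np.argmax(cooks))   # index of the most influential case
```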

12
Q

Simultaneous multiple regression examines the ____ ability of a set of predictor variables in accounting for variability in a response variable

A

combined

13
Q

In a hierarchical multiple regression, what is in model 1?

A

Everything that was in block one.

14
Q

In hierarchical multiple regression, what is included in model 2?

A

Everything that was in block one, plus that which was in block two.

15
Q

What tells you the percentage of the variation explained by the models in hierarchical multiple regression?

A

Adjusted R squared.

16
Q

In model comparisons, what demonstrates the percentage of variation ADDED to the model by block 2?

A

ΔR² (delta R squared).
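A small numpy sketch of a two-block hierarchical comparison (the variable names and data are entirely hypothetical): fit model 1 with block 1 only, then model 2 with blocks 1 + 2, and ΔR² is simply the difference in R².

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
age = rng.normal(40, 10, n)      # block 1 predictor (hypothetical)
stress = rng.normal(0, 1, n)     # block 2 predictor (hypothetical)
wellbeing = 50 - 0.2 * age - 3 * stress + rng.normal(0, 2, n)

def fit_r2(X, y):
    """Return (R-squared, adjusted R-squared) of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    p = X.shape[1] - 1
    adj = 1 - (1 - r2) * (len(y) - 1) / (len(y) - p - 1)  # penalise extra predictors
    return r2, adj

r2_m1, adj_m1 = fit_r2(age, wellbeing)                             # model 1 = block 1
r2_m2, adj_m2 = fit_r2(np.column_stack([age, stress]), wellbeing)  # model 2 = blocks 1+2
delta_r2 = r2_m2 - r2_m1   # extra variance explained by block 2
print(round(delta_r2, 3))
```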

17
Q

Hierarchical multiple regression is ____ driven:

A

hypothesis

18
Q

When we are trying to construct the optimal regression model, what are the 3 ways we could do this?

A
  1. forced entry
  2. forward stepwise entry
  3. backward stepwise removal
19
Q

What is forced entry in a multiple regression?

A

Putting all predictors into the model at once, just like simultaneous multiple regression

20
Q

What is forward stepwise entry?

A

putting the best predictors into the model first, then only entering more predictors if they improve the quality of the predictive model (i.e., significantly increasing the R squared)
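The loop described above can be sketched in numpy (a toy illustration with made-up data and an arbitrary 0.02 R²-gain cutoff in place of a proper significance test): at each step, try every remaining predictor and keep the one giving the biggest R² gain, stopping when no candidate improves the model enough.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
X_all = rng.normal(0, 1, (n, 4))
# Only the first two predictors actually matter
y = 2 * X_all[:, 0] + 1 * X_all[:, 1] + rng.normal(0, 1, n)

def r2(cols, y):
    """R-squared of an OLS fit of y on the given predictor columns."""
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

selected, remaining = [], list(range(4))
current_r2 = 0.0
while remaining:
    # try each remaining predictor; keep the one with the biggest R-squared gain
    gains = {j: r2([X_all[:, k] for k in selected + [j]], y) for j in remaining}
    best = max(gains, key=gains.get)
    if gains[best] - current_r2 < 0.02:   # stop when the improvement is negligible
        break
    selected.append(best)
    remaining.remove(best)
    current_r2 = gains[best]

print(selected)   # predictors enter best-first
```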

21
Q

What is backward stepwise removal?

A

starting with all the predictors in the model, then start throwing out the worst predictors until this has a negative impact on the quality of the predictive model (i.e., significantly reducing the R squared).

22
Q

Why are some people against stepwise methods?

A

Because they’re data driven, we can end up capitalising on chance in the results, even when the resulting model makes no sense theoretically.

23
Q

After we put everything in the model at the same time in a stepwise multiple regression, what is the next step?

A

Order the predictors by the size of their t-values (positive or negative) to see how big each effect is. This will help you know how to order the stepwise analysis.

24
Q

What does the t-test tell you in model coefficients (stepwise multiple linear regression)?

A

How certain you can be that the coefficient is different from 0. Enter them all in different blocks one by one (1 IV in each block).

25
Q

What is the difference between hierarchical multiple regression and stepwise regression?

A

Hierarchical is theory driven whilst stepwise is data driven