L22 Multiple regression (chapter 8 part 2) Flashcards

1
Q

What is multiple regression?

A

It's a linear approach for modelling the relationship between a scalar dependent variable y and two or more predictor variables, denoted x
- outcome = model prediction + error

2
Q

What is the formula for multiple regression?

A

Picture 1
Uses linear predictor functions whose unknown model parameters (the b coefficients) are estimated from the data
- the same as for single regression, we just add a second (third…) predictor and a second (third…) beta coefficient
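Picture 1 isn't reproduced here, but the standard form of the multiple regression equation it describes is:

$$y_i = b_0 + b_1 x_{1i} + b_2 x_{2i} + \dots + b_n x_{ni} + \varepsilon_i$$

where $b_0$ is the intercept, each further $b$ is the coefficient of one predictor, and $\varepsilon_i$ is the error for observation $i$.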

3
Q

What is the important thing to remember when we are working with multiple predictors?

A

When we work with main effects, we always have to assess interaction effects: if an interaction effect is present, we cannot simply interpret the main effects by themselves (see the sketch below)
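A minimal sketch of this check in Python with statsmodels, using made-up data (all names and numbers are illustrative, not from the lecture):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# made-up data with a built-in interaction, for illustration only
rng = np.random.default_rng(1)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
df["y"] = 1 + 0.5 * df.x1 + 0.8 * df.x2 + 1.2 * df.x1 * df.x2 + rng.normal(size=200)

# 'x1 * x2' expands to x1 + x2 + x1:x2 (both main effects plus the interaction)
model = smf.ols("y ~ x1 * x2", data=df).fit()
print(model.summary())  # if x1:x2 is significant, do not interpret
                        # the x1 and x2 main effects in isolation
```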

4
Q

What is one way to assess how our model did?

A

Look at what the model predicts versus the observed outcomes and make a scatterplot of the two, which shows their correlation; picture 2
- if we square this correlation we will have the proportion of explained variance
- but we don’t know whether the r^2 is large enough - we have to test for significance to reject the null (comes later)
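A tiny sketch of that check with numpy (the numbers are made up):

```python
import numpy as np

# observed outcomes vs. model predictions for five made-up observations
observed  = np.array([12.0, 15.0, 9.0, 20.0, 14.0])
predicted = np.array([11.0, 16.0, 10.0, 18.0, 15.0])

r = np.corrcoef(observed, predicted)[0, 1]  # correlation predicted vs. observed
r_squared = r ** 2                          # proportion of explained variance
print(r, r_squared)
```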

5
Q

Assumptions for multiple regression

A

All the ones for single regression (sensitivity, homoscedasticity, linearity) + the absence of strong multicollinearity

6
Q

What is multicollinearity?

A

When our predictor variables are correlated with each other
- this causes problems in interpreting our results, because we cannot tell which predictor the explained variance belongs to
- a little multicollinearity is unavoidable and poses little threat to the model estimates, but as collinearity increases, problems start to arise

7
Q

How do we assess multicollinearity?

A
  1. Correlations between predictors
  2. Matrix scatterplot
  3. Collinearity diagnostics
    ↪ VIF (is there a strong linear relationship between the predictors?): largest VIF < 10, average VIF not substantially greater than 1
    ↪ Tolerance (the reciprocal of VIF, 1/VIF): > 0.2 = good (see the sketch below)
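A sketch of computing these diagnostics with statsmodels (data and column names are made up for illustration):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

# made-up predictor data; real predictors would come from your dataset
rng = np.random.default_rng(2)
X = pd.DataFrame({"adverts": rng.normal(size=100), "airplay": rng.normal(size=100)})

Xc = add_constant(X)  # include the intercept column when computing VIF
for i, name in enumerate(Xc.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(Xc.values, i)
    print(name, "VIF:", vif, "tolerance:", 1 / vif)  # want VIF < 10, tolerance > 0.2
```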
8
Q

How do we deal with multicollinearity?

A

By modelling this dependence between our predictors, using mediation analysis

Not talked about during this lecture

9
Q

Let's say we have 2 predictors (two independent variables) - what will the formula look like, and how many b values do we have?

A

Picture 3
We get 3 b values - b0, b1 and b2 - estimated using sums of squares
We can convert each of them to a t statistic to assess its significance (see the sketch below)
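Picture 3 isn't included here, but with two predictors the equation is presumably $y_i = b_0 + b_1 x_{1i} + b_2 x_{2i} + \varepsilon_i$. A sketch of estimating the three b values and their t statistics on made-up data:

```python
import numpy as np
import statsmodels.api as sm

# made-up data; the true coefficients are chosen purely for illustration
rng = np.random.default_rng(3)
x1, x2 = rng.normal(size=(2, 200))
y = 2.0 + 0.5 * x1 + 1.5 * x2 + rng.normal(size=200)

X = sm.add_constant(np.column_stack([x1, x2]))  # the constant column carries b0
fit = sm.OLS(y, X).fit()
print(fit.params)   # the 3 b values: b0, b1, b2
print(fit.tvalues)  # t statistic for each b
print(fit.pvalues)  # corresponding significance
```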

10
Q

How do we apply the regression model using the formula from picture 3?

A
  • for each observation (each album), we plug in its observed values on the predictors (airplay, adverts) and calculate the expected outcome (model prediction) using the estimated bs (worked example below)
  • picture 4
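A worked example of plugging one album into the equation; the coefficients here are invented for illustration (the real b values come from the JASP output in picture 4):

```python
# illustrative coefficients only - the real b values come from the fitted model
b0, b_adverts, b_airplay = 41.0, 0.09, 3.6

# one album's observed predictor values (also made up)
adverts, airplay = 500.0, 30.0

predicted_sales = b0 + b_adverts * adverts + b_airplay * airplay
print(predicted_sales)  # 41.0 + 0.09*500 + 3.6*30 = 194.0
```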
11
Q

How can we visualise multiple predictors?

A

We need one dimension per predictor, plus one for the dependent variable
- When we plot 1 predictor + DV, we plot in 2 dimensions, and we summarize the relationship with a line
- When we plot 2 predictors + DV, we plot in 3 dimensions, and we summarize the relationship with a plane
- But visualisation becomes trickier with every added predictor - that's why we want to keep the number of predictors to a minimum and only use those that are backed up by theory and previous research (see the plotting sketch below)
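A minimal matplotlib sketch of the 2-predictor case - the fitted model's predictions form a plane (coefficients made up):

```python
import numpy as np
import matplotlib.pyplot as plt

b0, b1, b2 = 10.0, 0.5, 2.0  # made-up fitted coefficients

# evaluate the model over a grid of the two predictors
x1, x2 = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
y_hat = b0 + b1 * x1 + b2 * x2  # the regression plane

ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(x1, x2, y_hat, alpha=0.5)
ax.set_xlabel("predictor 1")
ax.set_ylabel("predictor 2")
ax.set_zlabel("predicted DV")
plt.show()
```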

12
Q

How do we interpret the plane visualising 2 predictors + DV?

A

The stronger the association between one predictor and the dependent variable, the steeper the slope of the plane along that predictor's dimension; the same holds for the other predictor along the other dimension

13
Q

What is hierarchical regression?

This, including the procedure, is explained in more depth in the JASP file

A

Method of entering predictors into the model based on their importance in predicting the outcome
- we select predictors based on prior research and theory and we decide in which order to enter them into the model
- generally, known predictors go first, in order of their importance in predicting the outcome (see the sketch below)
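A sketch of the two-block idea in Python (column names and data are made up; in JASP you would add the predictors in blocks instead):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# made-up album data; 'adverts' plays the role of the known predictor
rng = np.random.default_rng(4)
df = pd.DataFrame({"adverts": rng.normal(size=150), "airplay": rng.normal(size=150)})
df["sales"] = 50 + 2.0 * df.adverts + 3.0 * df.airplay + rng.normal(size=150)

m1 = smf.ols("sales ~ adverts", data=df).fit()            # block 1: known predictor
m2 = smf.ols("sales ~ adverts + airplay", data=df).fit()  # block 2: add the new one

print(m2.rsquared - m1.rsquared)  # R^2 change
print(m2.compare_f_test(m1))      # (F change, p value, df difference)
```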

14
Q

How can we compare which model has a better model fit?

A

In JASP, we just look at the R^2 change; it has an F statistic attached to it, and a larger (significant) change represents better model fit
- There is also a formula for F change, picture 5 (reconstructed below), where k is the number of predictors and N the sample size
- But it's better to use AIC and BIC
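Picture 5 isn't reproduced here; the usual form of the F-change test (picture 5 may show an equivalent version), comparing a smaller model with $k_1$ predictors to a larger one with $k_2$ predictors on $N$ observations, is:

$$F_{change} = \frac{(R^2_2 - R^2_1)/(k_2 - k_1)}{(1 - R^2_2)/(N - k_2 - 1)}$$

with $k_2 - k_1$ and $N - k_2 - 1$ degrees of freedom.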

15
Q

What are BIC and AIC?

A

BIC - Bayesian Information Criterion
AIC - Akaike information criterion
They are measures of model fit that penalize the model for having more variables
- the smaller the numbers for AIC and BIC, the better the model fit
- the numbers on their own don’t mean anything, there is no cut-off where we would say this is the perfect model; they only make sense as comparisons between two or more models
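For reference, the standard definitions (not shown on the slides): with $\hat{L}$ the maximized likelihood of the model, $k$ the number of estimated parameters, and $N$ the sample size,

$$AIC = 2k - 2\ln\hat{L} \qquad BIC = k\ln N - 2\ln\hat{L}$$

The $\ln N$ in BIC means it penalizes extra parameters more heavily than AIC whenever $N > e^2 \approx 7.4$.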

More about this in JASP file

16
Q

How do we report results of linear models?

A

The best way to do it is through the summary table that JASP gives us, but there are a few things we can report in words:
- betas along with their standard errors and confidence intervals
- significance value and standardized beta
- R^2 or AIC/BIC

17
Q

JASP demonstration

A

NOW go to the beautiful JASP file I sent on WhatsApp and go through it. I think this is the most important thing from the whole lecture; he spent half of the lecture on it.
If you can't read something, let me know <3