L22 Multiple regression (chapter 8 part 2) Flashcards
What is multiple regression?
It’s a linear approach for modelling the relationship between a scalar dependent variable y and two or more predictor variables denoted x
- outcome = model prediction + error
What is the formula for multiple regression?
Picture 1
Uses linear predictor functions whose unknown model parameters (the bs) are estimated from the data
- the same as for simple regression; we just add a second (third…) predictor and a second (third…) beta coefficient (the general form is reconstructed below)
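Picture 1 isn’t reproduced here, but the general form it presumably shows is the standard multiple regression equation:

$$Y_i = b_0 + b_1X_{1i} + b_2X_{2i} + \dots + b_nX_{ni} + \varepsilon_i$$

where b0 is the intercept, each further b is the coefficient of one predictor, and ε is the error term.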
What is the important thing to remember when we are working with multiple predictors?
When we are working with main effects, we always have to assess interaction effects, because if an interaction effect is present we cannot simply interpret the main effects by themselves
What is one way to assess how our model did?
Compare what the model predicts with the observed outcomes and make a scatterplot of the two, which shows their correlation; picture 2
- if we square this correlation, we get the proportion of explained variance (see the sketch below)
- but we don’t know whether the R^2 is large enough - we have to test for significance to reject the null (comes later)
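A minimal sketch of this check (the data are made up; the course itself uses JASP, so this Python snippet is only an illustration):

```python
import numpy as np

# hypothetical observed outcomes and model predictions
observed = np.array([330.0, 120.0, 540.0, 210.0, 405.0])
predicted = np.array([310.0, 150.0, 500.0, 250.0, 380.0])

# correlation between model predictions and observed outcomes
r = np.corrcoef(predicted, observed)[0, 1]

# squaring the correlation gives the proportion of explained variance
print(f"r = {r:.3f}, R^2 = {r ** 2:.3f}")
```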
Assumptions for multiple regression
All the ones for simple regression (sensitivity, homoscedasticity, linearity) + no excessive multicollinearity
What is multicollinearity?
When our predictor variables are associated
- causes problems in the interpretation of our results, because we can no longer tell which predictor accounts for the explained variance
- a little multicollinearity is unavoidable and poses little threat to the model estimates, but as collinearity increases, problems start to arise
How do we assess multicollinearity?
- Correlations between predictors
- Matrix scatterplot
- Collinearity diagnostics
↪ VIF (is there a strong linear relationship between the predictors?): max < 10, mean not substantially greater than 1
↪ Tolerance (the reciprocal of VIF, 1/VIF): > 0.2 = good (a sketch of computing both follows below)
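A hedged sketch of these diagnostics in Python with statsmodels (predictor names and data are hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# hypothetical predictor data for 200 cases
rng = np.random.default_rng(1)
X = pd.DataFrame({
    "adverts": rng.normal(500, 100, 200),
    "airplay": rng.normal(30, 10, 200),
})

exog = sm.add_constant(X)  # VIF is computed with the intercept included
for i, col in enumerate(exog.columns):
    if col == "const":
        continue
    vif = variance_inflation_factor(exog.values, i)
    # rules of thumb: VIF max < 10, tolerance (1/VIF) > 0.2
    print(f"{col}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")
```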
How do we deal with multicollinearity?
By modelling this dependence between our predictors, using mediation analysis
Not talked about during this lecture
Let’s say we have 2 predictors (two independent variables) - what does the formula look like and how many b values do we have?
Picture 3
We get 3 b values - b0, b1 and b2 - estimated using sums of squares
We can convert those to t statistics to assess their significance (see the sketch below)
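A minimal sketch of estimating the three b values and their t statistics (simulated data with hypothetical names; JASP reports the same quantities in its coefficients table):

```python
import numpy as np
import statsmodels.api as sm

# simulate two predictors and an outcome for 200 cases
rng = np.random.default_rng(7)
adverts = rng.normal(500, 100, 200)
airplay = rng.normal(30, 10, 200)
sales = 40 + 0.1 * adverts + 3.5 * airplay + rng.normal(0, 40, 200)

# least-squares fit with an intercept
X = sm.add_constant(np.column_stack([adverts, airplay]))
fit = sm.OLS(sales, X).fit()

print(fit.params)   # the three b values: b0, b1, b2
print(fit.tvalues)  # t statistic for each b
print(fit.pvalues)  # their significance
```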
How do we apply the regression model using the formula from picture 3?
- for each observation (each album) we can plug in its values on the predictors (airplay, adverts) and calculate the expected outcome (model prediction) using the estimated bs (a worked example follows below)
- picture 4
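For instance, with hypothetical estimates b0 = 41, b1 = 0.09 (adverts) and b2 = 3.6 (airplay), an album with adverts = 500 and airplay = 30 would get the model prediction 41 + 0.09 × 500 + 3.6 × 30 = 41 + 45 + 108 = 194.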
How can we visualise multiple predictors?
We need to plot in as many dimensions as we have variables (predictors plus the DV)
- When we plot 1 predictor + DV, we plot in 2 dimensions, and we summarize the relationship by a line
- When we plot 2 predictors + DV, we plot in 3 dimensions, and we summarize the relationship by a plane
- But visualisation becomes more and more tricky as we add predictors - that’s why we want to keep the number of predictors to a minimum and only use those that are backed up by theory and previous research (see the plotting sketch below)
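A minimal matplotlib sketch of the 2-predictor case, drawing the regression plane with hypothetical coefficients:

```python
import numpy as np
import matplotlib.pyplot as plt

# hypothetical intercept and two slopes
b0, b1, b2 = 41, 0.09, 3.6

# grid over the two predictors
x1 = np.linspace(0, 1000, 30)  # e.g. adverts
x2 = np.linspace(0, 60, 30)    # e.g. airplay
X1, X2 = np.meshgrid(x1, x2)
Y = b0 + b1 * X1 + b2 * X2     # the regression plane

ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(X1, X2, Y, alpha=0.5)
ax.set_xlabel("predictor 1")
ax.set_ylabel("predictor 2")
ax.set_zlabel("predicted outcome")
plt.show()
```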
How do we interpret the plane visualising 2 predictors + DV?
The stronger the association between one predictor and the dependent variable, the steeper the slope of the plane along that predictor’s dimension; the same holds for the other predictor along its dimension
What is hierarchical regression?
Method of entering predictors into the model based on their importance in predicting the outcome (the procedure is explained in more depth in the JASP file)
- we select predictors based on prior research and theory, and we decide in which order to enter them into the model
- generally, known predictors go first, in order of their importance in predicting the outcome (a code sketch of this blockwise entry follows below)
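A hedged sketch of this blockwise entry (simulated data and hypothetical names; in JASP the same is done through the model builder):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
adverts = rng.normal(500, 100, 200)  # known predictor: entered first
airplay = rng.normal(30, 10, 200)    # newer predictor: entered second
sales = 40 + 0.1 * adverts + 3.5 * airplay + rng.normal(0, 40, 200)

# block 1: known predictor only
m1 = sm.OLS(sales, sm.add_constant(adverts)).fit()
# block 2: add the new predictor
m2 = sm.OLS(sales, sm.add_constant(np.column_stack([adverts, airplay]))).fit()

print(f"R^2 change: {m2.rsquared - m1.rsquared:.3f}")
print(m2.compare_f_test(m1))  # (F change, p value, df difference)
```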
How can we compare which model has a better model fit?
In JASP, we just look at the R^2 change, which has an F statistic attached to it; a significant F change means the model with the added predictors fits significantly better
- There is also a formula for F change (picture 5), where k is the number of predictors and N the sample size - see the reconstruction below
- But it’s better to use AIC and BIC
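Picture 5 isn’t reproduced here, but the standard F-change formula for comparing nested models is

$$F_{\text{change}} = \frac{(R^2_2 - R^2_1)\,/\,(k_2 - k_1)}{(1 - R^2_2)\,/\,(N - k_2 - 1)}$$

where model 2 is the model with more predictors, k is the number of predictors in each model, and N is the sample size.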
What are BIC and AIC?
BIC - Bayesian Information Criterion
AIC - Akaike information criterion
They are measures of model fit that penalize the model for having more variables
- the smaller the AIC and BIC values, the better the model fit (standard definitions below)
- the numbers on their own don’t mean anything, there is no cut-off where we would say this is the perfect model; they only make sense as comparisons between two or more models
More about this in the JASP file
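For reference, the standard definitions are

$$\text{AIC} = 2k - 2\ln(\hat{L}) \qquad \text{BIC} = k\ln(N) - 2\ln(\hat{L})$$

where k is the number of estimated parameters, N the sample size and L̂ the maximized likelihood; because ln(N) > 2 once N ≥ 8, BIC penalizes extra predictors more heavily than AIC.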