MR Chapter 7: Points to Take Home (Flashcards)
How do we handle categorical variables in MR?
Dummy coding (the focus of this course): the variable is converted into g-1 dummy variables that capture the information contained in all g categories.
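A minimal sketch of dummy coding in Python with pandas (the variable name and data below are made up for illustration): with g = 3 categories, drop_first=True produces the g-1 dummy variables, and the dropped category becomes the reference group.

```python
import pandas as pd

# Hypothetical categorical IV with g = 3 categories
df = pd.DataFrame({"group": ["control", "treatA", "treatB",
                             "treatA", "control", "treatB"]})

# drop_first=True yields g - 1 = 2 dummy variables;
# the dropped category ("control") is the reference group
dummies = pd.get_dummies(df["group"], prefix="group", drop_first=True)
print(dummies)
```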
What is the relationship between ANOVA and MR?
With a continuous DV and a categorical IV, the results will be the same whether the data are analyzed by ANOVA or MR:
- R² (from MR) = η² (from ANOVA)
- the F-values are the same
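A quick illustration of this equivalence (a sketch using simulated data; the variable names are illustrative). In statsmodels, C() dummy-codes the grouping variable, so the regression is the MR version of the one-way ANOVA:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# Simulated data (illustrative only): continuous DV y, categorical IV with 3 groups
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "group": np.repeat(["a", "b", "c"], 30),
    "y": np.concatenate([rng.normal(10, 2, 30),
                         rng.normal(12, 2, 30),
                         rng.normal(11, 2, 30)]),
})

# MR with the dummy-coded grouping variable (C() does the dummy coding)
mr = smf.ols("y ~ C(group)", data=df).fit()

# One-way ANOVA on the same data
f_anova, p_anova = stats.f_oneway(*[g["y"].values for _, g in df.groupby("group")])

# Eta-squared = SS_between / SS_total from the ANOVA decomposition
grand = df["y"].mean()
ss_total = ((df["y"] - grand) ** 2).sum()
ss_between = df.groupby("group")["y"].apply(lambda g: len(g) * (g.mean() - grand) ** 2).sum()

print(mr.fvalue, f_anova)                  # the F-values are identical
print(mr.rsquared, ss_between / ss_total)  # R² (MR) = η² (ANOVA)
```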
What is the relationship between regression and post hoc tests?
Regression estimates can be used for post hoc comparisons, where
- intercept = mean on the DV of the control/reference group, and
- slope = deviation of the group coded 1 (the non-reference group) from that mean
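As a brief worked illustration for the two-group case with a single dummy variable D (0 = reference group, 1 = other group):
- Ŷ = a + bD
- for the reference group (D = 0): Ŷ = a = mean of the reference group
- for the group coded 1 (D = 1): Ŷ = a + b = mean of that group
- hence b = the difference between the two group means, and the t-test of b is the pairwise comparison of those means.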
Steps to test for an interaction
- Center (or standardize) the IV of interest; this is needed to reduce the so-called problem of multicollinearity between the IV and its cross-product term
- Multiply the centered (or standardized) IV by the dummy variable(s) to create cross-products
- Regress the DV on the IVs, using the centered (or standardized) IVs (these are the ones used to test for interactions)
- Add the cross-products sequentially
- Check the statistical significance of ΔR² (see the sketch after this list)
- A statistically significant ΔR² indicates the presence of a statistical interaction; the same information is given by a significant b-value for the interaction term
- If significant, graph and interpret, and do follow-up analyses if needed
- If not significant, interpret the findings from the model without the cross-product
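A minimal sketch of these steps in Python with statsmodels (the data are simulated and the names x, d, and y are made up for illustration; d is a 0/1 dummy):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated data (illustrative only): continuous IV x, 0/1 dummy d, DV y
rng = np.random.default_rng(0)
n = 200
x = rng.normal(50, 10, n)
d = rng.integers(0, 2, n)
y = 2 + 0.3 * x + 1.5 * d + 0.2 * (x - x.mean()) * d + rng.normal(0, 3, n)
df = pd.DataFrame({"x": x, "d": d, "y": y})

# Step 1: mean-center the continuous IV
df["x_c"] = df["x"] - df["x"].mean()
# Step 2: cross-product of the centered IV and the dummy
df["x_c_by_d"] = df["x_c"] * df["d"]

# Step 3: regress the DV on the IVs, then add the cross-product sequentially
m1 = smf.ols("y ~ x_c + d", data=df).fit()
m2 = smf.ols("y ~ x_c + d + x_c_by_d", data=df).fit()

# Steps 4-5: Delta R² and the F-test comparing the nested models
print(m2.rsquared - m1.rsquared)
print(anova_lm(m1, m2))
```

The F-test from anova_lm comparing the two nested models is the significance test of ΔR²; with a single cross-product it is equivalent to the t-test of that term's b-value.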
Mean-corrected variables
Mean-corrected variables have a mean of 0 and a standard deviation equal to that of the uncorrected variable. Their maximum and minimum scores are obtained by subtracting the mean from the maximum and minimum of the uncorrected variable: e.g., with a minimum BMI of 18.52 and a mean of 23.58, the minimum for mean-corrected BMI is 18.52 - 23.58 = -5.06.
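A quick numerical check of these properties (the BMI values below are made up for illustration):

```python
import numpy as np

bmi = np.array([18.52, 21.40, 23.05, 24.90, 26.10, 27.51])  # hypothetical BMI scores
bmi_mc = bmi - bmi.mean()                                    # mean-corrected (centered)

print(round(bmi_mc.mean(), 10))               # 0: centered variable has mean zero
print(bmi.std(ddof=1), bmi_mc.std(ddof=1))    # standard deviation is unchanged
print(bmi.min() - bmi.mean(), bmi_mc.min())   # minimum = original minimum - mean
```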
One of the assumptions of multiple linear regression analysis is that the DV is a linear function of the IVs. This means that:
non-linear relationships between the DV and the IVs can be modelled by (i) transforming the IVs appropriately, and (ii) including the relevant polynomial term in the regression model. E.g., including the square of an IV to model a quadratic relationship:
Y = a + b1X1 + b2X1² + e.
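A sketch of fitting such a quadratic term with statsmodels (simulated data; the variable names are illustrative); the I() wrapper squares X1 inside the formula:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated curvilinear relationship (illustrative only)
rng = np.random.default_rng(2)
x1 = rng.uniform(-3, 3, 150)
y = 1 + 0.5 * x1 + 0.8 * x1**2 + rng.normal(0, 1, 150)
df = pd.DataFrame({"x1": x1, "y": y})

# Y = a + b1*X1 + b2*X1² + e
quad = smf.ols("y ~ x1 + I(x1 ** 2)", data=df).fit()
print(quad.params)
```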
If the R² for the interaction model in a sequential regression is .202, what does this mean?
It means that the interaction term together with the two individual predictors accounts for 20.2% of the variance in the DV; it is not the interaction term by itself that does so.
What is a mediator?
A mediator is a variable through which an IV influences the DV. E.g., a researcher argues that the effects of delay on response rate occur because the longer the delay, the greater the likelihood of interfering events occurring. Thus, the effect of delay on response rate might be mediated by interfering events.
How do you run an interaction regression with continuous and categorical variables?
Conducting a sequential regression with MATH as the DV is a good strategy for this type of research question.
Given the research design and research questions, Model 1 should include the variables being controlled for (e.g., SEX, AGE, and GRADE), and the IVs of particular interest (here HOMEWORK and CONSC) should then be entered as additional variables in Model 2. This allows easy assessment of the joint predictive value of these variables over and above the variance explained by the variables already in the model. Importantly, given the researcher's suspicion, the interaction term between HOMEWORK and CONSC should be included as a final block in the overall regression model.
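A sketch of this three-block structure in Python with statsmodels (the data frame below is a simulated stand-in for the real data set; centering the two IVs of interest before forming the cross-product is assumed):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated stand-in data (illustrative only)
rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "SEX": rng.integers(0, 2, n),
    "AGE": rng.normal(14, 1, n),
    "GRADE": rng.integers(8, 11, n),
    "HOMEWORK": rng.normal(5, 2, n),
    "CONSC": rng.normal(0, 1, n),
})
df["MATH"] = (50 + 2 * df["HOMEWORK"] + 3 * df["CONSC"]
              + 1.5 * df["HOMEWORK"] * df["CONSC"] + rng.normal(0, 5, n))

# Center the IVs of interest, then form the cross-product
df["HOMEWORK_c"] = df["HOMEWORK"] - df["HOMEWORK"].mean()
df["CONSC_c"] = df["CONSC"] - df["CONSC"].mean()
df["HWxCONSC"] = df["HOMEWORK_c"] * df["CONSC_c"]

# Block 1: controls; Block 2: IVs of interest; Block 3: interaction term
m1 = smf.ols("MATH ~ SEX + AGE + GRADE", data=df).fit()
m2 = smf.ols("MATH ~ SEX + AGE + GRADE + HOMEWORK_c + CONSC_c", data=df).fit()
m3 = smf.ols("MATH ~ SEX + AGE + GRADE + HOMEWORK_c + CONSC_c + HWxCONSC", data=df).fit()

# Delta R² for each block, plus the F-tests comparing the nested models
print(m2.rsquared - m1.rsquared, m3.rsquared - m2.rsquared)
print(anova_lm(m1, m2, m3))
```

anova_lm with the three nested models gives the F-test for each block's ΔR², matching the sequential-regression logic described above.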
Consider this regression model
Ŷ = a + b1X1 + b2X1²
This model reflects both the linear and quadratic trends in Y as X1 increases, where b1 indexes the linear relationship between Y and X1, and b2 indexes the quadratic relationship between Y and X1.
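One way to see this (a brief sketch): the slope of Ŷ with respect to X1 in this model is b1 + 2(b2)X1, so the slope changes as X1 increases, which is what the quadratic trend means; b1 is the slope at X1 = 0 (i.e., at the mean of X1 if X1 has been mean-corrected).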
Is it true that the zero-order correlation between the mean-corrected IVs will generally be lower than the correlation between the two original IVs?
No. Since both IVs undergo only a linear transformation when they are mean-corrected, the correlation between the two IVs will be the same whether they are mean-corrected or not.
Is it true that the zero-order correlation between the mean-corrected IVs and their interaction term will generally be lower than the correlations between the original IVs and their interaction term?
Yes. The correlations between the mean-corrected IVs and the interaction term are generally lower than those between the non-corrected IVs and the interaction term, thus reducing multicollinearity.
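A quick simulation illustrating both points (the data are made up; the IVs are given clearly nonzero means, which is where centering matters most):

```python
import numpy as np

rng = np.random.default_rng(4)
x1 = rng.normal(50, 10, 1000)              # two positive-valued, correlated IVs
x2 = 0.5 * x1 + rng.normal(25, 8, 1000)
x1c, x2c = x1 - x1.mean(), x2 - x2.mean()  # mean-corrected versions


def corr(a, b):
    return np.corrcoef(a, b)[0, 1]


# The correlation between the two IVs is unchanged by mean-correction
print(corr(x1, x2), corr(x1c, x2c))

# The correlation of an IV with the interaction term drops sharply after centering
print(corr(x1, x1 * x2), corr(x1c, x1c * x2c))
```

Because mean-correction is a linear transformation, the first pair of correlations is identical, while the correlation of each IV with its cross-product drops markedly after centering.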