Manipulations Flashcards

1
Q

Key Questions regarding Manipulation

A

How many factors (independent variables) should we manipulate in any single experiment?
How many levels or conditions should we include for each factor?
Should these vary continuously or discretely?
How do we design the experiment so that the variables aren’t confounded?

2
Q

Three-Way ANOVA

A

Estimating the interactions requires stable estimates of the lower-order terms
ANOVA corrects for multiple comparisons within each factor - it does not correct for the number of interactions tested
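A minimal sketch of this in Python (statsmodels, on simulated data - the variable names here are assumed for illustration, not from the source): the three-way factorial model is fit as a linear model and anova_lm gives one F-test per term. Note how many interaction terms the design already produces.

```python
# Sketch: three-way ANOVA as a linear model (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 240
df = pd.DataFrame({
    "a": rng.choice(["a1", "a2"], n),
    "b": rng.choice(["b1", "b2", "b3"], n),
    "c": rng.choice(["c1", "c2"], n),
})
df["y"] = rng.normal(size=n) + (df["a"] == "a2") * 0.5  # main effect of a only

# Full factorial: main effects, all two-way interactions, the three-way interaction
fit = smf.ols("y ~ C(a) * C(b) * C(c)", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))  # one F-test per term
```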

3
Q

Summary

A

Factorial designs - manipulate multiple independent variables (factors) in a single experiment
Typically measure all combinations of factor levels, allowing estimation of interactions between the different variables
Estimate all terms in a factorial design using a linear model - test the effect of each factor using an F-test across the coefficients within each factor (see the sketch below)
Higher-order interactions are often difficult to interpret - may be best avoided unless you have a clear a priori hypothesis
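A sketch of the F-test point (simulated data, names assumed): testing a factor amounts to comparing the full linear model against a reduced model with all of that factor's coefficients dropped.

```python
# Sketch: F-test for one factor via full vs. reduced model comparison
# (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({"a": rng.choice(["a1", "a2"], n),
                   "b": rng.choice(["b1", "b2", "b3"], n)})
df["y"] = rng.normal(size=n) + (df["b"] == "b3") * 0.8  # effect of b only

full = smf.ols("y ~ C(a) + C(b)", data=df).fit()
reduced = smf.ols("y ~ C(a)", data=df).fit()  # all coefficients for b dropped
f_stat, p_value, df_diff = full.compare_f_test(reduced)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}, df diff = {df_diff:.0f}")
```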

4
Q

Continuous Designs

A

More informative than discrete designs - tells us about the shape of the relationship between the condition (manipulation) and the outcome, not just whether an effect exists
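A sketch of why shape matters (simulated dose-response data, all names assumed): with a continuous manipulation we can fit and compare candidate shapes, which a two-level discrete design cannot distinguish.

```python
# Sketch: comparing linear vs. quadratic dose-response fits (simulated data).
import numpy as np

rng = np.random.default_rng(2)
dose = np.linspace(0, 10, 50)
response = 2.0 * dose - 0.15 * dose**2 + rng.normal(0, 1, dose.size)

for degree, name in [(1, "linear"), (2, "quadratic")]:
    coeffs = np.polyfit(dose, response, deg=degree)
    rss = np.sum((response - np.polyval(coeffs, dose)) ** 2)
    print(name, "residual SS:", round(rss, 1))  # quadratic fits much better
```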

5
Q

Summary

A

Factorial designs using continuous variables can teach us more about the shape of the relationship between the manipulation and the outcome variable
A ‘dose-response’ relationship can be helpful to describe the relationship, and also to make predictions about conditions that were not measured directly
Estimating continuous relationships requires more data - factorial designs can be more statistically powerful for simply detecting the presence of an effect
Continuous and categorical variables can be straightforwardly combined in a linear model - an interaction between variables means that the slope of the continuous variable is different for different categories
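A sketch of the last point (simulated data - 'dose' and 'group' are assumed names): the interaction term in the linear model gives each category its own slope for the continuous variable.

```python
# Sketch: continuous ('dose') and categorical ('group') variables in one
# linear model; the interaction term gives each group its own slope.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 100
df = pd.DataFrame({"dose": rng.uniform(0, 10, n),
                   "group": rng.choice(["control", "treated"], n)})
true_slope = np.where(df["group"] == "treated", 1.5, 0.5)
df["y"] = true_slope * df["dose"] + rng.normal(0, 1, n)

fit = smf.ols("y ~ dose * C(group)", data=df).fit()
print(fit.params)  # the dose:C(group)[T.treated] term is the slope difference
```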

6
Q

Correlated Variables in Linear Models

A

‘Multicollinearity’ - explanatory variables are correlated with each other
Data multicollinearity - two independent variables are correlated
Structural multicollinearity - a model term is constructed from other terms in the same model (e.g. x and x²)
Reduce collinearity as much as possible
Variables with no collinearity are orthogonal to each other
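One common way to quantify this is the variance inflation factor; a sketch on simulated predictors (names assumed), where orthogonal variables have a VIF near 1 and collinear ones a very large VIF:

```python
# Sketch: quantifying multicollinearity with variance inflation factors
# (simulated predictors).
import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(4)
n = 200
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)  # strongly correlated with x1
x3 = rng.normal(size=n)                   # roughly orthogonal to both

X = np.column_stack([x1, x2, x3])
for i in range(X.shape[1]):
    print(f"VIF x{i + 1}: {variance_inflation_factor(X, i):.1f}")
```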

7
Q

Disadvantages - Correlated Variables in Linear Models

A

Coefficient estimates can become very sensitive to the inclusion of other variables in the model
Worsens as the collinearity between variables increases
Precision of estimates typically decreases, increasing the standard errors (demonstrated in the sketch below)
Variables that are closely correlated are difficult to tell apart
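A sketch of both problems at once (simulated data; x1 and x2 are assumed names, and the true effect is on x1 only): adding a nearly collinear regressor shifts the x1 coefficient and inflates its standard error.

```python
# Sketch: coefficient instability and standard-error inflation under
# near-collinearity (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)  # nearly collinear with x1
y = 1.0 * x1 + rng.normal(size=n)

alone = sm.OLS(y, sm.add_constant(x1)).fit()
both = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print("x1 alone:   b =", round(alone.params[1], 2), " se =", round(alone.bse[1], 2))
print("x1 with x2: b =", round(both.params[1], 2), " se =", round(both.bse[1], 2))
```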

8
Q

Advantages - Correlated Variables in Linear Models

A

Allows assessment of the effect of one regressor while controlling for the other variables in the same model
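A sketch of this (simulated data; 'confound' is an assumed name, and x has no true effect): including the correlated confound in the model removes the spurious effect that appears when x is modelled alone.

```python
# Sketch: estimating the effect of x while controlling for a correlated
# confound (simulated data; x has no true effect on y).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 500
confound = rng.normal(size=n)
x = 0.7 * confound + rng.normal(size=n)  # x correlated with the confound
y = 1.0 * confound + rng.normal(size=n)  # only the confound drives y

naive = sm.OLS(y, sm.add_constant(x)).fit()
controlled = sm.OLS(y, sm.add_constant(np.column_stack([x, confound]))).fit()
print("x alone (biased):     ", round(naive.params[1], 2))
print("x controlled (near 0):", round(controlled.params[1], 2))
```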

9
Q

Conclusion

A

Multicollinearity in linear models is typically a problem - parameter estimates can depend on which other terms are included in the model
‘Structural multicollinearity’ - occurs when one term in a linear model is a transformation of another variable in the model
Possible to transform variables - such as by zero-centering them - to remove collinearity and obtain explanatory variables that are orthogonal to one another
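A sketch of the zero-centering trick (simulated data): when x takes only positive values, x and x² are almost perfectly correlated; centering x before squaring makes the linear and quadratic terms roughly orthogonal.

```python
# Sketch: zero-centering removes the structural collinearity between
# x and its square (simulated data).
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(10, 20, 500)  # all positive, so x and x**2 move together
print("corr(x, x^2), raw:      ", round(np.corrcoef(x, x**2)[0, 1], 3))

xc = x - x.mean()             # zero-center before squaring
print("corr(xc, xc^2), centered:", round(np.corrcoef(xc, xc**2)[0, 1], 3))
```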
