ch9 Flashcards
Multivariate designs
involve more than two measured variables
two types of multivariate design
- Longitudinal designs
- Multiple-regression designs
Longitudinal designs
provide evidence for temporal precedence by measuring the same variables in the same people at several different points in time
types of correlations from longitudinal designs
- Cross-sectional correlations
- Autocorrelations
- Cross-lag correlations
Cross-sectional correlations
tell us whether two variables measured at the same time are correlated
Autocorrelations
the correlation of a variable with itself, measured at two or more different points in time
Cross-lag correlations
- show whether an earlier measure of one variable is associated with a later measure of the other variable
- help establish temporal precedence, addressing the directionality problem
how do you know which variable comes first from a cross-lag correlation?
If the correlation between earlier A and later B is statistically significant, but the correlation between earlier B and later A is not, we can conclude that A comes before B
- whichever cross-lag correlation is statistically significant indicates the direction of the relationship
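As a rough sketch, the two cross-lag correlations can be computed with pandas; the variable names (stress, sleep) and the tiny dataset are hypothetical, just to show the mechanics:

```python
import pandas as pd

# Hypothetical longitudinal data: two variables measured at two time points
df = pd.DataFrame({
    "stress_t1": [3, 5, 2, 4, 5, 1, 4, 2],
    "sleep_t1":  [7, 5, 8, 6, 4, 9, 6, 8],
    "stress_t2": [4, 6, 2, 5, 6, 2, 5, 3],
    "sleep_t2":  [6, 4, 8, 5, 4, 8, 5, 7],
})

# Cross-lag correlations: the earlier measure of one variable with the
# later measure of the other
r_ab = df["stress_t1"].corr(df["sleep_t2"])  # earlier stress -> later sleep
r_ba = df["sleep_t1"].corr(df["stress_t2"])  # earlier sleep -> later stress

print(f"stress(t1) with sleep(t2): r = {r_ab:.2f}")
print(f"sleep(t1) with stress(t2): r = {r_ba:.2f}")
```

In a real analysis you would also test each correlation for statistical significance before drawing a conclusion about direction.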
Three possible patterns for cross-lag correlations
- A comes before B
- B comes before A
- Mutually reinforcing: both correlations are statistically significant, indicating a cycle in which each variable reinforces the other continuously
do longitudinal studies address the third variable problem?
- When conducted simply, longitudinal studies measure only the two key variables and can't rule out third variables
- Researchers might design their studies in particular ways or run statistical analyses to address some third variables
Multiple regression (multivariate regression)
a statistical technique that helps rule out some third variables, thereby addressing internal validity concerns
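A minimal sketch of running a multiple regression in Python with statsmodels; the variables (hours_studied, hours_slept, exam_score) and the simulated data are made up for illustration. The fitted summary reports each predictor's coefficient, its p value, and the model's R squared, all discussed in the cards below:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: predict exam score (criterion) from hours studied
# and hours slept (predictors)
X = pd.DataFrame({"hours_studied": rng.normal(10, 3, n),
                  "hours_slept": rng.normal(7, 1, n)})
y = 50 + 2.0 * X["hours_studied"] + 1.5 * X["hours_slept"] + rng.normal(0, 5, n)

# Fit criterion ~ predictors; the summary shows coefficients, p values,
# and R squared
model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())
```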
Criterion variable
- the dependent variable
- the variable the researchers are most interested in understanding or predicting; the outcome of interest
- specified in either the top row or the title of a regression table
Predictor variables
- the independent variables
- the rest of the variables measured in a regression analysis; potential predictors (possible causes) of the criterion variable
- although they are considered independent variables, they aren't manipulated, so causation cannot be inferred
Beta
- similar to r in that it denotes the direction and strength of a relationship; represents the relationship between one predictor variable and the criterion variable
direction of beta
- A positive beta means a positive relationship when the other predictor variables are statistically controlled for, and vice versa
- A beta of zero (or one not statistically different from zero) means no relationship when the other predictors are statistically controlled for
can you compare betas on different regression tables?
Within a single regression table, we can usually compare the betas of different predictor variables, but we can't compare betas across different regression tables
- There are no absolute benchmarks for beta's effect size (unlike Cohen's guidelines for r)
how do you measure effect sizes of beta?
R squared is a measure of effect size; regression software often reports R squared, which tells you how much of the variance in the criterion variable the predictors account for
The coefficient b
the same as beta, except it hasn't been standardized to a common scale (roughly -1 to 1)
- because each b is expressed in its predictor's original units, b's can't be compared with one another, even within the same table
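A minimal sketch of the difference, using made-up variables (income in dollars, exercise in hours per week): fitting on raw scores yields unstandardized b's in each predictor's own units, while z-scoring every variable first yields standardized betas:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

# Hypothetical predictors on very different scales
df = pd.DataFrame({"income": rng.normal(50_000, 10_000, n),
                   "exercise": rng.normal(4, 2, n)})
df["health"] = (40 + 0.0002 * df["income"] + 2.0 * df["exercise"]
                + rng.normal(0, 5, n))

# Raw scores -> unstandardized b's (dollars vs hours, so not comparable)
b = sm.OLS(df["health"],
           sm.add_constant(df[["income", "exercise"]])).fit().params

# z-scored variables -> standardized betas (comparable within one table)
z = (df - df.mean()) / df.std()
beta = sm.OLS(z["health"],
              sm.add_constant(z[["income", "exercise"]])).fit().params

print("b:\n", b)
print("beta:\n", beta)
```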
Statistical significance of beta
Shows us whether the result is likely due to chance (sampling error), that is, whether the data could plausibly have come from a population in which the relationship is zero
how is statistical significance marked on a regression table? what values
Regression tables have a column labeled sig or p, or an asterisked footnote giving the p value for each beta
- p less than .05 = statistically significant
- p greater than .05 = not statistically significant
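As a quick self-contained sketch (hypothetical data again), statsmodels exposes a p value for each coefficient, mirroring the sig/p column of a regression table:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 150
X = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
y = 0.5 * X["x1"] + 0.02 * X["x2"] + rng.normal(size=n)  # x2 barely matters

model = sm.OLS(y, sm.add_constant(X)).fit()

# model.pvalues holds the p value for each coefficient
for name, p in model.pvalues.items():
    verdict = "statistically significant" if p < .05 else "not significant"
    print(f"{name}: p = {p:.3f} -> {verdict}")
```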
what does adding more predictors to a regression table do? (2)
- Can help control for several third variables at once (getting closer to supporting a causal claim)
- Lets us examine the betas for all the predictor variables to see which factors most strongly predict the criterion variable
when can beta exceed an absolute value of 1?
- Multicollinearity: when predictors are so strongly correlated that you can't separate the contribution of each
- Or when the variables aren't continuous (e.g., dichotomous yes/no variables)
Multicollinearity
where predictors are so strongly correlated that you can’t separate the contribution of each
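One common way to check for multicollinearity is the variance inflation factor (VIF) from statsmodels; a sketch with deliberately redundant, hypothetical predictors (the same height in inches and in centimeters):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 150

# Two nearly redundant predictors: the same quantity in different units
height_in = rng.normal(66, 3, n)
height_cm = height_in * 2.54 + rng.normal(0, 0.5, n)

X = sm.add_constant(pd.DataFrame({"height_in": height_in,
                                  "height_cm": height_cm}))

# A VIF well above roughly 5-10 flags multicollinearity: the predictors'
# contributions can't be cleanly separated
for i, col in enumerate(X.columns):
    if col != "const":
        print(f"{col}: VIF = {variance_inflation_factor(X.values, i):.1f}")
```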
R squared
a measure of effect size; regression software often reports R squared, which tells you how much of the variance in the criterion variable the predictors account for