Week 5 & 6 MANOVA (Slides) Flashcards
An overview of the week 5 & 6 lecture material on MANOVA
Matrices underpin MANOVA and other analyses. What exactly is a matrix?
A matrix is a grid of numbers arranged in rows and columns.
- NB: The convention for quoting is rows then columns, e.g., a 2 x 3 matrix has 2 rows & 3 columns.
- A simple Matrix may be one where rows represent a participant and columns represent measures.
How many different types of Matrices are there?
There are many, and differences depend on the type of Data and analysis undertaken.
- Data Matrix
- Correlation Matrix
- Sum of Squares & Cross-Products Matrix
- Variance-Covariance Matrix
- Residuals
A simple Matrix may be one where rows represent a participant and columns represent measures. How exactly does that work in practice?
A simple matrix shows values on a number of variables for each of several subjects & can be:
- Discrete variables (0/1) where numbers are codes for group membership: 0 = male, 1 = female, OR
- Continuous variables (e.g., Total Optimism) with values such as 15, 23, 25 etc. (i.e. assume underlying continuity)
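A concrete sketch of such a data matrix (the participant values here are made up for illustration):

```python
import numpy as np

# Hypothetical data matrix: 3 participants (rows) x 3 measures (columns).
# Column 0 is a discrete group code (0 = male, 1 = female);
# columns 1-2 are continuous scores (e.g., Total Optimism, Self Esteem).
data = np.array([
    [0, 15, 31],
    [1, 23, 28],
    [1, 25, 34],
])

print(data.shape)  # (3, 3): quoted rows first, then columns
```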
What is a Square & an Identity Matrix?
A square matrix has the same number of rows as columns, so it has a diagonal (running from top left to bottom right) and off-diagonal values (everything either side of the diagonal).
*An identity matrix is a square matrix with 1s on the diagonal and 0s off the diagonal (e.g., a correlation matrix in which each variable correlates only with itself, and the variables are uncorrelated with each other, is an identity matrix)
Tell me about a correlation matrix?
- A correlation matrix is unit free
- Unit free means it reflects the relationships between variables but carries no information about the relative size of the units of measurement of the measures (e.g., Mastery or Self Esteem)
- Based on Pearson’s r.
How do we derive correlations?
- Correlations are derived from the sum-of-squares (SS) [diagonal] and cross-products (sum of products: SP) [off-diagonal] matrix, known as the S matrix
- Each SS is divided by itself. Hence the 1.00s in the correlation matrix on the diagonal
- Each cross-product (SP) [off-diagonal] is divided by the square root of the product of the sum of squared deviations around the mean for each variable in the pair
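The steps above can be sketched in NumPy (the scores are hypothetical):

```python
import numpy as np

# Illustrative scores on two variables (made-up numbers).
x = np.array([15.0, 23.0, 25.0, 18.0, 29.0])
y = np.array([31.0, 28.0, 34.0, 30.0, 38.0])

dx, dy = x - x.mean(), y - y.mean()

ss_x = np.sum(dx ** 2)   # sum of squares (diagonal of S)
ss_y = np.sum(dy ** 2)
sp_xy = np.sum(dx * dy)  # cross-product (off-diagonal of S)

# Diagonal of the correlation matrix: each SS divided by itself -> 1.00
diag = ss_x / ss_x

# Off-diagonal: SP divided by the square root of the product of the
# two sums of squared deviations around the means.
r = sp_xy / np.sqrt(ss_x * ss_y)

print(diag, round(r, 4))
```

The result matches Pearson's r as computed by `np.corrcoef(x, y)`.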
What is an S Matrix?
An S Matrix is Sum of squares (SS) [Diagonal] and cross-products matrix (sum of products : SP)
*NB In an S matrix, the size of the entries depends on both the number of scores and the size of the measurement units
If the scores are measured on a meaningful scale, do we still calculate a correlation matrix?
NB: A meaningful scale is one with a unique, non-arbitrary zero value.
In this case we use a variance-covariance matrix (SIGMA) instead, which gives us information about:
1. The variance of each variable (on the diagonal, rather than 1s), where variance = the averaged squared deviations of the scores from their mean
2. The covariance between pairs of variables – how much they covary, expressed relative to the units of the scales being used.
NB this is Week 8 Moderation & Mediation
How do we calculate a variance-covariance matrix (AKA SIGMA) ?
We calculate a variance-covariance matrix (SIGMA) by dividing each element in the S matrix by n-1
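A minimal NumPy sketch of this calculation, using made-up scores, with NumPy's built-in covariance function as a cross-check:

```python
import numpy as np

# Hypothetical scores: 5 participants (rows) x 2 variables (columns).
scores = np.array([
    [15.0, 31.0],
    [23.0, 28.0],
    [25.0, 34.0],
    [18.0, 30.0],
    [29.0, 38.0],
])

n = scores.shape[0]
dev = scores - scores.mean(axis=0)

S = dev.T @ dev       # S matrix: SS on the diagonal, SP off-diagonal
sigma = S / (n - 1)   # variance-covariance matrix (SIGMA)

# rowvar=False tells numpy that columns, not rows, are the variables.
print(np.allclose(sigma, np.cov(scores, rowvar=False)))
```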
What important things do we need to remember about the variance-covariance matrix (SIGMA Matrices)?
- With a variance-covariance matrix (SIGMA) the size of the entries is influenced by the measurement scale: scores measured in large numbers tend to have large variances; scores measured in small numbers have small variances
- Deviations are averaged, so the number of scores does not have an impact
- Covariances are similar to correlations but they retain information about the scales used to measure the variables
What is an important thing to remember about SIGMA (Variance-Covariance), S & R (Pearson) Matrices?
They are all symmetric square matrices,
i.e. the values above and below the diagonal are mirror images of each other
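A quick NumPy check of this symmetry on a correlation matrix built from arbitrary simulated data:

```python
import numpy as np

# Hypothetical data: 30 participants x 3 measures (randomly generated).
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))

R = np.corrcoef(X, rowvar=False)  # 3 x 3 correlation matrix

# The entry for (variable i, variable j) equals (variable j, variable i),
# so the matrix equals its own transpose.
print(np.allclose(R, R.T))
```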
Remind me the principle of ANOVA….
When we use ANOVA we are partitioning the TOTAL VARIANCE in the data to find F.
*We separate the total variance into differences BETWEEN the groups and differences WITHIN the groups (Error Variance); F is the ratio of between-groups to within-groups variance.
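This partition can be sketched for a one-way ANOVA (hypothetical scores for three groups):

```python
import numpy as np

# Made-up scores for three groups of four participants each.
groups = [
    np.array([4.0, 5.0, 6.0, 5.0]),
    np.array([7.0, 8.0, 6.0, 7.0]),
    np.array([9.0, 8.0, 10.0, 9.0]),
]

all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()

# Partition total variance: BETWEEN groups vs WITHIN groups (error).
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(np.sum((g - g.mean()) ** 2) for g in groups)

df_between = len(groups) - 1                 # k - 1
df_within = len(all_scores) - len(groups)    # N - k

F = (ss_between / df_between) / (ss_within / df_within)
print(round(F, 2))  # 24.0
```

Note that SS_between + SS_within recovers the total sum of squares around the grand mean.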
What is MANOVA?
- MANOVA is an extension of ANOVA used where there are 2 or more DVs.
- Using MANOVA rather than multiple ANOVAs controls for familywise error across the multiple tests (just as ANOVA is preferred to a series of t-tests).
- If MANOVA is significant, researchers typically proceed to the univariate ANOVAs & then to analytical comparisons where necessary
- If MANOVA is not significant, no further tests are conducted
So I have multiple DVs, should I always run a MANOVA?
- No, there’s considerable controversy over its use – particularly because univariate ANOVAs are usually still needed afterwards to interpret the result.
- The DVs need to be at least moderately correlated; otherwise it wouldn’t be logical or appropriate to evaluate a linear combination of them.
- MANOVA is argued to have a lower power than multiple ANOVAs using a Bonferroni correction when the DVs are uncorrelated
When is a good time to use MANOVA?
- You need a sound theoretical reason for its use.
- MANOVA can be more powerful than a series of ANOVAs when small differences on individual DVs combine to produce an overall significant effect
- A smaller number of DVs is preferable, because a significant linear combination of many DVs is complex to interpret.
- To protect against Type 1 error
- To evaluate differences among groups on the linear combination of DVs
- Use MANOVA to create a linear combination of the DVs that maximises the differences between group means
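As a rough illustration of the multivariate test behind MANOVA, a NumPy sketch of Wilks' lambda from the between-groups (H) and within-groups (E) sum-of-squares and cross-products matrices. The data are simulated; a real analysis would use a dedicated stats package:

```python
import numpy as np

# Simulated scores: three groups of 10 participants on 2 DVs,
# with group means shifted so that groups genuinely differ.
rng = np.random.default_rng(1)
groups = [rng.normal(loc=m, size=(10, 2)) for m in (0.0, 0.5, 1.0)]

all_scores = np.vstack(groups)
grand_mean = all_scores.mean(axis=0)

# H: between-groups SSCP matrix; E: within-groups (error) SSCP matrix.
H = sum(len(g) * np.outer(g.mean(axis=0) - grand_mean,
                          g.mean(axis=0) - grand_mean) for g in groups)
E = sum((g - g.mean(axis=0)).T @ (g - g.mean(axis=0)) for g in groups)

# Wilks' lambda: the smaller the value, the larger the group
# differences on the linear combination of DVs.
wilks = np.linalg.det(E) / np.linalg.det(E + H)
print(0.0 < wilks < 1.0)
```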