Module 3 Flashcards
Correlation does what
evaluates the relationship between variables
correlation does not mean causation
positive correlation vs negative correlation
positive = the variables increase and decrease together
negative = inversely related; as one increases, the other decreases
How can correlations be looked at
scatterplot or correlation coefficient
Variable in correlation studies
not an independent or dependent variable since you are simply looking at how they are related to one another
positive linear relationship vs negative
How does each look on a scatterplot?
the line slopes upward for a positive relationship
the line slopes downward for a negative relationship
correlation coefficient
describes the strength and direction of a relationship
what is a -1 vs +1 in correlation coefficient
-1 perfect negative relationship
+1 perfect positive correlation
Pearson Correlation Coefficient assumptions
independence
normality
variables are interval/ratio level data
Pearson’s correlation coefficient null hypothesis
There will be no relationship between… and …
Pearson’s output
r is the correlation coefficient; p is the significance (p value)
r near 1 means a near-perfect positive correlation
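A minimal sketch of how r and p could be computed in Python, assuming made-up interval-level data and scipy.stats; the variable names and values are hypothetical, not the dataset behind the course output.

```python
from scipy import stats

# Hypothetical interval-level data for two variables (values are invented)
hours_studied = [2, 4, 5, 7, 8, 10]
exam_score    = [60, 68, 71, 80, 85, 92]

# Pearson r: strength and direction of the linear relationship, plus the p value
r, p = stats.pearsonr(hours_studied, exam_score)
print(f"r = {r:.3f}, p = {p:.3f}")  # r near +1 -> near-perfect positive correlation
```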
Spearman rank correlation coefficient assumptions
nonparametric test
assumptions - independence
DOESN’T require interval or ratio level data or normality
spearman rank correlation - what type of data can be used
ordinal data (like a Likert scale: agree, disagree, etc.)
Independent dependent variable in correlation coefficients?
none. You are testing the relationship so there is not a predictor or outcome variable
Spearman CC output
What does this mean
negative or inverse relationship between the tested variables; -0.392 is not a strong correlation, but it is significant since the p value was < .001
How do you read this table
the numbered columns correspond to the numbered variables listed down the side; *** marks statistically significant correlations
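A minimal sketch of a Spearman rank correlation in Python, assuming hypothetical ordinal (Likert-style) ratings and scipy.stats.spearmanr; it works on ranks, so normality and interval/ratio data are not required.

```python
from scipy import stats

# Hypothetical ordinal (Likert-scale) ratings from the same respondents
satisfaction = [1, 2, 2, 3, 4, 4, 5, 5]   # 1 = strongly disagree ... 5 = strongly agree
burnout      = [5, 4, 4, 3, 3, 2, 2, 1]

# Spearman rho: correlation of the ranks rather than the raw scores
rho, p = stats.spearmanr(satisfaction, burnout)
print(f"rho = {rho:.3f}, p = {p:.3f}")  # negative rho -> inverse relationship
```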
When you see using one variable to predict another, what does that mean?
regression
simple linear regression assumptions
normality, homogeneity of variances, DV interval/ratio
Regression line equation =
Y = a + bX
Y = dependent variable
a = constant (from output)
b = slope (from output)
X = independent variable
How to know if you can do a regression?
if the two variables tested are correlated, then a regression analysis can be done
What is R in regression analysis
What is R squared
R = correlation coefficient anywhere from -1 to +1
R squared = percent of variance in the dependent variable that can be explained by the variable being tested; e.g., 0.141 = 14.1%
To get the regression line equation in regression analysis, which column do you look at?
B column
constant row = a
independent variable row = b
Y = -9473.852 + 3349.145X
You would input a value of X (the independent variable) to predict Y (the dependent variable)
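A minimal sketch of a simple linear regression in Python using scipy.stats.linregress and made-up data; the numbers do not reproduce the output above, but the pieces map onto it: intercept = a (constant), slope = b, rvalue = R, rvalue squared = R squared.

```python
from scipy import stats

# Hypothetical data: X = independent (predictor) variable, Y = dependent (outcome) variable
years_experience = [1, 2, 3, 5, 7, 10]
salary           = [40000, 43000, 47000, 55000, 62000, 75000]

result = stats.linregress(years_experience, salary)
a = result.intercept          # constant (a) from the output
b = result.slope              # slope (b) from the B column
print(f"R = {result.rvalue:.3f}, R squared = {result.rvalue**2:.3f}")
print(f"Y = {a:.3f} + {b:.3f}X")

# Predict Y by plugging a value of X (the independent variable) into the equation
x_new = 4
y_pred = a + b * x_new
print(f"Predicted Y for X = {x_new}: {y_pred:.2f}")
```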
When are nonparametric tests used
When assumptions are not met for parametric tests, such as:
normally distributed data
homogeneity of variances
Can be used to examine nominal and ordinal level data
often do not analyze the raw data (ranks are analyzed instead)
Less powerful than parametric tests (unless assumptions are not met, in which case they are more powerful)
When comparing 2 independent groups what test would you use for parametric and nonparametric
parametric - T test for independent samples (evaluates means)
nonparametric - Mann-Whitney U test
use with ordinal data, unequal variances, or non-normal distributions (evaluates medians)
Mann-Whitney U tests what
whether the medians differ significantly between two groups
Mann-Whitney U test assumptions
random samples
independence of observations
DV is ordinal, interval, or ratio
Interpret the output
U = 0.500, p < .001
knowledge levels in the APRN group were significantly higher than knowledge levels in the brochure group
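A minimal sketch of a Mann-Whitney U test in Python with scipy.stats.mannwhitneyu and hypothetical knowledge scores for two independent groups; the group names mirror the flashcard example, but the numbers are invented.

```python
from scipy import stats

# Hypothetical knowledge scores for two independent groups (values are made up)
aprn_group     = [18, 19, 20, 20, 21, 22]
brochure_group = [10, 11, 12, 12, 13, 14]

# Mann-Whitney U: nonparametric comparison of two independent groups (medians/ranks)
u, p = stats.mannwhitneyu(aprn_group, brochure_group, alternative="two-sided")
print(f"U = {u:.3f}, p = {p:.3f}")
```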
Assumptions not met with paired t-test what would you use
Wilcoxon Signed-Rank test
Wilcoxon signed rank test evaluates what
evaluates the ranks of the differences between paired measurements
differences are ranked from smallest to largest, rather than analyzing the raw scores
Wilcoxon signed rank test assumptions
random samples
DV interval or ratio level data
NOT independent groups; paired measurements (e.g., pre- and post-test)
Interpret output for the Wilcoxon signed-rank test
Z = -1.807, p = 0.071
Fail to reject the null: there was no statistically significant difference in back pain before and after the yoga intervention
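A minimal sketch of a Wilcoxon signed-rank test in Python with scipy.stats.wilcoxon; the pre/post back-pain ratings are hypothetical paired measurements, not the data behind the output above.

```python
from scipy import stats

# Hypothetical paired (pre/post) back-pain ratings from the same participants
pain_before = [7, 6, 8, 5, 7, 6, 8, 7]
pain_after  = [5, 5, 7, 4, 6, 4, 6, 6]

# Wilcoxon signed-rank: ranks the differences between the paired measurements
stat, p = stats.wilcoxon(pain_before, pain_after)
print(f"W = {stat:.3f}, p = {p:.3f}")  # p > 0.05 -> fail to reject the null
```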
Comparing more than two groups with a nonparametric test: what test would it NOT be, and what would you use instead?
NOT ANOVA (parametric; compares means)
Use the Kruskal-Wallis test (compares medians)
Kruskal-Wallis test assumptions
random samples
independence
DV is ordinal, interval/ratio level
Interpret Kruskal-Wallis output
H(2) = 2.317, p = 0.314
(2) = degrees of freedom, which is the number of groups minus 1
Fail to reject the null hypothesis; there is no statistically significant difference
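A minimal sketch of a Kruskal-Wallis test in Python with scipy.stats.kruskal, comparing three hypothetical independent groups; with three groups, df = 3 - 1 = 2, which is the H(2) above.

```python
from scipy import stats

# Hypothetical scores for three independent groups (values are made up)
group_a = [12, 14, 15, 15, 17]
group_b = [13, 14, 16, 16, 18]
group_c = [11, 13, 15, 17, 19]

# Kruskal-Wallis: nonparametric alternative to one-way ANOVA (compares medians/ranks)
h, p = stats.kruskal(group_a, group_b, group_c)
df = 3 - 1  # degrees of freedom = number of groups minus 1
print(f"H({df}) = {h:.3f}, p = {p:.3f}")
```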
what test would you use if you wanted to compare categorical variables (nominal or ordinal)
Chi-square test
Chi-square assumptions
independence
expected counts are greater than 1 and no more than 20% of cells are less than 5
If Chi-square assumptions aren’t met, what would you use?
Fisher’s Exact Test
Can you use chi-square with this? (expected counts are greater than 1 and no more than 20% of cells are less than 5)
Yes, because the expected counts in the cells are not less than 5
Interpret
You can use Chi-square because expected counts are greater than 1 and no more than 20% of cells are less than 5, which is displayed in b.
χ²(1) = 0.047, p = 0.828
Fail to reject the null hypothesis; there is no statistically significant relationship
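A minimal sketch of a chi-square test of independence in Python with scipy.stats.chi2_contingency and a hypothetical 2x2 table of counts; the returned table of expected counts is what you check against the "greater than 1 / no more than 20% of cells below 5" rule.

```python
from scipy import stats

# Hypothetical 2x2 contingency table of observed counts (nominal variables)
#                 outcome_yes  outcome_no
observed = [[30, 20],   # group 1
            [28, 22]]   # group 2

chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi-square({dof}) = {chi2:.3f}, p = {p:.3f}")
print("expected counts:")   # check the assumption here:
print(expected)             # all > 1 and no more than 20% of cells < 5
```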