Interpreting Measures Flashcards
Kendall’s tau
Rank-order correlation for ordinal data (based on concordant vs. discordant pairs)
Spearman’s rho
Rank-order correlation for ordinal data (Pearson correlation applied to ranks)
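Both are rank correlations and are implemented in SciPy. A minimal sketch on hypothetical ordinal ratings from two raters:

```python
# Minimal sketch: both rank correlations via SciPy, on hypothetical
# ordinal ratings from two raters (data invented for illustration).
from scipy.stats import kendalltau, spearmanr

rater_a = [1, 2, 2, 3, 4, 5]
rater_b = [1, 2, 3, 3, 5, 5]

tau, tau_p = kendalltau(rater_a, rater_b)
rho, rho_p = spearmanr(rater_a, rater_b)

print(f"Kendall's tau: {tau:.2f} (p = {tau_p:.3f})")
print(f"Spearman's rho: {rho:.2f} (p = {rho_p:.3f})")
```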
Correlation
Degree of association between two sets of data
Does not:
Tell us the extent of agreement (see the sketch below)
Give a sufficient measure of reliability
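A minimal sketch of that limitation: rater B is always two points higher, so association is perfect (rho = 1.0) even though the raters never agree on a single score (data are hypothetical):

```python
# Perfect association, zero agreement: rater B's scores track rater A's
# exactly but sit two points higher (hypothetical data).
from scipy.stats import spearmanr

rater_a = [1, 2, 3, 4, 5]
rater_b = [3, 4, 5, 6, 7]  # systematic +2 offset

rho, _ = spearmanr(rater_a, rater_b)
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

print(f"Spearman's rho: {rho:.1f}")           # 1.0: perfect association
print(f"Percent agreement: {agreement:.0%}")  # 0%: no agreement at all
```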
Percent agreement
The extent to which observers agree in their ratings
Does not consider agreement that would occur by chance, so it is only meaningful when the categories occur with similar frequencies
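A minimal sketch of the computation on hypothetical nominal ratings; it simply counts matching scores, which is why agreement occurring by chance inflates it:

```python
# Percent agreement: the proportion of items on which two raters match
# (hypothetical nominal data; chance agreement is not corrected for).
rater_a = ["yes", "no", "yes", "yes", "no", "yes"]
rater_b = ["yes", "no", "no", "yes", "no", "yes"]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
print(f"Percent agreement: {matches / len(rater_a):.0%}")  # 5/6 = 83%
```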
Kappa Statistic
Indicates the proportion of agreement beyond that expected by chance (it does not tell whether the remaining differences are random or systematic)
Used to express reliability for nominal or ordinal data
Only agreement beyond that expected by chance can be considered agreement
Represents an "average" rate of agreement for the entire set of scores (doesn't tell you where the discrepancies lie)
Examples: inter-rater, intra-rater, and test-retest reliability
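A minimal sketch of the computation from the definition, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each rater's marginal category frequencies (data are hypothetical; scikit-learn's cohen_kappa_score gives the same value):

```python
# Cohen's kappa by hand: kappa = (p_o - p_e) / (1 - p_e).
# Hypothetical data; sklearn.metrics.cohen_kappa_score(rater_a, rater_b)
# returns the same value.
from collections import Counter

rater_a = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
rater_b = ["yes", "no", "no", "yes", "no", "yes", "no", "no"]
n = len(rater_a)

# p_o: observed agreement, the raw proportion of matching ratings.
p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# p_e: chance agreement, summing the product of the two raters'
# marginal proportions for every category.
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a.keys() | freq_b.keys())

kappa = (p_o - p_e) / (1 - p_e)
print(f"p_o = {p_o:.2f}, p_e = {p_e:.2f}, kappa = {kappa:.2f}")  # kappa = 0.53
```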
Kappa Scores
Range from –1 to 1
1: perfect agreement
0: agreement no better than if raters guessed
Negative K: agreement worse than expected by chance
Interpretation is always dependent on situation
>.80 is "excellent"
.60–.80 is "substantial"
.40–.60 is "moderate"
<.40 is "poor to fair"
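A minimal helper encoding the benchmark ranges above; the cutoffs are rules of thumb, and interpretation still depends on the situation:

```python
# Maps a kappa value to the benchmark labels listed above.
def interpret_kappa(kappa: float) -> str:
    if kappa > 0.80:
        return "excellent"
    if kappa > 0.60:
        return "substantial"
    if kappa >= 0.40:
        return "moderate"
    return "poor to fair"

print(interpret_kappa(0.53))  # "moderate", matching the worked kappa above
```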