Clinical Outcome Measures Flashcards
4 Psychometric properties of reliability
Internal consistency
Test-retest
Intra-rater
Inter-rater
Internal Consistency
Definition
Consistency of construct across individual items of outcome measure
Internal Consistency
Study Design
Administer the outcome measure to a group of people -> analyze the inter-item correlation across the measure's items
Internal Consistency
Statistical Results
Cronbach's alpha
measures inter-item correlation
Ideal: 0.7-0.9
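A minimal sketch of how Cronbach's alpha could be computed, assuming a small, invented matrix of item scores (respondents in rows, items in columns); all names and data below are hypothetical:

import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                              # number of items
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 6 respondents answering a 4-item outcome measure
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")  # ideal range roughly 0.7-0.9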
Internal Consistency
Appraisal Considerations
Sample size
Participants show wide diversity on the outcome measure
Test-retest
Definition
Consistency of a test when given to a person (whose outcome is unchanged) on two different occasions
Test-retest
Study Design
One rater administers the test to the same people on different days
Test-retest
Statistical Results
ICC or Kappa
Closer to 1 better
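A minimal sketch of a test-retest ICC, assuming continuous scores and the ICC(2,1) form (two-way random effects, absolute agreement, single measures); the patients and scores below are invented:

import numpy as np

def icc_2_1(ratings):
    """ICC(2,1) for a (subjects x occasions) matrix of continuous scores."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_subjects = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_occasions = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_error = ((x - grand) ** 2).sum() - ss_subjects - ss_occasions
    msr = ss_subjects / (n - 1)            # between-subject mean square
    msc = ss_occasions / (k - 1)           # between-occasion mean square
    mse = ss_error / ((n - 1) * (k - 1))   # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical data: 5 stable patients, same test given on two different days
day1 = [42, 55, 38, 61, 47]
day2 = [44, 53, 40, 60, 49]
print(f"ICC(2,1) = {icc_2_1(np.column_stack([day1, day2])):.2f}")  # closer to 1 is better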
Test-retest
Appraisal considerations
Sample size
Wide diversity in outcome measure
Intra-rater
Definition
Consistency of raters when compared to themselves on two different occasions
Intra-rater
Study Designs
Each therapist gives the test to the same people at different times; each therapist's scores are compared with their own earlier scores
Intra-rater
Statistical Results
ICC or Kappa
Closer to 1 better
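For a categorical outcome measure, kappa corrects raw agreement for chance agreement. A minimal sketch using scikit-learn's cohen_kappa_score; the therapist's two sets of grades below are made up:

from sklearn.metrics import cohen_kappa_score

# Hypothetical 3-level grades given by the same therapist to the same 10 patients,
# one week apart, with the patients assumed stable in between.
week1 = ["good", "fair", "poor", "good", "fair", "good", "poor", "fair", "good", "fair"]
week2 = ["good", "fair", "poor", "good", "good", "good", "poor", "fair", "good", "fair"]

kappa = cohen_kappa_score(week1, week2)    # chance-corrected agreement
print(f"Intra-rater kappa = {kappa:.2f}")  # closer to 1 is better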
Intra-rater
Appraisal considerations
Sample size
Diversity in outcome measure
Stable in characteristic of interest
Same circumstances for assessment each time
Time appropriate between assessments
Inter-rater
Definition
Consistency of raters compared to each other
Inter-rater
Study Design
Therapists measure same participants and their scores are compared
Inter-rater
Statistical Results
ICC or Kappa
Closer to 1 better
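A minimal sketch of an inter-rater ICC on long-format data, assuming the third-party pingouin package and its intraclass_corr function; the therapists, patients, and scores are invented:

import pandas as pd
import pingouin as pg

# Hypothetical inter-rater study: 3 therapists each score the same 4 patients once.
long = pd.DataFrame({
    "patient":   [1, 2, 3, 4] * 3,
    "therapist": ["A"] * 4 + ["B"] * 4 + ["C"] * 4,
    "score":     [30, 45, 28, 52, 32, 44, 27, 55, 31, 46, 30, 53],
})

icc = pg.intraclass_corr(data=long, targets="patient", raters="therapist", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])  # ICC2 reflects agreement between raters scoring the same patients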
Inter-rater
Appraisal considerations
Sample size
Diversity in outcome measure
Stable in characteristic of interest
Same circumstances each assessment
Time appropriate between assessments
3 Main groups of Validity and their subcategories
- Content
  - face
- Criterion
  - concurrent
  - predictive
- Construct
  - convergent
  - discriminative
  - known groups
Content Validity Characteristics
Def: the measure includes all the content needed to cover the construct
Design: expert panel conducts extensive assessment
Statistics: none
Considerations: diversity of experts, several rounds of giving opinions, transparent process
Face Validity Characteristics
Def: on its face, the measure appears to assess what it claims to ("makes sense to me")
Design: reviewers read the measure and judge whether it looks appropriate
Stats: none
Consider: diversity of experts, transparent process
Criterion Validity Characteristics
Def: compare to established measure
Design: compare the OM of interest to an established OM
Stats: Spearman rho or Pearson correlation coefficient
-1 to 1
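A minimal sketch of a criterion validity correlation with SciPy; the new measure's scores and the established ("gold standard") scores below are invented:

from scipy.stats import pearsonr, spearmanr

# Hypothetical scores from the same participants on the new OM and an established OM
new_om  = [12, 18, 25, 31, 40, 44, 52, 60]
gold_om = [10, 20, 27, 30, 42, 41, 55, 63]

r, _ = pearsonr(new_om, gold_om)     # linear association for interval/ratio data
rho, _ = spearmanr(new_om, gold_om)  # rank-based association for ordinal data
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")  # both range from -1 to 1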
Criterion (concurrent and predictive) Validity Appraisal Considerations
Gold standard credible?
Blinding?
Everyone does both assessments?
Time appropriate between assessments?
Concurrent Validity Characteristics
Def: two measures correlate at same time point
Design: comparison at same time
Stats: Spearman rho
-1 to 1
Predictive Validity Characteristics
Def: OM of interest is correlated with another OM at a later time point
Design: OM of interest is compared to the later OM
Stats: Spearman rho
-1 to 1