New - Ch 5 (NoteLM) Flashcards
Discriminant Validity
The extent to which a measure does not correlate with measures of different constructs.
Convergent Validity
The extent to which a measure correlates with other measures of the same or similar constructs.
Criterion Validity
The extent to which a measure is related to an outcome or behavior that it should be related to.
Content Validity
The extent to which a measure covers all aspects of the construct it is intended to measure.
Face Validity
The extent to which a measure appears, on the surface, to measure what it is intended to measure.
Validity
The extent to which a measure accurately assesses the construct it is intended to measure.
Correlation Coefficient (r)
A statistical measure that quantifies the strength and direction of the linear relationship between two variables.
Ranges from -1 to +1, where -1 indicates a perfect negative correlation, +1 indicates a perfect positive correlation, and 0 indicates no linear correlation.
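A minimal sketch of the standard Pearson formula, using made-up data to show the two endpoints of the range described above:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = math.sqrt(sum((xi - mx) ** 2 for xi in x) *
                    sum((yi - my) ** 2 for yi in y))
    return num / den

# A perfect positive linear relationship gives r = +1
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))   # 1.0
# Reversing one variable gives a perfect negative relationship, r = -1
print(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]))   # -1.0
```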
Kappa Coefficient
Measures inter-rater reliability or agreement between two raters for categorical variables.
Ranges from -1 to +1, where +1 indicates perfect agreement, 0 indicates agreement equivalent to chance, and negative values suggest less agreement than expected by chance.
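A minimal sketch of Cohen's kappa for two raters (the rating data below is hypothetical). Observed agreement is corrected by the agreement expected from each rater's marginal proportions alone:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical judgments."""
    n = len(rater_a)
    # Observed agreement: proportion of items where the raters match
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal category proportions
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in set(rater_a) | set(rater_b))
    return (po - pe) / (1 - pe)

a = ['yes', 'yes', 'no', 'no', 'yes', 'no']
b = ['yes', 'no',  'no', 'no', 'yes', 'yes']
print(cohens_kappa(a, b))   # po = 4/6, pe = 0.5, so kappa = 1/3
print(cohens_kappa(a, a))   # perfect agreement: kappa = 1.0
```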
Internal Reliability
Consistency of responses across multiple items within a measure.
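Internal reliability is commonly quantified with Cronbach's alpha (a statistic not named on the card). A minimal sketch with hypothetical data, where each item is a list of scores across respondents:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = len(item_scores)
    item_vars = sum(pvariance(item) for item in item_scores)
    # Total score per respondent, summing across items
    totals = [sum(person) for person in zip(*item_scores)]
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Three items that rank the four respondents identically are maximally
# consistent with one another, giving alpha = 1.0
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(cronbach_alpha(items))
```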
Interrater Reliability
Consistency of scores obtained by different observers rating the same behavior or event.
Test-Retest Reliability
Consistency of scores on a measure across multiple administrations.
Measurement Error
The difference between the observed score and the true score, caused by factors that distort the measurement.
True Score
A hypothetical score that represents a participant’s actual standing on a construct, without any measurement error.
Observed Score
The score obtained on a measure, which includes both the true score and measurement error.
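The relation observed score = true score + error can be illustrated with a small simulation (all values hypothetical): because random error averages toward zero, the mean of many observed scores approaches the true score.

```python
import random

random.seed(42)

true_score = 80   # hypothetical true standing on the construct
error_sd = 5      # spread of the random measurement error

# Each administration adds random error to the (unobservable) true score
observed = [true_score + random.gauss(0, error_sd) for _ in range(10_000)]

mean_observed = sum(observed) / len(observed)
print(mean_observed)   # close to 80, since the errors cancel out on average
```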
Reliability
The consistency of a measure; a reliable measure yields similar results across repeated use under the same conditions.