Ch. 15 Flashcards
Interrater reliability
Two or more observers should independently record the same observation, or one observer should rate the same behavior on more than one occasion
Validity
Measure accurately = does the instrument measure what it claims to measure?
Types of validity
Content, criterion-related, and construct
Concurrent validity
Degree of correlation between two measures
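Concurrent validity is usually quantified with a correlation coefficient between the two measures. A minimal sketch in Python (the instruments and scores below are hypothetical, invented for illustration):

```python
def pearson_r(x, y):
    # Pearson correlation between two sets of scores
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores: a new anxiety scale and an established one,
# administered to the same subjects at roughly the same time.
new_scale = [12, 15, 11, 18, 14, 16]
established = [30, 36, 28, 44, 33, 40]
r = pearson_r(new_scale, established)  # a high r supports concurrent validity
```

A correlation near 1 indicates the new instrument ranks subjects much like the established one does.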
Predictive validity
Degree of correlation between a measure of a concept and a future measure of the same concept
Face validity
An intuitive, subjective judgment that the instrument appears to measure the concept it is intended to measure
Construct validity
Test measures a theoretical construct or trait
Divergent validity
Differentiates one construct from others that are similar
Reliability
Measure consistently
Attributes of reliability
Stability, homogeneity (internal consistency), and equivalence
Reliability coefficient
Expresses the relationship among error variance, true variance, and the observed score; ranges from 0 to 1, with higher values indicating greater reliability
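Classically, the reliability coefficient is the proportion of observed-score variance that is true variance. A sketch with made-up variance components:

```python
def reliability_coefficient(true_variance, error_variance):
    # Observed variance decomposes into true variance plus error variance;
    # reliability is the fraction of observed variance that is "true".
    observed_variance = true_variance + error_variance
    return true_variance / observed_variance

# Hypothetical variance components for a test
r = reliability_coefficient(true_variance=80.0, error_variance=20.0)  # 0.8
```

As error variance shrinks toward zero, the coefficient approaches 1 (perfect reliability).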
Parallel-form (alternate-form) reliability
Administering two comparable forms of an instrument that measure the same concept
Interrater reliability
Used when two or more investigators are collecting data; a kappa of 0.9 indicates good agreement between raters
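Cohen's kappa adjusts raw percent agreement for the agreement expected by chance. A minimal sketch for two raters coding the same observations (the behavior codes below are invented):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of marginal proportions
    ca, cb = Counter(rater_a), Counter(rater_b)
    categories = set(ca) | set(cb)
    expected = sum((ca[c] / n) * (cb[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical binary behavior codes from two independent observers
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
kappa = cohens_kappa(a, b)
```

Because kappa subtracts chance agreement, it is lower than raw percent agreement whenever the raters could have agreed by luck alone.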
A measure can be considered to be reliable while…
…not being valid; consistency alone does not guarantee the instrument measures what it claims to measure
Error variance
The extent of variability in test scores that is attributable to chance or random error rather than to true differences in the attribute being measured