Test Construction Flashcards
Alternate Forms Reliability
coefficient of equivalence
Coefficient Alpha (Cronbach’s Alpha)
method for assessing internal consistency reliability when items are not answered dichotomously
KR-20
method for assessing internal consistency reliability when items are answered dichotomously (they are either correct or not correct)
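KR-20 follows the same structure as alpha but replaces each item variance with p × q (proportion correct × proportion incorrect). A sketch, with hypothetical 0/1 response data:

```python
from statistics import pvariance

def kr20(responses):
    """KR-20 for dichotomously scored items.

    responses[i][j] = 1 if person i answered item j correctly, else 0.
    """
    k = len(responses[0])                         # number of items
    n = len(responses)                            # number of examinees
    items = list(zip(*responses))                 # one tuple per item
    pq_sum = sum((sum(col) / n) * (1 - sum(col) / n) for col in items)
    total_var = pvariance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - pq_sum / total_var)

# Hypothetical data: 4 examinees x 3 right/wrong items
data = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(kr20(data))  # 0.75 for this sample
```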
Kappa Statistic
used to measure inter-rater reliability when data are nominal or ordinal (discontinuous)
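For two raters assigning nominal codes, Cohen's kappa compares observed agreement to the agreement expected by chance: kappa = (p_o − p_e) / (1 − p_e). A sketch with hypothetical yes/no ratings:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' nominal codes (parallel lists)."""
    n = len(rater_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codes from two raters on 5 cases
a = ["yes", "yes", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no"]
print(round(cohen_kappa(a, b), 3))  # about 0.615 for this sample
```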
Test-Retest Reliability
yields a coefficient of stability
Spearman-Brown Formula
corrects for the artificially low reliability coefficient from testing via split-half reliability (low coefficient due to shorter test length)
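The general Spearman-Brown prophecy formula is r_new = n·r / (1 + (n−1)·r), where n is the factor by which test length changes; with n = 2 it corrects a split-half correlation up to full test length. A sketch:

```python
def spearman_brown(r, factor=2.0):
    """Predicted reliability when test length is multiplied by `factor`.

    factor=2 is the split-half correction: the half-test correlation r
    is stepped up to estimate full-length reliability.
    """
    return factor * r / (1 + (factor - 1) * r)

# A split-half correlation of .60 corresponds to a full-length
# reliability of .75
print(spearman_brown(0.6))  # 0.75
```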
Size of reliability coefficient
smaller when items can be answered correctly by random guessing (guessing adds error variance)
Difficulty Index
between 0 (no one answers the item correctly) and 1 (everyone answers it correctly); equals the proportion of examinees who answer the item correctly
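The difficulty index is simply the proportion of examinees who answered the item correctly; a one-line sketch with hypothetical 0/1 responses:

```python
def difficulty_index(item_responses):
    """Item difficulty p: proportion of examinees answering correctly.

    item_responses is a list of 0/1 scores, one per examinee.
    """
    return sum(item_responses) / len(item_responses)

# Hypothetical data: 3 of 4 examinees got the item right
print(difficulty_index([1, 1, 1, 0]))  # 0.75
```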
orthogonal factors v. oblique factors
orthogonal=uncorrelated (independent), oblique=correlated (dependent)
Concurrent Validity
type of criterion-related validity; the extent to which test scores correlate with an external criterion measured at about the same time
Divergent (Discriminant) Validity
the extent to which scores on a measure do NOT correlate with scores on measures of unrelated traits; a large coefficient is bad (it indicates poor divergent validity)
cross-validation
done during test revision; associated with “shrinkage” of the criterion-related validity coefficient in the new sample
external validity
researcher’s ability to generalize the results of the study to other individuals, settings, conditions
internal validity
researcher’s ability to determine whether there is a causal relationship between variables
Pearson r
method for measuring inter-rater reliability; also used to calculate criterion-related validity when both measures are on a continuous scale
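The Pearson product-moment correlation is the covariance of the two variables divided by the product of their standard deviations. A sketch with hypothetical paired scores:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation for two continuous variables."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: perfectly linearly related scores give r = 1
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0
```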
methods of assessing internal consistency reliability
-split-half (must correct with Spearman-Brown) -KR-20 -Cronbach’s alpha
4 methods of assessing reliability
inter-rater, internal consistency, alternate forms, test-retest