Week 2 (Validity) Flashcards
Why is validity the most crucial property of a test?
It is the meaning of the test and its scores; it goes back to the purpose of the test.
Definition of validity
“The degree to which evidence and theory support the interpretation of test scores for proposed uses of tests”
The degree to which a test measures what it intends to measure
Validity stems from accumulated evidence; the validation process continues over time
Validity nuances (extra descriptors)
Validity is not a property of a test itself but of test use and interpretation.
Validity often measured indirectly through inferences
Validity is based on the developers' understanding of what they are measuring (the construct), not on the construct itself, so the developers' opinion has a strong influence on the process of test validation
Who is responsible for validity
Test developer
Steps in the validation process (3)
Identify and describe the construct that forms the basis of the test (underlying theory? Underlying constructs?)
Create test questions to measure that construct
Show that the process has been successful by carrying out the validation process and documenting the results
What is a construct
What we set out to measure. Anything created by the human mind that is not directly observable (aka latent variable)
Constructs differ by breadth/complexity, potential applicability, and degree of abstraction
How do we identify and describe a construct?
Theory, previous research, and behavioral observation
Test construction and item analysis
Qualitative item analysis (evaluates how the test was constructed):
Are items appropriate to the purpose of the test and the population for whom the test is designed?
Are items clearly expressed?
Are items grammatically correct? Adhering to writing rules?
Are items relatively free of bias or offensive portrayals of subgroups?
Quantitative item analysis (tests whether the items are measuring the construct; see the sketch below):
Item response theory
Item discrimination, item difficulty, and fairness
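A minimal sketch of classical (non-IRT) item statistics for this flashcard, assuming a small 0/1-scored response matrix (rows = examinees, columns = items). All numbers are hypothetical and only illustrate how item difficulty and item discrimination can be computed.

```python
import numpy as np

# Hypothetical responses: 6 examinees x 4 items (1 = correct, 0 = incorrect)
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
])

total_scores = responses.sum(axis=1)

# Item difficulty (p-value): proportion of examinees answering the item correctly.
difficulty = responses.mean(axis=0)

# Item discrimination: correlation between each item and the total score
# (a simple item-total correlation; corrected versions exclude the item itself).
discrimination = np.array([
    np.corrcoef(responses[:, j], total_scores)[0, 1]
    for j in range(responses.shape[1])
])

for j, (p, d) in enumerate(zip(difficulty, discrimination), start=1):
    print(f"Item {j}: difficulty p = {p:.2f}, discrimination r = {d:.2f}")
```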
Difficulties in measuring validity
What you are trying to measure is often abstract and based on theory, may have various definitions (each calling for a different approach), and may only be accessible by indirect means
Ways to establish validity (3)
Content-related, criterion-related (concurrent and predictive), and construct-related
Content-related validity
“The representativeness and relevance of the instrument to the construct being measured”
Items are based on the conceptualization of what the test developers set out to measure
Assessment is often non-statistical and relies on the judgement of subject matter experts
How well does the content of the instrument itself relate to the construct being measured?
Face validity!!!
Criterion-related validity
“Comparing test scores with some sort of performance on an outside measure”
Outside measures need to have a theoretical relation to the variable being assessed (an IQ test and GPA, for example)
Criterion-related validity: concurrent
Measures taken at the same time as the test
Criterion-related validity: predictive
Outside measures that were taken some time after the test scores were derived
Interpreting correlations in criterion-related validity
The relationship between the test in development and the criterion measure is usually described by a correlation coefficient r, which ranges from -1 to +1 (see the sketch below)
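A minimal sketch of computing a criterion-related validity coefficient, assuming we have scores on the test under development and an outside criterion measure (e.g., GPA collected later, as in predictive validity). All numbers are hypothetical.

```python
import numpy as np

test_scores = np.array([98, 105, 110, 121, 130, 95, 115, 102])    # test in development
criterion   = np.array([2.7, 3.0, 3.2, 3.6, 3.8, 2.5, 3.4, 2.9])  # criterion measure, e.g., GPA

# Pearson correlation coefficient r, ranging from -1 to +1; larger absolute
# values indicate a stronger linear relationship between test and criterion.
r = np.corrcoef(test_scores, criterion)[0, 1]
print(f"Validity coefficient r = {r:.2f}")
```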