Test Construction: Validity Flashcards

1
Q

Validity: Overview and 3 types

A

Validity = Accuracy = test measures what it is intended to measure

Content validity

Construct validity

Criterion-Related validity

2
Q

Content Validity

A

Test items are representative of the target domain

i.e., the content domain was adequately sampled

3
Q

Content Validity vs. Face Validity

A

Content = test items adequately sampled the relevant domain

Face = test “looks like” it measures what it is intended to measure

4
Q

Content Validity Ratio (CVR)

A

Index of content validity based on expert judgments: CVR = (nₑ − N/2) / (N/2), where nₑ = number of subject matter experts rating an item "essential" and N = total number of raters (Lawshe's formula)

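A minimal sketch of Lawshe's CVR computation in Python, using a hypothetical panel (the counts below are made up for illustration):

def content_validity_ratio(n_essential, n_total):
    # Lawshe's CVR = (n_e - N/2) / (N/2); ranges from -1 (no expert
    # rates the item essential) to +1 (every expert rates it essential).
    half = n_total / 2
    return (n_essential - half) / half

# Hypothetical panel: 8 of 10 subject matter experts rate an item "essential".
print(content_validity_ratio(8, 10))  # (8 - 5) / 5 = 0.6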
5
Q

Construct Validity

A

Test measures the hypothetical trait (construct) it is intended to measure

6
Q

Construct Validity: Various Methods of measurement

A

Internal Consistency

Group Differences

Hypothesis Testing

Convergent and Discriminant Validity

Factorial Validity

7
Q

Construct Validity: Convergent and Discriminant Validity, Overview

A

Compare scores on different measures/tests that:

Aim to test the same trait (convergent)

Do not aim to test the same trait (discriminant/divergent)

8
Q

Construct Validity: High Convergent Validity

A

High correlations with measures of the same or related traits

e.g., BAI & GAD-7 (both measure anxiety)

9
Q

Construct Validity: High Discriminant Validity

A

Low correlations with measures of unrelated characteristics

e.g. BAI & SASSI

10
Q

Construct Validity: The Multitrait-Multimethod Matrix

A

A table of correlations organized into monotrait/heterotrait and monomethod/heteromethod combinations

Used to evaluate convergent and discriminant validity

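An illustrative Python sketch with made-up correlations, showing the combinations the matrix is read for (the traits, methods, and r values are all hypothetical):

# Two traits (anxiety, depression), each measured by two methods
# (self-report, clinician rating); r values are invented for illustration.
pairs = [
    ("anxiety/self-report vs anxiety/clinician", 0.70),       # monotrait-heteromethod: large -> convergent validity
    ("anxiety/self-report vs depression/self-report", 0.25),  # heterotrait-monomethod: small -> discriminant validity
    ("anxiety/self-report vs depression/clinician", 0.15),    # heterotrait-heteromethod: smallest of all
]
for label, r in pairs:
    print(f"{label}: r = {r}")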
11
Q

Construct Validity: Factor Analysis

A

Factor analysis identifies the minimum number of common factors (dimensions) required to account for inter-correlations among tests/subtests/items

12
Q

Factor Analysis: Communality

A

Variability in scores accounted for by all identified factors; aka common variance

13
Q

Criterion-Related Validity: Purpose

A

To estimate an examinee's standing or performance on another measure (the criterion)

e.g.: Employee selection test (predictor) used to estimate how well an applicant will do on a measure of job performance (external criterion) after they are hired

14
Q

The Two Forms of Criterion-Related Validity

A

Concurrent validity

Predictive validity

15
Q

Concurrent Validity

A

Criterion data are collected before or at about the same time as data on the predictor

Predictor is used to estimate current status
e.g., whether an applicant's current job performance is acceptable

16
Q

Predictive Validity

A

Criterion data are measured some time after the predictor has been administered

e.g. predict future job performance

17
Q

“Come together to build”

A

Converge to Construct

Construct Validity is the type associated with convergent and discriminant validity

18
Q

Standard Error of Estimate (SEE)

A

Error in regression-based prediction for criterion-related validity

Used to create a confidence interval around an estimated (predicted) score

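A minimal Python sketch of the standard SEE formula, SEE = SD(Y) × √(1 − r²), with hypothetical values for the criterion SD, validity coefficient, and predicted score:

import math

def standard_error_of_estimate(sd_criterion, r_xy):
    # SEE shrinks as validity grows: r = 0 gives SEE = SD(Y) (no better
    # than guessing the mean); r = 1 gives SEE = 0 (perfect prediction).
    return sd_criterion * math.sqrt(1 - r_xy ** 2)

# Hypothetical: criterion SD = 10, validity coefficient r = .60.
see = standard_error_of_estimate(10, 0.60)  # 8.0
predicted = 50
print(f"95% CI: {predicted - 1.96 * see:.1f} to {predicted + 1.96 * see:.1f}")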
19
Q

SEM vs SEE

A

SEM = used to create CI for test score

SEE = used to create CI for predicted score

“SEE the future to confidently predict it”

20
Q

Incremental Validity: Exam Takeaways

A

Predictor determines whether a person is classified as a positive or a negative (score on the first measure)

Criterion determines whether that classification is a "true" or a "false" (comparison with the second measure)

21
Q

Criterion-Related Validity: Predictive accuracy indexes (6)

A

Sensitivity and Specificity

Positive and Negative predictive value

Positive and Negative likelihood ratio

22
Q

Criterion-Related Validity: Sensitivity and Specificity

A

Examinees are already known to have or not have the disorder of interest

Sensitivity: percent of those with the disorder whom the predictor accurately identifies

Specificity: percent of those without the disorder whom the predictor accurately identifies

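A minimal Python sketch computing all six indexes from card 21, assuming hypothetical true/false positive and negative counts:

# Hypothetical counts: predictor result vs. known disorder status (criterion).
tp, fp, fn, tn = 40, 10, 5, 45

sensitivity = tp / (tp + fn)  # % of those WITH the disorder flagged by the predictor
specificity = tn / (tn + fp)  # % of those WITHOUT the disorder cleared by the predictor
ppv = tp / (tp + fp)          # positive predictive value: % of positives that are true
npv = tn / (tn + fn)          # negative predictive value: % of negatives that are true
lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio

print(sensitivity, specificity, ppv, npv, lr_pos, lr_neg)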
23
Q

Criterion-Related Validity: Sensitivity and Specificity, MNEMONIC

A

“Please don’t ask me about my disorder, it’s a sensitive topic, thanks”

“I would like to specify that I do NOT have the disorder!”

24
Q

How Reliability is necessary but not sufficient for Validity

A

Low reliability prevents high content, construct or criterion-related validity

*But high reliability DOES NOT guarantee validity

25
Q

Criterion-Related Validity: Correction for Attenuation

A

Estimate of what the predictor's validity coefficient would be if the predictor and/or the criterion were perfectly reliable

"A 10 out of 10 is perfect prediction" [a 10 = ATTENuation]
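A minimal Python sketch of the standard correction-for-attenuation formula, with hypothetical validity and reliability values:

import math

def correct_for_attenuation(r_xy, r_xx, r_yy):
    # Estimated validity coefficient if the predictor (reliability r_xx)
    # and the criterion (reliability r_yy) were perfectly reliable.
    return r_xy / math.sqrt(r_xx * r_yy)

# Hypothetical: observed validity .42, predictor reliability .81, criterion reliability .64.
print(round(correct_for_attenuation(0.42, 0.81, 0.64), 2))  # 0.58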
26
Q

Criterion-Related Validity: Criterion Contamination

A

When criterion scores are affected by knowledge of predictor scores
27
Q

Construct Validity: Factorial Validity

A

High correlations with related factors (convergent)

Low correlations with unrelated factors (discriminant)
28
Q

Multitrait-Multimethod Combinations: High Convergent Validity

A

Large Monotrait-Heteromethod correlations (CV = LMH)
29
Q

Multitrait-Multimethod Combinations: High Discriminant Validity

A

Small Heterotrait-Monomethod correlations (DV = SHM)
30
Q

Construct

A

Abstract characteristic that cannot be observed directly but must be inferred by observing its effects

e.g., intelligence, self-esteem, neuroticism
31
Q

Factor Analysis: Communality Calculation

A

Sum of the squares of each factor loading

e.g.: For Test A, Factor I = .80, Factor II = .20
Communality = .64 (.80²) + .04 (.20²) = .68
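The card's worked example in Python (the .80 and .20 loadings are the card's own hypothetical values):

loadings = [0.80, 0.20]  # hypothetical factor loadings for Test A
communality = sum(loading ** 2 for loading in loadings)
print(round(communality, 2))  # 0.64 + 0.04 = 0.68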
32
Q

Two Components of Reliability: Communality and Specificity

A

Reliability = Communality + Specificity

Communality: variability in scores due to factors shared in common with other tests

Specificity: unique, reliable variability not explained by the common factors
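A toy Python illustration of the decomposition, with hypothetical values that echo the communality example above:

communality = 0.68  # shared (common) variance, from the factor analysis
specificity = 0.17  # reliable variance unique to this test (hypothetical)
reliability = communality + specificity
print(round(reliability, 2))  # 0.85 -- reliability can never fall below the communality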
33
Q

Communality as a lower-limit estimate of reliability

A

Communality is only one part of true score variability (the other part is specificity)

Therefore the reliability coefficient can never be lower than the communality