Validity Flashcards

1
Q

A judgment or estimate of how well a test measures what it purports to
measure in a particular context.

A

Validity

2
Q

The agreement between a test score or measure and the
quality it is believed to measure

A

Validity

3
Q

The process of gathering and evaluating evidence about
validity.

A

Validation

4
Q

require professional time and know-how, and
they may be costly.

A

Local validation studies

5
Q

are absolutely necessary when the
test user plans to alter in some way the format, instructions, language, or
content of the test.

A

Local validation studies

6
Q

The degree to which all accumulated evidence supports the intended
interpretation of test scores for the intended purpose.

A

Validity

7
Q

The process of gathering and evaluating evidence about
validity.

A

Validation

8
Q

require professional time and know-how, and
they may be costly.

A

Local validation studies

9
Q

This measure of validity is based on an evaluation of
the subjects, topics, or content covered by the items in the test

A

Content validity

10
Q

This measure of validity is obtained by
evaluating the relationship of scores obtained on the test to scores on
other tests or measures.

A

Criterion-related validity

11
Q

This measure of validity is arrived at by executing a
comprehensive analysis of:
• How scores on the test relate to other test scores and measures.
• How scores on the test can be understood within some theoretical
framework for understanding the construct that the test was
designed to measure.

A

Construct Validity

12
Q

Validity best answers which of the following questions?
A. How consistent is the test over time?
B. How accurate are the results of the test?
C. Does the test measure what it is supposed to measure?
D. What is the variability of the test?

A

C

13
Q

Can be defined as the agreement between a test score or
measure and the quality it is believed to measure. Validity is
sometimes defined as the answer to the question, “Does the test
measure what it is supposed to measure?”

A

Validity

14
Q

Aspects of Validity

A

- Face Validity
- Content Validity
- Criterion-Related Validity (Concurrent and Predictive)
- Construct Validity

15
Q

A judgment concerning how relevant the test items appear to be.

A

Face Validity

16
Q

is not a statistical or numerical
measure; rather, it taps whether a test “feels like” it is a reasonable
measure of its associated criterion

A

Face Validity

17
Q

it taps whether a test “feels like” it is a reasonable
measure of its associated criterion

A

Face Validity

18
Q

A judgment of how adequately a test samples behavior
representative of the universe of behavior that the test was designed to sample.

A

Content Validity

19
Q

A plan regarding the types of information to be covered by
the items, the number of items tapping each area of coverage, and the
organization of the items in the test.

A

Test Blueprint

20
Q

The content validity of a test varies across

A

Culture and Time

21
Q

Developers must show evidence that the domain was systematically analyzed and concepts are covered in correct proportion

A

Content Validity

22
Q

Four-step process of establishing content validity:

A

• Step 1 - Survey the domain: defining the performance domain of interest.
• Step 2 - Content of the test matches the domain: selecting a panel of
qualified experts in the content domain.
• Step 3 - Specific test items match the content: providing a structured
framework (instructions) for matching each item (question) to the
performance domain (answer).
• Step 4 - Analyze the relative importance of each objective (weight):
collecting and summarizing the data from the matching process.
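As a minimal sketch of how the matching data from steps 3 and 4 might be summarized (all item names, objectives, and expert judgments below are invented for illustration):

```python
# Hypothetical sketch: each panel expert records which performance-domain
# objective they think each test item measures; we then report, per item,
# the fraction of experts who matched it to the objective it was written for.

intended = {"item1": "anatomy", "item2": "physiology", "item3": "anatomy"}

# One dict per expert: item -> objective the expert matched it to
expert_matches = [
    {"item1": "anatomy", "item2": "physiology", "item3": "anatomy"},
    {"item1": "anatomy", "item2": "anatomy",    "item3": "anatomy"},
    {"item1": "anatomy", "item2": "physiology", "item3": "pathology"},
]

def agreement_rates(intended, expert_matches):
    # Fraction of experts whose match agrees with the intended objective
    rates = {}
    for item, objective in intended.items():
        hits = sum(1 for m in expert_matches if m[item] == objective)
        rates[item] = hits / len(expert_matches)
    return rates

rates = agreement_rates(intended, expert_matches)
```

Items with low agreement rates would be candidates for revision before the test is claimed to cover its domain in the correct proportions.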

23
Q

Not a real type of content validity:
– A quick look at the “face” value of questions
– Sometimes questions may not seem to measure the content, but do

A

Face Validity

24
Q

evidence tells us just how well a test corresponds
with a particular criterion

A

Criterion Validity

25
Q

A judgment of how adequately a test score
can be used to infer an individual’s most probable standing on some
measure of interest (i.e., the criterion).

A

Criterion-related validity

26
Q

An index of the degree to which a test score is related to some criterion measure obtained at the same time

A

Concurrent Validity

27
Q

Does the instrument relate to another criterion now?

A

Concurrent Validity

28
Q

An index of the degree to which a test score predicts some criterion measure.

A

Predictive Validity

29
Q

Does the instrument relate to another criterion in the future?

A

Predictive Validity

30
Q

is the standard against which a test or a test score is evaluated.

A

Criterion

31
Q

Characteristics of a criterion

A

• Relevant to the matter at hand.
• Valid for the purpose for which it is being used.
• Uncontaminated, meaning it is not based on predictor measures.

32
Q

evidence comes from assessing the
simultaneous relationship between a test and criterion

A

Concurrent Validity

33
Q

correlation coefficient that provides a measure of
the relationship between test scores and scores on the criterion measure

A

Validity Coefficient
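Since a validity coefficient is simply the correlation between test scores and scores on the criterion measure, it can be sketched directly; all scores below are invented for illustration:

```python
from math import sqrt

def pearson_r(xs, ys):
    # Pearson correlation: covariance over the product of standard deviations
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: aptitude-test scores and later supervisor ratings
test_scores = [50, 60, 70, 80, 90]
criterion   = [2.0, 2.5, 3.5, 3.0, 4.5]

validity_coefficient = pearson_r(test_scores, criterion)
```

A coefficient near 1 would indicate that standing on the criterion closely tracks standing on the test.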

34
Q

Expresses the relationship between variation in the criterion and our knowledge of a test score.

A

Validity coefficient

35
Q

The degree to which an additional predictor explains something about the criterion measure that is not explained by predictors already in use.

A

Incremental validity
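A sketch of the idea with invented data: fit the criterion from one predictor, then from two, and compare the variance explained; the gain in R-squared is the second predictor's incremental validity.

```python
def centered(v):
    # Subtract the mean so regressions need no intercept term
    m = sum(v) / len(v)
    return [x - m for x in v]

def r_squared(y, yhat):
    # Proportion of criterion variance explained by the predictions
    my = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - my) ** 2 for a in y)
    return 1 - ss_res / ss_tot

# Invented data: x1 = cognitive test, x2 = structured interview,
# y = job-performance rating (constructed so both predictors matter)
x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]
y  = [2.0, 2.5, 5.0, 5.5, 8.0, 8.5]

dx1, dx2, dy = centered(x1), centered(x2), centered(y)

# One predictor: simple regression of y on x1
b = sum(a * c for a, c in zip(dx1, dy)) / sum(a * a for a in dx1)
r2_one = r_squared(dy, [b * a for a in dx1])

# Two predictors: solve the 2x2 normal equations by Cramer's rule
s11 = sum(a * a for a in dx1)
s22 = sum(a * a for a in dx2)
s12 = sum(a * c for a, c in zip(dx1, dx2))
s1y = sum(a * c for a, c in zip(dx1, dy))
s2y = sum(a * c for a, c in zip(dx2, dy))
det = s11 * s22 - s12 * s12
b1 = (s1y * s22 - s12 * s2y) / det
b2 = (s11 * s2y - s12 * s1y) / det
r2_two = r_squared(dy, [b1 * a + b2 * c for a, c in zip(dx1, dx2)])

incremental = r2_two - r2_one  # criterion variance explained only by adding x2
```

If `incremental` is near zero, the second test adds little beyond the predictors already in use, however valid it is on its own.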

36
Q

Judgment about the appropriateness of inferences drawn from test
scores regarding individual standings on a construct.

A

Construct Validity

37
Q

Scores on the test undergoing construct validation tend to correlate highly in the predicted direction with scores on older, more established tests designed to measure the same (or a similar)
construct.

A

Convergent Validity

38
Q

Validity coefficient showing little relationship between test scores and/or other variables with which scores on the test should not theoretically be correlated.

A

Discriminant Validity
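The convergent and discriminant patterns can be sketched with hypothetical correlations: a new scale should correlate highly with an established measure of the same construct and weakly with a theoretically unrelated one (all scores below are invented):

```python
from math import sqrt

def pearson_r(xs, ys):
    # Pearson correlation: covariance over the product of standard deviations
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sqrt(sum((x - mx) ** 2 for x in xs))
                  * sqrt(sum((y - my) ** 2 for y in ys)))

# Invented scores for six examinees
new_scale   = [10, 14, 8, 20, 16, 12]   # new depression scale under validation
established = [11, 15, 9, 19, 17, 12]   # established depression inventory
vocabulary  = [28, 31, 30, 29, 27, 32]  # theoretically unrelated measure

r_convergent   = pearson_r(new_scale, established)  # expected: high
r_discriminant = pearson_r(new_scale, vocabulary)   # expected: near zero
```

Together, a high convergent coefficient and a low discriminant coefficient are evidence that the new scale measures its intended construct and not something else.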

39
Q

Class of mathematical procedures designed to identify specific variables on which people may differ.

A

Factor Analysis

40
Q

When two things converge, they are “coming together” in a
meaningful way

A

Convergent Validity

41
Q

Defining the validity of a test is futile if it is not also reliable

T or F

A

T