Week 2 (Validity) Flashcards

1
Q

Why is validity the most crucial property of a test?

A

Validity is the meaning of the test and test scores; it goes back to the purpose of the test.

2
Q

Definition of validity

A

“The degree to which evidence and theory support the interpretation of test scores for proposed uses of tests”

The degree to which a test measures what it intends to measure

Validity stems from accumulated evidence (a validation process that continues over time)

3
Q

Validity nuances (extra descriptors)

A

Validity is not a property of a test itself but of test use and interpretation.

Validity often measured indirectly through inferences

Validity is based on the developer's understanding of what they are measuring (the construct), not on the construct itself, so the developer's opinion has a strong influence on the process of test validation

4
Q

Who is responsible for validity

A

Test developer

5
Q

Steps in the validation process (3)

A

Identify and describe the construct that forms the basis of the test (underlying theory? Underlying constructs?)

Create test questions to measure that construct

Show that the process has been successful by carrying out a validation process and documenting the results

6
Q

What is a construct

A

What we set out to measure. Anything created by the human mind that is not directly observable (aka latent variable)

Constructs differ by breadth/complexity, potential applicability, and degree of abstraction

7
Q

How do we identify and describe a construct?

A

Theory, previous research, and behavioral observation

8
Q

Test construction and item analysis

A

Qualitative item analysis (tests the test's construction):

Are items appropriate to the purpose of the test and the population for whom the test is designed?

Are items clearly expressed?

Are items grammatically correct? Adhering to writing rules?

Are items relatively free of bias or offensive portrayals of subgroups?

Quantitative item analysis (tests whether the items are measuring the construct; see the sketch below):

Item response theory

Item discrimination (item difficulty and fairness)
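
A minimal sketch of how these quantitative indices might be computed, assuming a small 0/1-scored response matrix; the data and variable names are hypothetical, not from the course material:

```python
import numpy as np

# Hypothetical scored responses: rows = examinees, columns = items (1 = correct)
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
])

# Item difficulty (p-value): proportion of examinees answering each item correctly
difficulty = responses.mean(axis=0)

# A simple item discrimination index: correlation between each item and the total score
# (items that strong overall performers tend to get right discriminate well)
total = responses.sum(axis=1)
discrimination = np.array([
    np.corrcoef(responses[:, j], total)[0, 1] for j in range(responses.shape[1])
])

print("difficulty:", difficulty)
print("discrimination:", discrimination)
```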

9
Q

Difficulties in measuring validity

A

What you are trying to measure is often abstract and based on theory, may have various definitions (each requiring a different approach), and may be accessible only by indirect means

10
Q

Ways to establish validity (3)

A

Content-related, criterion-related (concurrent and predictive), and construct-related

11
Q

Content-related validity

A

“The representativeness and relevance of the instrument to the construct being measured”

Items are based on conceptualization of what the test developers set out to measure

Measurement is often non-statistical and relies on the judgement of subject matter experts

How well does the content of the instrument itself relate to the construct being measured?

Face validity!!!

12
Q

Criterion-related validity

A

“Comparing test scores with some sort of performance on an outside measure”

Outside measures need to have a theoretical relation to the variable being assessed (an IQ test and GPA, for example)

13
Q

Criterion-related validity: concurrent

A

Measures taken at the same time as the test

14
Q

Criterion-related validity: predictive

A

Outside measures that were taken some time after the test scores were derived

15
Q

Interpreting correlations in criterion-related validity

A

The relationship between the test in development and the criterion measure is usually described by a correlation coefficient r, which ranges from -1 to +1
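
A minimal sketch of how such a validity coefficient could be computed, assuming hypothetical test and criterion scores for the same people (names and numbers are illustrative only):

```python
import numpy as np

# Hypothetical data: scores on the test under development and on an outside
# criterion measure (e.g., a later GPA) for the same examinees
test_scores = np.array([55, 62, 70, 48, 81, 66, 74])
criterion = np.array([2.4, 2.9, 3.1, 2.2, 3.8, 3.0, 3.3])

# Pearson correlation coefficient r, ranging from -1 to +1
r = np.corrcoef(test_scores, criterion)[0, 1]
print(f"criterion-related validity coefficient r = {r:.2f}")
```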

16
Q

-1 means

A

Perfect negative association

17
Q

+1 means

A

Perfect positive association

18
Q

0 means

A

No association

19
Q

Construct-related validity

A

“Basic approach is to build a strong case that the test measures a theoretical construct or trait”

It is about describing the trait (not about predictability)

Example: factor analysis (identifies the primary and secondary factors measured by a test; used to simplify one or more tests by reducing the number of categories to a few common factors/traits)

Example: internal consistency (subtests within the larger test should correlate with the larger related construct)
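
A minimal sketch of one common internal-consistency statistic, Cronbach's alpha, computed from hypothetical subtest scores; the data and names are illustrative, not the course's method:

```python
import numpy as np

# Hypothetical scores: rows = examinees, columns = subtests of a larger test
scores = np.array([
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 3, 3],
    [4, 4, 5],
], dtype=float)

k = scores.shape[1]                          # number of subtests
item_vars = scores.var(axis=0, ddof=1)       # variance of each subtest
total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scores

# Cronbach's alpha: higher values suggest the subtests hang together,
# i.e. reflect a common underlying construct
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```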