Reliability and Validity (Chap 4) Flashcards

1
Q

Reliability (def)

A
  • refers to the consistency, stability, or equivalence of a measurement (book)
  • stability and consistency of measurement (lecture)
  • assessing reliability requires comparing measurements
  • a reliability coefficient of around .70 is professionally acceptable
2
Q

Major types of reliability (four)

A
  1. test-retest
  2. equivalent-form
  3. internal-consistency
  4. inter-rater
3
Q

Test-retest reliability

A
  • reveals the stability of test scores upon repeated applications of the test
  • measure something at two different times and compare the scores
  • coefficient of stability - reflects the stability of the test over time
  • **BEST test of reliability
  • strengths:
    • most direct method
  • weaknesses:
    • effort and time
    • learning / change (ex: interview)
4
Q

Equivalent-form reliability

A
  • reveals the equivalence of test scores between two versions or forms of the test
  • two forms of a test measure the same attribute; both forms are given to the same group
  • the two scores for each person are correlated, giving the coefficient of equivalence
  • **2nd best test of reliability
  • strengths:
    • one time / no learning
  • weaknesses:
    • are the tests actually equal?
    • one step away from the definition of reliability
5
Q

Internal-consistency Reliability

A
  • reveals the homogeneity of the items comprising a test
  • two types of internal-consistency reliability:
    1. split-half reliability
    2. Cronbach’s Alpha
  • if the test is homogeneous (the item content is similar), it will have a high internal consistency reliability coefficient
6
Q

Split-half reliability (from Internal-consistency Reliability)

A
  • split the test in half and correlate the two halves
  • e.g., correlate even-numbered items with odd-numbered items

7
Q

Cronbach’s Alpha (from Internal-consistency Reliability)

A
  • correlate each item with every other item
  • all items must address the same concept
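Cronbach's alpha can be computed from the item variances and the variance of the total scores. A minimal sketch with a hypothetical 4-person, 4-item matrix:

```python
# Cronbach's alpha:
#   alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
from statistics import pvariance

def cronbach_alpha(items):
    """items: rows = people, columns = item scores."""
    k = len(items[0])                        # number of items
    cols = list(zip(*items))                 # per-item score lists
    item_vars = sum(pvariance(c) for c in cols)
    total_var = pvariance([sum(row) for row in items])
    return k / (k - 1) * (1 - item_vars / total_var)

# hypothetical 1-5 ratings; similar item content -> high alpha
data = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
]
alpha = cronbach_alpha(data)
```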

8
Q

Inter-rater reliability

A
  • reveals the degree of agreement among the assessments provided by two or more raters
  • degree of correspondence between judgments or scores assigned by different raters
  • multiple observers
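A minimal illustration of inter-rater agreement using simple percentage agreement between two hypothetical raters (correlation or Cohen's kappa are common alternatives):

```python
# Percentage agreement: proportion of cases where two raters give
# the same judgment. Ratings below are made-up example data.
rater1 = ["pass", "fail", "pass", "pass", "fail"]
rater2 = ["pass", "fail", "fail", "pass", "fail"]

agreement = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)
```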
9
Q

Predictor vs Criterion

A
  • a predictor is used to forecast a criterion
10
Q

Validity (def)

A
  • accuracy of measurement
  • a standard for evaluating tests that refers to the accuracy or appropriateness of drawing inferences from test scores
  • the test’s appropriateness for predicting or drawing inferences about criteria
  • validity is not inherent in a measure or predictor
11
Q

3 main ways to assess validity

A
  1. content validity
  2. criterion-related validity
  3. construct validity
12
Q

Construct Validity

A
  • the degree to which a test is an accurate and faithful measure of the construct it purports to measure
  • construct is an unobservable variable
  • there should be a high correlation between the scores from our new test of intelligence and the existing measures of intelligence
13
Q

Criterion-related Validity

A
  • refers to how much a predictor relates to a criterion
  • degree to which a test forecasts or is statistically related to a criterion
  • statistical
14
Q

Concurrent criterion-related validity

A
  • concerned with how well a predictor can predict a criterion at the same time, or concurrently
  • range restriction is a concern (data come from current employees, a preselected group)
  • there is no time interval between collecting the predictor and criterion data
15
Q

Predictive criterion-related validity

A
  • collect predictor info and use it to forecast future criterion performance
  • ex: assess applicants, hire everyone, assess performance later, correlate test scores with performance
  • more expensive and time consuming
16
Q

Validity coefficient

A
  • statistical index that reveals the degree of association between two variables
  • correlation coefficient between the predictor and the criterion
  • .3-.4 desired
17
Q

squaring the validity coefficient

A
  • proportion of variance in criterion accounted for by predictor
  • e.g., .4 × .4 = .16 = 16%
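Worked out with hypothetical numbers:

```python
# Squaring a validity coefficient gives the proportion of criterion
# variance accounted for by the predictor. The .4 here is illustrative.
validity = 0.4
variance_explained = validity ** 2     # .4 * .4 = .16
print(f"{variance_explained:.0%}")     # proportion as a percentage
```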
18
Q

Content validity

A
  • degree to which subject matter experts agree that the items in a test are a representative sample of the domain of knowledge the test purports to measure
  • limited mainly to psychological tests but may extend to interviews
  • subjective
19
Q

Face validity

A
  • the appearance that items in a test are appropriate for the intended use of the test by the individuals who take the test
20
Q

Approaches for showing construct validity

A
  1. specifically and accurately define the construct
  2. logical analysis
  3. content validity analysis
  4. convergent and divergent validity analysis
  5. accumulated evidence