Lecture 10B: Assessment of Measurement Reliability Flashcards

1
Q

What is measurement reliability?

A

A reliable instrument performs with predictable consistency under set conditions.

2
Q

What are the four types of measurement reliability?

A
  • Test-retest
  • Intra-rater
  • Inter-rater
  • Internal consistency
3
Q

Define test-retest reliability.

A

Examining the same instrument in the same group of individuals at two (or more) different times.

4
Q

What does intra-rater reliability concern?

A

The stability of measurements by one individual across one or more trials.

5
Q

What is inter-rater reliability?

A

The consistency of findings between raters.

6
Q

What indicates high reliability?

A
  • High degree of association
  • Agreement of scores
7
Q

What is the intraclass correlation coefficient (ICC)?

A

A statistic used to assess reliability, considering correlation and agreement of scores.
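The lecture doesn't show a computation, but as a sketch, ICC(2,1) (Model 2, Form 1: two-way random effects, single measurement) can be computed from the two-way ANOVA mean squares. The ratings matrix below is hypothetical; numpy is assumed available.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, single measurement.

    scores: (n_subjects, k_raters) array of ratings.
    """
    n, k = scores.shape
    grand = scores.mean()
    subj_means = scores.mean(axis=1)
    rater_means = scores.mean(axis=0)
    # Mean squares from the two-way ANOVA decomposition
    ms_r = k * np.sum((subj_means - grand) ** 2) / (n - 1)   # between subjects
    ms_c = n * np.sum((rater_means - grand) ** 2) / (k - 1)  # between raters
    sse = np.sum((scores - subj_means[:, None]
                  - rater_means[None, :] + grand) ** 2)
    ms_e = sse / ((n - 1) * (k - 1))                         # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical data: 6 subjects each rated by the same 4 raters
ratings = np.array([
    [9, 2, 5, 8],
    [6, 1, 3, 2],
    [8, 4, 6, 8],
    [7, 1, 2, 6],
    [10, 5, 6, 9],
    [6, 2, 4, 7],
], dtype=float)
print(round(icc_2_1(ratings), 3))  # 0.29 — well below the 0.75 benchmark
```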

8
Q

What is the magnitude range of ICC?

A

0.00 to 1.00; values closer to 1.00 indicate higher reliability.

9
Q

What is considered acceptable reliability for ICC?

A

≥0.75

10
Q

What does Model 1 of ICC represent?

A

One-way random effects model: each subject is assessed by a different set of randomly selected raters.

11
Q

What is Model 2 of ICC also known as?

A

Two-way random effects model: each subject is assessed by each rater, and the results can be generalized to other raters.

12
Q

What characterizes Model 3 of ICC?

A

Two-way mixed effects model: Each subject is assessed by each rater; raters are not representative of a larger population.

13
Q

What is the difference between Form 1 and Form k in ICC?

A

Form 1 uses a single measurement; Form k uses the average of several measurements.

14
Q

What is Cronbach’s alpha used to assess?

A

Internal consistency.

15
Q

What does internal consistency measure?

A

The extent to which the items of a scale measure various aspects of the same attribute (e.g., balance: static, dynamic, sitting, standing, etc.).
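As a minimal sketch of the computation (not shown in the lecture), Cronbach's alpha can be derived from the item variances and the variance of the total score. The item scores below are hypothetical; numpy is assumed available.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Hypothetical scores: 5 respondents x 4 scale items
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
], dtype=float)
print(round(cronbach_alpha(scores), 3))  # 0.949 — above the 0.9 "too high" cutoff
```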

16
Q

What does a Cronbach’s alpha value >0.8 indicate?

A

Excellent internal consistency.

17
Q

What does a Cronbach’s alpha value <0.7 indicate?

A

Poor internal consistency.

18
Q

What does a Cronbach’s alpha value >0.9 indicate?

A

Too high: all the items are measuring exactly the same thing (could a single item measure the attribute instead?).

19
Q

What is percent agreement?

A

A measure of how often raters agree on scores.

Calculated as the number of agreements divided by the number of possible agreements.
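A short sketch of that calculation, using hypothetical categorical ratings from two raters:

```python
# Percent agreement between two raters on categorical scores (hypothetical data)
rater_a = ["yes", "no", "yes", "yes", "no", "yes"]
rater_b = ["yes", "no", "no", "yes", "no", "yes"]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
po = agreements / len(rater_a)  # Po = agreements / possible agreements
print(f"Po = {po:.2f}")  # 5 of 6 ratings match
```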

20
Q

What does the Kappa statistic account for?

A

Agreement due to chance.
Kappa is the ratio of observed non-chance agreement to possible non-chance agreement.
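A hedged sketch of Cohen's kappa (the standard chance-corrected statistic for two raters): expected chance agreement Pe comes from each rater's marginal proportions, and kappa = (Po - Pe) / (1 - Pe). The ratings are hypothetical.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Kappa = (Po - Pe) / (1 - Pe): agreement corrected for chance."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # Expected chance agreement from each rater's marginal proportions
    pe = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(r1) | set(r2))
    return (po - pe) / (1 - pe)

# Hypothetical ratings from two raters on 10 cases
r1 = ["yes", "no", "yes", "yes", "no", "yes", "no", "no", "yes", "no"]
r2 = ["yes", "no", "no", "yes", "no", "yes", "no", "yes", "yes", "no"]
print(round(cohens_kappa(r1, r2), 3))  # Po = 0.8, Pe = 0.5 -> kappa = 0.6
```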

21
Q

What is the interpretation of a Kappa value >80%?

A

Excellent agreement.

22
Q

What is the purpose of reliability analyses for categorical data?

A

To assess inter-rater reliability.

23
Q

What is the formula for percent agreement (Po)?

A

Po = Number of agreements / Number of possible agreements.

24
Q

What is the effect of deleting an item with high correlation in Cronbach’s alpha?

A

It may indicate redundancy or improve internal consistency.
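The "alpha if item deleted" check can be sketched by recomputing alpha with each item removed in turn. The function and data below are a self-contained hypothetical example (numpy assumed available), not the lecture's own dataset.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

# Hypothetical scores: 5 respondents x 4 scale items
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
], dtype=float)

print(f"full scale: alpha = {cronbach_alpha(scores):.3f}")
# "Alpha if item deleted": a large jump when an item is removed
# suggests that item is redundant or off-target
for i in range(scores.shape[1]):
    reduced = np.delete(scores, i, axis=1)
    print(f"item {i + 1} deleted: alpha = {cronbach_alpha(reduced):.3f}")
```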

25
Q

What should be the action if deleting an item dramatically increases Cronbach’s alpha?

A

Consider removing or modifying that item.

26
Q

Summarize reliability.

A

A reliable instrument performs with predictable consistency under set conditions. Four types: test-retest, intra-rater, inter-rater, and internal consistency. ICC (acceptable ≥0.75) assesses reliability of continuous measures; percent agreement and kappa assess agreement for categorical data; Cronbach's alpha assesses internal consistency.

27
Q

What action is recommended if a scale has an item with very high correlation and affects Cronbach’s alpha?

A

Remove/modify that item.