Chapter 11: Validity and Reliability of a Research Instrument Flashcards

1
Q

Instrument

A

term used for a measurement device

2
Q

Researcher-Completed Instruments

A
  • rating scales
  • interview schedules
  • tally sheets
  • flowcharts
  • performance checklists
  • observation forms
3
Q

Subject-Completed Instruments

A
  • questionnaires
  • self-checklists
  • attitude scales
  • personality inventories
4
Q

Usability

A

ease with which an instrument can be administered, interpreted, and scored

5
Q

Usability Considerations

A
  • how long will it take to administer?
  • are the directions clear?
  • how easy is it to score?
  • do equivalent forms exist?
  • have any problems been reported by others who used it?
6
Q

Validity and Reliability in Research

A
  • whether a research instrument yields truthful and consistent results
7
Q

Validity

A
  • extent to which an instrument measures what it is supposed to measure and performs as it is designed to perform
  • almost impossible for an instrument to be 100% valid
8
Q

Internal Validity

A

the extent to which you are able to say that no other variables except the ones you are studying caused the result

9
Q

External Validity

A

the extent to which results of a study can be generalized to the world at large

10
Q

Types of Validity

A
  1. face validity
  2. content validity
  3. construct validity
  4. criterion-related validity
11
Q

Face Validity

A
  • least scientific method of validity, as it is not quantified using statistical methods
  • a direct measurement is obtained by asking people to rate the validity of a test as it appears to them
  • only useful if there is reasonable agreement among the raters
12
Q

Content Validity

A
  • subjective measure
  • asks whether the content of a measure covers the full domain of a concept
13
Q

Construct Validity

A
  • collection of behaviors that are associated in order to create an image/idea invented for a research purpose
  • extent to which a test captures a specific theoretical construct or trait, and it overlaps with some of the other aspects of validity
14
Q

Criterion-Related Validity: Concurrent Validity

A
  • refers to the extent to which the results of a particular test or measurement correspond to those of a previously established measurement for the same construct
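A minimal sketch of how a concurrent validity check might be run, assuming Python with SciPy available; the scores below are hypothetical:

```python
from scipy.stats import pearsonr  # assumes SciPy is installed

# Hypothetical scores from the same participants: a new questionnaire vs. an
# already-established measure of the same construct, taken at about the same time
new_measure = [12, 18, 25, 9, 30, 22, 15, 27]
established = [14, 20, 24, 10, 31, 20, 13, 29]

# The correlation between the two sets of scores serves as the validity coefficient
r, p_value = pearsonr(new_measure, established)
print(f"Concurrent validity coefficient r = {r:.2f} (p = {p_value:.3f})")
```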
15
Q

Criterion-Related Validity: Predictive Validity

A

uses the scores from the new measure to predict performance on a criterion measure administered at a later time
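A sketch of a predictive validity check, assuming Python with NumPy; the admission test scores and later GPA values are hypothetical:

```python
import numpy as np

# Hypothetical data: admission test scores (new measure) and first-year GPA,
# the criterion collected later for the same students
test_scores = np.array([52, 61, 70, 48, 85, 77, 66, 90])
later_gpa   = np.array([2.4, 2.8, 3.1, 2.2, 3.7, 3.3, 3.0, 3.9])

# Fit a simple linear prediction rule: gpa ≈ slope * score + intercept
slope, intercept = np.polyfit(test_scores, later_gpa, 1)

# The test-criterion correlation serves as the predictive validity coefficient
r = np.corrcoef(test_scores, later_gpa)[0, 1]
print(f"Predicted GPA for a test score of 75: {slope * 75 + intercept:.2f}")
print(f"Predictive validity coefficient r = {r:.2f}")
```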

16
Q

Reliability

A

degree to which an assessment tool produces stable results

17
Q

Test-Retest Reliability

A
  • obtained by administering the same test twice over a period of time to a group of individuals
  • scores from time 1 and time 2 can then be correlated in order to evaluate the test for stability over time
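A minimal sketch of the correlation step, assuming Python with NumPy; the scores are hypothetical, and the same computation applies to parallel forms reliability (next card) with form A and form B scores in place of time 1 and time 2:

```python
import numpy as np

# Hypothetical scores for the same group of individuals at time 1 and time 2
time_1 = np.array([23, 31, 28, 35, 19, 26, 30, 22])
time_2 = np.array([25, 30, 27, 36, 18, 28, 29, 24])

# Pearson correlation between the two administrations is the stability coefficient
reliability = np.corrcoef(time_1, time_2)[0, 1]
print(f"Test-retest reliability coefficient: {reliability:.2f}")
```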
18
Q

Parallel Forms Reliability

A
  • obtained by administering different versions of an assessment tool to the same group of individuals
  • scores from the two versions can then be correlated in order to evaluate the consistency of the results across alternate versions
19
Q

Inter-Rater Reliability

A
  • measure of reliability used to assess the degree to which different judges or raters agree in their assessment decisions
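A sketch of one common agreement index, Cohen's kappa (not named on the card, but widely used for this purpose), assuming Python with NumPy; the ratings are hypothetical:

```python
import numpy as np

# Hypothetical category ratings (0-2) from two judges on the same ten subjects
rater_a = np.array([0, 1, 2, 1, 0, 2, 1, 1, 0, 2])
rater_b = np.array([0, 1, 2, 0, 0, 2, 1, 2, 0, 2])

categories = np.unique(np.concatenate([rater_a, rater_b]))

# Observed agreement: proportion of subjects both raters classified identically
p_o = np.mean(rater_a == rater_b)

# Chance agreement expected from each rater's marginal category proportions
p_e = sum(np.mean(rater_a == c) * np.mean(rater_b == c) for c in categories)

kappa = (p_o - p_e) / (1 - p_e)  # Cohen's kappa: agreement corrected for chance
print(f"Observed agreement: {p_o:.2f}, Cohen's kappa: {kappa:.2f}")
```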
20
Q

Internal Consistency Reliability

A

measure of reliability used to evaluate the degree to which different test items that probe the same construct produce similar results
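A sketch of one common internal consistency index, Cronbach's alpha, assuming Python with NumPy; the item scores are hypothetical:

```python
import numpy as np

# Hypothetical item scores: rows are respondents, columns are items that all
# probe the same construct (e.g., four Likert items on one attitude scale)
items = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])

k = items.shape[1]                              # number of items
item_variances = items.var(axis=0, ddof=1)      # sample variance of each item
total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score

# Cronbach's alpha; values above roughly 0.7 suggest the items hang together
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha: {alpha:.2f}")
```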

21
Q

Factors Affecting the Reliability of a Research Instrument

A
  • wording of the questions
  • physical setting
  • respondent’s mood
  • interviewer’s mood
  • nature of interaction
  • regression effect of an instrument
22
Q

Validity and Reliability in Qualitative Research: Traditional Criteria

A
  • internal validity
  • external validity
  • reliability
  • objectivity
23
Q

Validity and Reliability in Qualitative Research: Alternative Criteria

A
  • credibility
  • transferability
  • dependability
  • confirmability
24
Q

Qualitative Research: Credibility

A
  • involves establishing that the results of qualitative research are credible or believable from the perspective of the participants in the research
  • from this perspective the purpose of qualitative research is to describe or understand the phenomena
25
Q

Qualitative Research: Transferability

A
  • degree to which the results of qualitative research can be generalized or transferred to other contexts or settings
  • from a qualitative perspective transferability is primarily the responsibility of the one doing the generalizing
26
Q

Qualitative Research: Dependability

A
  • need for the researcher to account for the ever-changing context within which research occurs
  • researcher is responsible for describing the changes that occur in the setting and how these changes affected the way the researcher approached the study
27
Q

Qualitative Research: Confirmability

A

degree to which the results could be confirmed or corroborated by others