Exam 2 pt 2 Flashcards

1
Q

translating concepts of interest in a study into something observable & measurable

A

Operationalizing a Concept

2
Q

a method to measure (quantify) a concept or variable(s) of interest

A

instrument

3
Q

(survey) via mail, in person, email, or phone

A

questionnaire

4
Q

systematic coding, specified time

A

observation

5
Q

researcher is present

A

interview

6
Q

existing records

A

document review

7
Q

instruments used in phenomenology

A

in-depth interviews
diaries
artwork

8
Q

instruments used in grounded theory

A

observations
open-ended questions (interviews)
individuals or small groups

9
Q

instruments used in ethnography

A
observation
open-ended questions (interviews)
diagrams
documents
photographs
10
Q

instruments used in historical

A
open-ended questions
interviews
documents
photographs
artifacts
11
Q

examines causes of certain effects

A

experimental/clinical trial

12
Q

examines why certain effects occur

A

quasi-experimental

13
Q

examines relationships among variables

A

correlational

14
Q

answers "what" questions

describes frequency of occurrence

A

exploratory/descriptive

15
Q

unsystematic error, such as a transient state in the subject, the context of the study, or the administration of the instrument (reliability)

A

random error

16
Q

error that consistently alters the measurement of true responses in some way (validity)

A

systematic error

17
Q

Measurement Error theoretical formula

A

observed score = true score + error
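
A minimal numeric sketch (made-up values, not from the cards) of the observed score = true score + error idea, and of why random error is a reliability problem while systematic error is a validity problem:

```python
import numpy as np

rng = np.random.default_rng(0)            # hypothetical simulation; values are invented
true_score = 50.0

# Random (unsystematic) error varies unpredictably and averages toward zero
random_error = rng.normal(loc=0.0, scale=5.0, size=1000)
observed = true_score + random_error      # observed score = true score + error
print(round(observed.mean(), 1))          # near 50: random error cancels out (reliability issue)

# Systematic error shifts every measurement the same way, so it never cancels out
biased = true_score + random_error + 4.0  # e.g., an instrument that always reads 4 units high
print(round(biased.mean(), 1))            # near 54: consistent bias (validity issue)
```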

18
Q

what is reliability concerned with

A

the repeatability or consistency with which an instrument measures the concept of interest

19
Q

three types of reliability

A

stability, equivalence, internal consistency

20
Q

(test-retest or intra-rater)

A

stability

21
Q

(inter-rater or alternate form)

A

equivalence

22
Q

homogeneity – split-half reliability, item-to-total correlation, Kuder-Richardson coefficient, or Cronbach’s coefficient alpha

A

internal consistency

23
Q

how is reliability reported

A

as a reliability coefficient

24
Q

how are reliability coefficients (r) expressed

A

are expressed as positive correlation coefficients ranging from 0 to +1

25
Q

how do you interpret r

A

r = 0.80 or higher is acceptable for existing instruments

r = 0.70 or higher is acceptable for a newly developed instrument
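
A small sketch (invented test-retest scores) of how a reliability coefficient is computed as a correlation and checked against the rules of thumb above:

```python
import numpy as np

# Hypothetical scores from two administrations of the same instrument
time1 = np.array([12, 15, 9, 20, 18, 14, 11, 17])
time2 = np.array([13, 14, 10, 19, 17, 15, 11, 18])

r = np.corrcoef(time1, time2)[0, 1]       # reliability coefficient, expected between 0 and +1
print(round(r, 2))
print("acceptable for an existing instrument:", r >= 0.80)
print("acceptable for a new instrument:", r >= 0.70)
```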

26
Q

is a necessary, though insufficient, condition for validity

A

high reliability

27
Q

Instruments may have good reliability even if they

A

are not valid

(that is, they don’t measure what they are supposed to measure)

28
Q

is concerned with the consistency of repeated measures under the same circumstances

A

stability

29
Q

what is stability also called

A

test-retest reliability or intra-rater reliability

30
Q

what is equivalence focused on comparing

A
two versions of the same instrument (alternate-form reliability)
two observers measuring the same event (inter-rater reliability); consistency among raters using discrete categories
31
Q

a reliability coefficient of what indicates good agreement

A

.75 or greater
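
The cards don’t name a statistic for this, but Cohen’s kappa is one common way to quantify inter-rater agreement on discrete categories; a minimal sketch with invented ratings (assumes scikit-learn is installed):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings by two observers coding the same ten events into discrete categories
rater_a = ["pain", "no pain", "pain", "pain",    "no pain", "pain", "no pain", "pain", "pain", "no pain"]
rater_b = ["pain", "no pain", "pain", "no pain", "no pain", "pain", "no pain", "pain", "pain", "no pain"]

kappa = cohen_kappa_score(rater_a, rater_b)   # chance-corrected agreement between the two raters
print(round(kappa, 2), "good agreement" if kappa >= 0.75 else "below the .75 rule of thumb")
```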

32
Q

addresses the correlation of various items within a single instrument, or homogeneity; all items are measuring the same concept

A

internal consistency

33
Q

internal consistency is also called what

A

split-half reliability, item-to-total correlation, Kuder-Richardson coefficient, or Cronbach’s coefficient alpha

34
Q

divide items on instrument in half to make two versions & use Spearman-Brown formula to compare halves

A

split-half reliability
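
A minimal sketch (made-up item scores) of split-half reliability with the Spearman-Brown correction, which estimates the reliability of the full-length instrument from the correlation between the two halves:

```python
import numpy as np

# Hypothetical item scores: rows = subjects, columns = six items on one instrument
items = np.array([
    [4, 5, 4, 5, 3, 4],
    [2, 1, 2, 2, 1, 2],
    [5, 4, 5, 4, 5, 5],
    [3, 3, 2, 3, 3, 2],
    [1, 2, 1, 1, 2, 1],
])

half_1 = items[:, ::2].sum(axis=1)            # odd-numbered items
half_2 = items[:, 1::2].sum(axis=1)           # even-numbered items
r_half = np.corrcoef(half_1, half_2)[0, 1]    # correlation between the two halves

r_full = (2 * r_half) / (1 + r_half)          # Spearman-Brown correction to full length
print(round(r_half, 2), round(r_full, 2))
```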

35
Q

each item on instrument is correlated with the total score; strong items have high correlations with the total score

A

item-to-total correlation
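
A minimal sketch (again with invented data) of item-to-total correlation: each item is correlated with the total score, and strong items show high correlations:

```python
import numpy as np

# Hypothetical item scores: rows = subjects, columns = items
items = np.array([
    [4, 5, 4, 5],
    [2, 1, 2, 2],
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [1, 2, 1, 1],
])
total = items.sum(axis=1)                     # total score for each subject

for i in range(items.shape[1]):
    r = np.corrcoef(items[:, i], total)[0, 1]
    print(f"item {i + 1}: r = {r:.2f}")
```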

36
Q

divide the instrument with dichotomous (yes/no) or ordinal responses in half every possible way

A

Kuder-Richardson or KR-20
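
A minimal sketch of the KR-20 coefficient for dichotomous (1 = yes, 0 = no) items, using the standard formula with invented responses:

```python
import numpy as np

# Hypothetical dichotomous responses: rows = subjects, columns = items
items = np.array([
    [1, 1, 1, 0, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [1, 0, 1, 0, 1],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 0],
])

k = items.shape[1]                            # number of items
p = items.mean(axis=0)                        # proportion of "yes" responses per item
q = 1 - p
total_var = items.sum(axis=1).var(ddof=1)     # variance of the total scores

kr20 = (k / (k - 1)) * (1 - (p * q).sum() / total_var)
print(round(kr20, 2))
```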

37
Q

divide the instrument with interval or ratio responses in half every possible way

A

Cronbach’s alpha (coefficient alpha)
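
A minimal sketch of Cronbach’s coefficient alpha for interval-level items, using the standard variance-based formula with invented scores:

```python
import numpy as np

# Hypothetical interval-level item scores: rows = subjects, columns = items
items = np.array([
    [4, 5, 4, 5],
    [2, 1, 2, 2],
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [1, 2, 1, 1],
    [4, 4, 5, 4],
])

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)         # variance of each item
total_var = items.sum(axis=1).var(ddof=1)     # variance of the total scores

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(round(alpha, 2))
```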

38
Q

the extent to which an instrument accurately measures what it is supposed to measure

A

validity

39
Q

types of validity

A

Content or face validity
Construct validity
Criterion-related validity

40
Q

how well the instrument compares with an established instrument measured at the same time (concurrent validity) or is able to predict future events, behaviors, or outcomes (predictive validity)

A

criterion related validity

41
Q

how representative is the instrument of the concept(s)

A

content validity

-determined by a panel of experts

42
Q

extent to which the instrument performs theoretically; how well does the instrument measure a concept

A

construct validity

43
Q

ways to determine construct validity

A

Hypothesis testing
Convergent, divergent, or multitrait-multimethod testing
Known group(s) testing
Factor analysis

44
Q

use two or more instruments to measure the same concept (example: two pain scales)

A

convergent testing

45
Q

compare scores from two or more instruments that measure opposite concepts (example: depression vs. happiness)

A

divergent testing
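
A minimal sketch (invented scores) of convergent and divergent testing via correlations, using the cards’ own examples of two pain scales and depression vs. happiness:

```python
import numpy as np

# Hypothetical subject scores on four instruments
pain_scale_a = np.array([2, 7, 5, 9, 3, 6, 8, 4])
pain_scale_b = np.array([3, 8, 5, 9, 2, 6, 7, 4])
depression   = np.array([8, 3, 5, 2, 7, 4, 3, 6])
happiness    = np.array([2, 8, 5, 9, 3, 6, 8, 4])

# Convergent: two instruments measuring the same concept should correlate strongly and positively
print(round(np.corrcoef(pain_scale_a, pain_scale_b)[0, 1], 2))

# Divergent: instruments measuring opposite concepts should correlate negatively
print(round(np.corrcoef(depression, happiness)[0, 1], 2))
```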

46
Q

Administer instrument to subjects known to be high or low on the characteristic being measured

A

known groups (construct validity)
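
A minimal sketch (invented scores, assumes SciPy is available) of known-groups testing: if the instrument has construct validity, a group known to be high on the characteristic should score significantly higher than a group known to be low:

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical anxiety-scale scores from two groups expected to differ
preop_patients = np.array([28, 31, 25, 34, 29, 30, 27])        # known-high group
healthy_volunteers = np.array([12, 15, 10, 14, 11, 16, 13])    # known-low group

result = ttest_ind(preop_patients, healthy_volunteers)         # independent-samples t-test
print(round(result.statistic, 2), round(result.pvalue, 4))
```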

47
Q

Use complex statistical analysis to identify multiple dimensions of a concept

A

factor analysis (construct validity)
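
A minimal sketch (simulated data, assumes scikit-learn is available) of factor analysis recovering two underlying dimensions from a six-item instrument:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulate responses to a six-item instrument thought to tap two dimensions of a concept
dim_1 = rng.normal(size=(100, 1))
dim_2 = rng.normal(size=(100, 1))
items = np.hstack([
    dim_1 + rng.normal(scale=0.3, size=(100, 3)),   # items 1-3 load on dimension 1
    dim_2 + rng.normal(scale=0.3, size=(100, 3)),   # items 4-6 load on dimension 2
])

fa = FactorAnalysis(n_components=2).fit(items)
print(np.round(fa.components_, 2))   # loadings: each row is a factor, each column an item
```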

48
Q

what are the data collection methods

A

interviews
observation
text
sampling

49
Q

what is crucial for qualitative data collection

A

trustworthiness