week 9 Flashcards

1
Q

What are the two types of surveys?

A
  1. Ad hoc
  2. Omnibus

2
Q

What does Ad hoc mean?

A

single purpose

3
Q

What does Omnibus mean?

A

Multipurpose (the topics must be compatible)

4
Q

What are two ways in which surveys can be self-completed?

A
  1. Postal
  2. Internet

5
Q

What are three ways that surveys can be administered?

A
  1. Interview
  2. Telephone
  3. Face to face
6
Q

What are the three biases involved in self-report surveys?

A
  1. memory
  2. knowledge
  3. social desirability
7
Q

What are the four limitations regarding surveys?

A
  1. bias of self report
  2. poor quality information
  3. sample bias (low response rate)
  4. interviewer bias
8
Q

What are the 5 strengths of surveys?

A
  1. low cost
  2. confidential
  3. standardised
  4. generate large amounts of data
  5. amenable to multivariate analysis
9
Q

What is the first step in questionnaire development and design?

A

Conducting the preliminary work. This includes consulting with experts in the field and conducting qualitative studies

10
Q

What is the second step in questionnaire development and design?

A

Determining:

  1. hypothesis/research questions
  2. appropriate theoretical framework
  3. variables to be studied
11
Q

What is the third step in questionnaire development and design?

A

Compiling research questions

12
Q

What are the 6 aspects to think about when compiling research questions?

A
  1. comprehensive - define terms
  2. short
  3. unambiguous
  4. not ‘leading’
  5. not negative
  6. appropriate - use filter questions
13
Q

What is the fourth step in questionnaire development and design?

A

Pilot a draft questionnaire

14
Q

What is the fifth step in questionnaire development and design?

A

Revise the questionnaire

15
Q

What are the 6 types of responses involved in a questionnaire?

A
  1. yes/no
  2. multiple choice
  3. numeric open end
  4. text open end
  5. likert scales
  6. visual analogue scales
16
Q

Give an example of a numeric open-end question

A

How much did you spend on groceries this week? (number answer)

17
Q

Give an example of a text open end question on a questionnaire

A

How can our company improve its working conditions?

18
Q

What are two types of visual analogue scales?

A
  1. numeric rating scales
  2. visual analogue scales (no numbers, just a mark on a line)

19
Q

In survey sampling, what does population refer to?

A

All possible cases

20
Q

What does sample mean in regard to survey sampling, and what are three ways to describe it?

A

A subset of a population

  1. derived from the population
  2. a microcosm of the population
  3. important characteristics distributed similarly to the population
21
Q

What are two ways in which sampling reduces random error?

A
  1. sample size - large enough so there is power to draw conclusions or uphold the null hypothesis
  2. sample representativeness - conclusions can be generalised to the population
22
Q

What does random mean in probability sampling?

A

all members of a population have equal chance of being selected

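As a sketch of this equal-chance selection, Python's `random.sample` draws without replacement, so every member of the frame has the same probability of being picked (the population and sample size here are hypothetical):

```python
import random

# Hypothetical sampling frame: every member has an equal chance of selection
population = list(range(1, 1001))

# Draw a simple random sample of 50 without replacement
sample = random.sample(population, 50)
print(len(sample), len(set(sample)))  # 50 50 (no member selected twice)
```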
23
Q

What is the systematic way in probability sampling?

A

Where every “nth” person is selected

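The "every nth person" rule can be sketched as follows (the function name and frame are illustrative, not from the lecture); a random start keeps the selection unpredictable:

```python
import random

def systematic_sample(frame, n):
    """Select every kth member after a random start, where k = len(frame) // n."""
    k = len(frame) // n
    start = random.randrange(k)
    return frame[start::k][:n]

# Hypothetical frame of 1000 people, selecting every 10th
sample = systematic_sample(list(range(1, 1001)), 100)
print(len(sample))  # 100
```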
24
Q

What is cluster when it comes to probability sampling?

A

A naturally occurring unit including a range of characteristics (schools, hospitals, retail outlets, etc)

25
Q

What is stratified random sampling in relation to probability sampling?

A

Divide the population into subgroups, from each of which a random sample is drawn

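A minimal sketch of stratified random sampling, assuming a hypothetical population tagged with an age-group stratum (the seed and group sizes are arbitrary):

```python
import random
from collections import defaultdict

def stratified_sample(population, stratum_of, per_stratum):
    """Divide the population into subgroups, then draw a random sample from each."""
    strata = defaultdict(list)
    for person in population:
        strata[stratum_of(person)].append(person)
    sample = []
    for members in strata.values():
        sample.extend(random.sample(members, per_stratum))
    return sample

# Hypothetical population of 300, each person tagged with an age group
random.seed(1)
population = [(f"p{i}", random.choice(["18-34", "35-54", "55+"])) for i in range(300)]
sample = stratified_sample(population, stratum_of=lambda p: p[1], per_stratum=10)
print(len(sample))  # 30: 10 from each of the 3 age groups
```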
26
Q

Finish the sentence: a random sample may not include

A

the full range of relevant characteristics

27
Q

What are two types of non-probability sampling?

A
  1. Convenience - enlist people as they appear
  2. Snowball - respondents nominate others to take part

28
Q

What is one type of representative sampling (e.g. in terms of demographic characteristics)?

A

Quota sampling - enlist a given number within each category, e.g. make the proportion in each age group the same as or similar to that of the general population.

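A quota-sampling sketch under assumed quotas (all names and numbers are hypothetical): people are enlisted as they appear until each category's quota is filled:

```python
def quota_sample(stream, quota):
    """Enlist people as they appear until each category's quota is filled."""
    counts = {cat: 0 for cat in quota}
    sample = []
    for person, category in stream:
        if category in quota and counts[category] < quota[category]:
            sample.append((person, category))
            counts[category] += 1
        if counts == quota:  # all quotas filled
            break
    return sample

# Hypothetical quotas mirroring population age-group proportions
quota = {"18-34": 3, "35-54": 4, "55+": 3}
stream = [(f"p{i}", ["18-34", "35-54", "55+"][i % 3]) for i in range(30)]
result = quota_sample(stream, quota)
print(len(result))  # 10
```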
29
Q

What is representative sampling prone to?

A

Selection bias

30
Q

What are three ways to overcome measurement error?

A
  1. response rate
  2. reliability
  3. validity

31
Q

Define response rate

A

The number who respond divided by the number eligible. Over-sampling is advised to compensate for non-response.

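The definition, plus the over-sampling advice, as a small calculation (the helper `invites_needed` is a hypothetical name, not from the source):

```python
import math

def response_rate(responded, eligible):
    """Response rate = number who respond / number eligible."""
    return responded / eligible

def invites_needed(target_n, expected_rate):
    """Over-sampling helper (hypothetical): invitations needed to reach target_n
    responses given an expected response rate."""
    return math.ceil(target_n / expected_rate)

print(response_rate(340, 500))    # 0.68
print(invites_needed(400, 0.68))  # 589 invitations for ~400 responses
```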
32
Q

What does reliability mean in terms of questionnaires/surveys?

A

How reproducible the results arising from the questionnaire/scale are

33
Q

What is test-retest in testing reliability in surveys/questionnaires?

A

The same respondents repeat the survey on two or more occasions

34
Q

What is alternative form?

A

The same respondents complete two versions of the questionnaire

35
Q

What is split-half in testing reliability?

A

Two scales measuring the same factors

36
Q

What is the correlation that reflects an instrument's reliability?

A

r > 0.70

37
Q

What does validity mean in regard to a questionnaire/survey? (2)

A
  1. how well the questionnaire measures what it claims to measure
  2. the degree to which your measure assesses what it is supposed to measure

38
Q

What are the four types of validity that may need to be tested in a questionnaire?

A
  1. face validity
  2. content validity
  3. construct validity
  4. criterion-related validity

39
Q

What are the two types of criterion-related validity?

A
  1. predictive validity
  2. concurrent validity

40
Q

What does face validity mean?

A

How appropriate the test items appear; do the items appear to be measuring what you want 'on the face of it'?

41
Q

How is face validity assessed when it is assessed informally?

A

By inspecting the items, e.g. using a Big Five extraversion item

42
Q

Define content validity

A

The extent to which the specification of the test actually reflects the purpose of the test

43
Q

In testing content validity, one way is to make sure the items are appropriate. The items need to be judged by experts in the field to be:

A
  1. inclusive
  2. relevant

44
Q

Using the example of Beck's Depression Inventory (BDI), show how content validity would be assessed: (3)

A
  1. Does the BDI really measure depression?
  2. Does it cover all aspects of depression, such as behaviour, cognitions, and emotions?
  3. Items must assess every aspect of a construct which is evidenced in the literature, e.g. self-regulation of emotion should cover both positive and negative emotions

45
Q

What is concurrent validity?

A

The correlation between a new test and existing tests which claim to measure the same construct (e.g. a revised or shorter version of an existing test)

46
Q

What are two examples of pairs of tests, claiming to measure the same construct, between which concurrent validity may be tested?

A
  1. the Profile of Mood States and the Positive and Negative Affect Scale
  2. the Dutch Eating Behaviour Questionnaire and the Three-Factor Eating Questionnaire

47
Q

What does predictive testing in concurrent validity look like?

A

One test is administered before the other

48
Q

What does concurrent testing in concurrent validity look like?

A

The tests are given at the same time

49
Q

Define predictive validity

A

A measure of how well one variable, or set of variables, predicts an outcome based on information from other variables

50
Q

How is predictive validity measured?

A

By the correlation between the test and the future event. E.g. does the disinhibition scale (a measure of the tendency to overeat) predict weight status?

51
Q

Define construct validity

A

How well the instrument performs in practical situations

52
Q

What does convergent relate to in construct validity?

A

The degree to which the instrument produces similar results to other measures and methods researching the same variables

53
Q

What does divergent mean in terms of construct validity?

A

The degree to which the instrument can distinguish between groups known to vary in relation to the variables being researched

54
Q

Define criterion (or concrete) validity

A

Criterion or concrete validity is the extent to which the measures are demonstrably related to concrete criteria in the "real" world.

55
Q

What is one example of criterion-related validity?

A

Does the BDI distinguish between clinical and non-clinical depression and/or between depressed and non-depressed individuals?

56
Q

What is another example of criterion validity?

A

Disinhibition (the tendency to overeat) is related to body weight: do those with a higher body weight have higher disinhibition?

57
Q

What does predictive mean in regard to criterion validity?

A

The ability to predict outcomes influenced by the variables researched

58
Q

What does concurrent mean in regard to criterion validity?

A

How well the items compare with other ways of measuring the same variable

59
Q

Can a measure be reliable and not valid?

A

Yes (e.g. measuring intelligence in children by shoe size)

60
Q

If a measure is valid, what does this mean in terms of reliability?

A

If a measure is valid, it is usually reliable.

61
Q

What is the first way to describe factor analysis?

A

It is used to analyse interrelationships among a large number of variables and to explain these variables in terms of their common underlying dimensions (factors)

62
Q

What is the second way to describe factor analysis?

A

A way of condensing the information contained in a number of original variables into a smaller set of dimensions (factors) with a minimum loss of information

63
Q

What does factor analysis determine?

A

Whether the psychometric tool is measuring the same thing in, e.g., different populations, time points, locations, etc.

64
Q

What are 5 examples of factor-analysed tools?

A
  1. the Three-Factor Eating Questionnaire
  2. Eysenck's Personality Inventory
  3. the Geriatric Depression Scale
  4. Cattell's 16 Personality Factor Questionnaire
  5. the Big Five personality traits

65
Q

Criterion-related validity can test between groups. Give an example of this in relation to disinhibition and hunger

A

Disinhibition and hunger are related to overeating. To assess the validity of this, we assess the association with body weight and satiety to see if those who are heavier have a higher overeating score