3. Survey Flashcards
Survey = an investigation in which …1… is …2… but in which the …3… is not used
- information
- systematically collected
- experimental method
What does it mean when the information in a survey is systematically collected?
observing, not manipulating
How may a survey be conducted?
- face-to-face inquiry
- self-completed questionnaires
- telephone
- postal service
- or in some other way.
Generalizability of results depends on…
Extent to which surveyed population (sample) is representative
What are prevalence studies also called?
cross-sectional studies
4 examples of health-related surveys
1) Face-to-face structured interview & measurements
2) Self-completed non-motor questionnaire for Parkinsonism
3) ONS Coronavirus Prevalence Survey: self-swab PCR
4) Child Development Supplement: parent-reported weight/height
The purpose of surveys
- Assess prevalence of disease (cross-sectional survey)
- Measure risk/protective factors of respondent
- Measure outcomes
- Ad-hoc data collection = collect information of interest
Which type of study is a frequently used form of survey?
prevalence studies
3 advantages of prevalence studies?
- cheap and quick
- useful for healthcare planning and investigating trends over time
- useful when routine data are not available
3 disadvantages of prevalence studies
- not (usually) useful for conditions with a short duration (P = I × D, i.e. prevalence = incidence × duration; see the sketch after this list)
- not particularly useful for investigating causality
- sampling and data collection need care
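As a rough illustration of why short-duration conditions are hard to capture, here is a minimal sketch of the P = I × D relationship. The incidence and duration figures are invented for illustration only:

```python
# Prevalence ~ incidence x average duration (P = I x D) for a condition
# in a steady state. Numbers below are illustrative only.

incidence = 0.002        # 2 new cases per 1,000 person-years
duration_chronic = 10.0  # chronic condition lasting ~10 years
duration_acute = 0.02    # acute condition lasting ~1 week (in years)

print(f"Chronic: P = {incidence * duration_chronic:.4f}")  # 0.0200
print(f"Acute:   P = {incidence * duration_acute:.6f}")    # 0.000040

# Same incidence, but the acute condition is ~500x rarer at any one
# snapshot, so a cross-sectional survey will find very few cases.
```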
Define population
the group of people in whom we are interested and to whom we wish to apply the results of the survey
Define sample
a group of individuals taken from the larger population
A value calculated from a sample is a …?
statistic
A statistic is always what?
an estimate of the true underlying value in the population
The aim of sampling is to…?
generalise findings from the sample to the population
Sampling is used to make inferences from sample results to the population;
the sample must be …1… of the population
- representative
What is a sampling frame? Give an example.
a list of everyone in the population from whom the sample is taken
e.g. GP practice list, electoral register, school register, employee register
With sample size, generally the …1… the sample size, the better
- larger
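To see why a statistic is only an estimate of the population value, and why larger samples are generally better, here is a minimal simulation sketch (the population size, true prevalence and sample sizes are all assumptions for illustration):

```python
import random

random.seed(42)

# Hypothetical population of 1,000,000 people with true prevalence 10%.
TRUE_PREVALENCE = 0.10
population = [1 if random.random() < TRUE_PREVALENCE else 0
              for _ in range(1_000_000)]

# The sample prevalence (a statistic) estimates the population value;
# its random error shrinks as the sample size grows.
for n in (50, 500, 5_000, 50_000):
    sample = random.sample(population, n)
    estimate = sum(sample) / n
    print(f"n={n:>6}: estimated prevalence = {estimate:.3f} "
          f"(true = {TRUE_PREVALENCE})")
```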
What are the 2 different sampling methods?
- random (probability)
- non-random (non-probability)
In random sampling:
everyone in the …1… frame has an …2… probability of being chosen
it’s important to achieve a …3… sample
- sampling
- equal
- representative
Non-random sampling:
- …1… and …2…
- unlikely to be …3…
- beware of …4… samples
- easier
- convenient
- representative
- self-selecting
name 4 random sampling methods
- simple random
- stratified
- cluster
- systematic sampling (not truly random)
What is simple random sampling?
each population member is given an identifier and numbers are selected at random
What is stratified sampling?
Divide population into strata (subgroups) and select sample from each using simple random sampling
What is cluster sampling?
- Use natural ‘clusters’ in the population e.g. schools.
- Simple random sample of ‘clusters’ (e.g. schools).
- Study all individuals within clusters
What is systematic sampling?
Every Nth population member is selected
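The four sampling methods above can be sketched in a few lines of Python. The sampling frame, the school "clusters" and the sex "strata" below are invented for illustration:

```python
import random

random.seed(1)

# Hypothetical sampling frame: 1,000 people tagged with a school
# ("cluster") and a sex ("stratum").
frame = [{"id": i, "school": f"school_{i % 20}", "sex": "F" if i % 2 else "M"}
         for i in range(1000)]

# 1) Simple random: every frame member has an equal chance of selection.
simple = random.sample(frame, 100)

# 2) Stratified: simple random sample within each stratum.
stratified = []
for sex in ("F", "M"):
    stratum = [p for p in frame if p["sex"] == sex]
    stratified += random.sample(stratum, 50)

# 3) Cluster: randomly select whole clusters, then study everyone in them.
schools = sorted({p["school"] for p in frame})
chosen = set(random.sample(schools, 2))
cluster = [p for p in frame if p["school"] in chosen]

# 4) Systematic: every Nth member after a random start
#    (not truly random -- any ordering pattern in the frame carries over).
step = 10
start = random.randrange(step)
systematic = frame[start::step]

print(len(simple), len(stratified), len(cluster), len(systematic))
```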
Methodological design can be influenced by two factors - what are they?
1) Choice of instrument e.g., interview vs. self-reported questionnaire
2) Quality control = standardisation of instruments, training of the interviewers, structured questionnaires
What are the pros and cons of using an INTERVIEW as the choice of instrument?
Pros
- Allows for open questions
- Ensures questions are understood
- Can explore questions in depth
Cons
- Can be time-consuming/costly
- Requires training
- Risk of interviewer bias
What are the pros and cons of using a SELF-COMPLETED QUESTIONNAIRE as the choice of instrument?
Pros
- Quick & cheap
- Avoids interviewer bias
- Good for sensitive issues
Cons
- Relies on more closed questions = risk of questions being misunderstood
Name some ways in which quality control of methodological design is maintained
1) Standardisation of instruments used
2) Training for interviewers/observers
3) Structured questionnaires
Name some ways in which questionnaire design of a study (survey) can be strengthened
1) Avoid crowding questions - ensure clear, logical flow
2) Order questions appropriately (sensitive questions later)
3) Avoid leading questions
4) Avoid ambiguity (double-barrelled questions)
5) Search for validated measures in the literature
6) Consider closed vs. open questions
7) Pilot to test acceptability of the questionnaire's length
How is measurement performance assessed (to avoid instrument bias)?
by assessing (measuring) how well the test, instrument or question performs over time, in different settings and with different groups
What are the 2 key components of measurement performance?
- validity
- repeatability (or reliability or reproducibility)
What is validity?
how well a test measures what it is claimed to measure (the capacity of a test to give a true result)
Repeatability (or reliability or reproducibility) = the degree to which…?
a measurement made on one occasion agrees with the same measurement on a subsequent occasion
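As a minimal sketch of how repeatability might be quantified, one common approach is the test-retest correlation between two measurement occasions (more formal options include the intraclass correlation or Bland-Altman limits of agreement). The data here are simulated, not from any real instrument:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: the same quantity measured twice on 30 people.
true_values = rng.normal(70, 10, size=30)             # e.g. weight in kg
occasion_1 = true_values + rng.normal(0, 1, size=30)  # measurement noise
occasion_2 = true_values + rng.normal(0, 1, size=30)

# Test-retest correlation: r near 1 suggests good repeatability.
r = np.corrcoef(occasion_1, occasion_2)[0, 1]
print(f"test-retest correlation: r = {r:.2f}")
```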
Why are estimates obtained from a sample?
to make inferences about the real prevalence (or real risk, real incidence, etc.) in the population of interest
The error that occurs in estimating the true effect is either …1… or …2…
- random
- systematic
What is random error?
a random imprecision or variable performance that is due to chance alone
What is bias?
a systematic error in sampling or measurement
Examples of systematic error (sources of bias)?
- selection bias: sampling bias, non-response bias
- information (measurement) bias: instrument bias, inter-observer bias
What is selection bias? Error due to systematic differences in…
in the characteristics of the groups being studied due to differences in the way they were selected
2 types of selection bias
- sampling bias -> non-representative sampling
- non-response bias -> respondents differ from non-respondents
2 examples of sources of bias (systematic error)
- selection bias
- information (measurement) bias
What is information (measurement) bias? Error due to systematic differences in…
the measurement or classification of individuals in the groups being studied
2 types of information bias?
- instrument bias
- inter-observer bias
Instrument bias = systematic error due to…?
- inadequate design,
- calibration
- or maintenance of instruments
Inter-observer bias = systematic error between… (e.g. due to differences in training)
the measurements of different interviewers
Measurement errors can be due to issues in …1… and …2…
precision, accuracy
If there is lots of variation (random error), precision is …1…
If there is little random variation, measures are …2…
- poor
- precise
What is accuracy?
how close the average measurement is to the true value
What is poor accuracy the result of?
systematic error (bias)
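The random-error/systematic-error distinction above maps directly onto precision and accuracy, which a small simulation can make concrete. The true value, noise level and bias below are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_VALUE = 100.0  # e.g. a true blood pressure; figure is illustrative

# Random error only: centred on the truth but noisy (accurate, imprecise).
random_err = rng.normal(loc=TRUE_VALUE, scale=10.0, size=1000)

# Systematic error only: tight but shifted (precise, inaccurate/biased).
systematic_err = rng.normal(loc=TRUE_VALUE + 15.0, scale=1.0, size=1000)

for name, x in [("random error", random_err),
                ("systematic error", systematic_err)]:
    print(f"{name:>16}: mean = {x.mean():6.1f} (true {TRUE_VALUE}), "
          f"SD = {x.std():5.1f}")
```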
How could we minimise sampling bias in surveys?
random sampling
How could we minimise non-response bias in surveys?
Cover letter signed by a respected person, explaining the reason for and relevance of the survey.
Ensure language spoken is appropriate.
Provide reminders and incentives.
How could we minimise instrument bias in surveys?
Good design of questionnaires, calibration and maintenance of equipment.
How could we minimise interviewer bias in surveys?
Training, objective criteria for outcome assessment, closed questions.
What is the response rate?
the % of the selected sample that takes part in the survey/study (e.g. 600 participants from 1,000 people sampled = a 60% response rate)
How can we check for non-response bias?
Compare the respondents’ characteristics with the non-respondents’ characteristics
If data on non-respondents are not available, what can be done?
make external comparisons with wider population of interest
4 methods to check how representative a sample is
- response rate
- compare respondents’ characteristics with non-respondents’
- make external comparisons with wider population of interest
- present data in a table
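One way to make the external comparison concrete is a chi-square goodness-of-fit test of the respondents' characteristics against known population proportions (e.g. census age bands). All counts and proportions below are invented for illustration:

```python
from scipy.stats import chisquare

# Hypothetical respondent counts by age band: <30, 30-49, 50-69, 70+
respondents_by_age = [120, 300, 280, 100]
# Hypothetical population proportions for the same bands (e.g. from census)
population_props = [0.25, 0.30, 0.30, 0.15]

n = sum(respondents_by_age)
expected = [p * n for p in population_props]  # counts if fully representative

stat, pvalue = chisquare(f_obs=respondents_by_age, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {pvalue:.3g}")
# A small p-value suggests the respondents differ from the population,
# i.e. possible non-response or selection bias.
```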
When critiquing literature, what 3 questions should be considered broadly?
- What are the conclusions?
- What is the strength of the evidence?
- Does the methodology give you confidence in the conclusions?
In critical appraisal, if the question is
Was there a clear statement of the aims of the research?
what are 3 considerations?
- What was the goal of the research?
- Why it was thought important
- Its relevance
In critical appraisal, if the question is
Was the research design appropriate to address the aims of the research?
considerations…?
If the researcher has justified the research design (e.g. have they discussed how they decided which method to use)?
In critical appraisal, if the question is
Was the recruitment strategy appropriate to the aims of the research?
considerations…?
- If the researcher has explained how the participants were selected?
- If they explained why the participants they selected were the most appropriate to provide access to the type of knowledge sought by the study?
- If there are any discussions around recruitment (e.g. why some people chose not to take part)?