Validity, Errors, Reliability Flashcards
External Validity
- study design
- extent to which results from a study can be extended to a broader population
Internal Validity
- adequacy of measurement
- how well does it measure what it is intended to measure
Increasing External Validity
- subject selection ⇒ avoid confounding factors, random selection, recruitment methods
- subject number ⇒ sufficient statistical power to detect differences
- still need quality control measures and judgement about degree to which results can be extrapolated
Increasing Internal Validity
- study design
- data collection techniques ⇒ statistical analysis procedures
Components of Internal Validity
- face validity
- content validity
- criterion validity
- reliability
Face Validity
- judgement (often by experts) that the tool measures what it is supposed to; based on expert opinion, so it is largely subjective
Content Validity
- are all components of the construct measured by the methods being used?
Criterion Validity
- how well the tool performs compared with another tool such as a gold standard
Reliability / Precision
- degree to which repeated measures give the same value
- perfect reliability is almost impossible to achieve at the individual level
Reliability is affected by…
- true variation in measurement
- random error / random variation
- systematic errors
Safeguards to ensure reproducibility
- standardized protocols
- calibrate instruments
- suitable protocols for situation
- test-retest and take average
Ensure Internal Validity, consider:
- reliability (standardized protocols, calibration, suitable protocols, test-retest)
- get expert opinion on method (face validity)
- conceptualize the content of the construct and choose methods accordingly (content validity); review the literature
- compare between and among methods (criterion)
Measurement errors (2)
- random
- systematic (bias)
Random Error
- unpredictable error
- affects reproducibility (precision) of method
- caused by true variation + measurement error
- reduces statistical power of studies
- values vary, but the average is approximately accurate (more error = weaker observed association)
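The key property of random error, that individual readings scatter but the average stays close to the truth, can be sketched in Python (the true value, noise level, and measurement counts below are hypothetical, chosen only for illustration):

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 70.0  # hypothetical true body weight in kg


def measure(noise_sd=2.0):
    # each reading = true value + unpredictable random error
    return TRUE_VALUE + random.gauss(0, noise_sd)


single = measure()                          # one reading: may be off by several kg
repeated = [measure() for _ in range(1000)]  # test-retest many times
avg = statistics.mean(repeated)             # average converges on the true value
spread = statistics.stdev(repeated)         # scatter reflects the noise, not bias
```

This is why the "test-retest and take average" safeguard above improves precision: averaging cancels random error but would not remove a systematic bias.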
Reduce Random Error
- standard operating procedures
- train all examiners and maintain standardized measurement techniques
- select carefully and standardize all instruments
- refine and standardize questionnaires
- control confounding factors via same training, same procedures, same technique and instrument
Systematic Error (bias)
- measurements consistently depart from the true value
- bias either negative or positive
- always under- or overestimates due to technique or tool
ex - under/overestimating daily intake ⇒ the average will not be the true value
ex - a scale that isn’t calibrated
Types of Systematic Error (3)
⇒ person-specific bias ⇒ individual level; someone reports their dietary intake or weight in a biased way
⇒ constant additive error ⇒ a fixed offset that affects the entire population (e.g., the tool)
⇒ intake-related bias ⇒ the size of the error varies with true intake (e.g., everyone underestimates, but some more than others)
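The three bias types can be contrasted in a short Python sketch (the intake range, offsets, and scaling factor are hypothetical illustrations, not real dietary data):

```python
import random
import statistics

random.seed(0)

# hypothetical true daily intakes, kcal
true_intakes = [random.uniform(1500, 3000) for _ in range(500)]

# constant additive error: a miscalibrated tool shifts everyone by the same amount
constant_bias = [x - 200 for x in true_intakes]

# person-specific bias: each individual has their own fixed reporting offset
offsets = [random.gauss(-100, 150) for _ in true_intakes]
person_bias = [x + o for x, o in zip(true_intakes, offsets)]

# intake-related bias: everyone underestimates, but larger intakes err more (15% here)
intake_related = [x * 0.85 for x in true_intakes]

true_mean = statistics.mean(true_intakes)
```

In all three cases the group average departs from the true mean, which is why averaging repeated measures (the fix for random error) cannot correct a systematic bias.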
confounding factor
- special type of bias (influences validity)
- a characteristic or variable that is distributed differently in the study and control groups, and affects the outcome being assessed
ways to control confounding factors
- randomization, stratification, recruitment, restriction, matching
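Randomization, the first control listed above, works because chance assignment tends to distribute a confounder similarly across groups. A minimal Python sketch (subject count and the age confounder are hypothetical):

```python
import random
import statistics

random.seed(1)

# hypothetical subjects, each carrying a potential confounder (age)
subjects = [{"age": random.randint(20, 70)} for _ in range(400)]

# randomization: assign to groups purely by chance
random.shuffle(subjects)
treatment, control = subjects[:200], subjects[200:]

mean_t = statistics.mean(s["age"] for s in treatment)
mean_c = statistics.mean(s["age"] for s in control)
# with enough subjects, the two group means end up close,
# so age cannot explain an outcome difference between groups
```

Stratification and matching achieve the same balance deliberately rather than by chance, which matters when group sizes are small.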
Measurement Error Sources
- respondent error ⇒ compliance, random or systematic
- interviewer bias ⇒ effort varies (e.g., working harder on Monday), may not ‘click’ with some subjects, may promote “healthy eating”
- respondent memory lapse ⇒ use memory aids; over longer time periods, minor items (condiments, etc.) are not remembered
- incorrect estimation of portion size ⇒ ask for a usual/average portion rather than an exact estimate
Additional sources of error
- account for supplement use?
- bioavailability of nutrients?
- processing losses/gains?
- foods not present in databases
- honest vs socially desirable responses
- food served vs food consumed
Coding and Computer Errors
- poor description of foods
- inaccurate conversions from estimates
- may not have appropriate tables for food composition data
- systematic mistakes when entering data