Validity, Errors, Reliability Flashcards

1
Q

External Validity

A
  • study design
  • extent to which results from a study can be extended to a broader population

2
Q

Internal Validity

A
  • adequacy of measurement
  • how well does it measure what it is intended to measure

3
Q

Increasing External Validity

A
  • subject selection ⇒ avoid confounding factors, random selection, recruitment methods
  • subject number ⇒ sufficient statistical power to detect differences (see the power calculation sketch after this list)
  • still need quality control measures and judgement about degree to which results can be extrapolated
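A rough illustration of the "sufficient statistical power" point, as a Python sketch. It assumes the statsmodels package is available; the effect size, alpha, and power values are illustrative and not from the source.

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative power calculation: subjects needed per group to detect a
# moderate difference (Cohen's d = 0.5) with 80% power at alpha = 0.05.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"subjects needed per group: {n_per_group:.0f}")  # about 64
```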
4
Q

Increasing Internal Validity

A
  • study design
  • data collection techniques ⇒ statistical analysis procedures

5
Q

Components of Internal Validity

A
  • face validity
  • content validity
  • criterion validity
  • reliability
6
Q

Face Validity

A
  • judgement, often expert opinion, that the tool measures what it is supposed to measure; can be objective
7
Q

Content Validity

A
  • are all components of the construct measured by the methods being used?
8
Q

Criterion Validity

A
  • how well the tool performs compared with another tool such as a gold standard
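One way criterion validity is often quantified is by comparing the new tool against the gold standard on the same subjects. The sketch below is a hypothetical Python/NumPy example with made-up kcal values, using a correlation and a mean difference as simple agreement summaries.

```python
import numpy as np

# Hypothetical paired measurements: a new dietary screener vs. a
# gold-standard method (e.g., weighed food records), in kcal/day.
gold_standard = np.array([1850, 2100, 1720, 2300, 1980, 2050, 1600, 2250])
new_tool      = np.array([1800, 2150, 1700, 2280, 2010, 1990, 1650, 2300])

# Criterion validity: agreement of the new tool with the gold standard.
r = np.corrcoef(gold_standard, new_tool)[0, 1]    # correlation
bias = np.mean(new_tool - gold_standard)          # mean difference (bias)

print(f"correlation with gold standard: {r:.2f}")
print(f"mean difference (new - gold):   {bias:.1f} kcal/day")
```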
9
Q

Reliability / Precision

A
  • degree to which repeated measures give the same value
  • almost impossible to achieve at the individual level

10
Q

Reliability is affected by…

A
  • true variation in measurement
  • random error / random variation
  • systematic errors
11
Q

Safeguards to ensure reproducibility

A
  • standardized protocols
  • calibrate instruments
  • suitable protocols for situation
  • test-retest and take average
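A minimal sketch of the test-retest idea, with hypothetical Python/NumPy values: measure each subject twice, check how closely the two occasions agree, and use the average of the repeats as the working value.

```python
import numpy as np

# Hypothetical test-retest data: the same instrument applied twice
# to the same subjects (e.g., body weight in kg on two mornings).
test_1 = np.array([68.2, 75.4, 59.1, 82.0, 71.3])
test_2 = np.array([68.6, 74.9, 59.4, 81.5, 71.0])

# Reproducibility check: how closely do the repeated measures agree?
retest_r = np.corrcoef(test_1, test_2)[0, 1]

# Taking the average of repeats reduces the influence of random error.
averaged = (test_1 + test_2) / 2

print(f"test-retest correlation: {retest_r:.2f}")
print("averaged values:", averaged)
```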
12
Q

To Ensure Internal Validity, consider:

A
  • reliability (standardized protocols, calibration, suitable protocols, test-retest)
  • get expert opinion on method (face validity)
  • conceptualize content of the construct and choose methods accordingly (content validity); review the literature
  • compare between and among methods (criterion)
13
Q

Measurement errors (2)

A
Random
Systematic (bias)
14
Q

Random Error

A
  • unpredictable error
  • affects reproducibility (precision) of method
  • caused by true variation + measurement error
  • reduces statistical power of studies
  • individual values vary, but the average is approximately accurate (more error = weaker observed association)
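A small simulation of the last point (an assumed Python/NumPy sketch with made-up numbers): random error scatters individual measurements while the average stays near the true value, and larger error widens the spread that erodes statistical power.

```python
import numpy as np

rng = np.random.default_rng(42)
true_value = 2000          # true daily intake, kcal
n = 1000                   # number of measurements

# Random error: unpredictable, zero-mean noise added to each measurement.
small_error = true_value + rng.normal(0, 50,  n)   # precise method
large_error = true_value + rng.normal(0, 300, n)   # imprecise method

# The averages are both close to the true value...
print(f"mean (small error): {small_error.mean():.0f}")
print(f"mean (large error): {large_error.mean():.0f}")

# ...but the spread differs, which is what erodes statistical power.
print(f"SD (small error): {small_error.std():.0f}")
print(f"SD (large error): {large_error.std():.0f}")
```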
15
Q

Reduce Random Error

A
  • standard operating procedures
  • train all examiners and maintain standardized measurement techniques
  • select carefully and standardize all instruments
  • refine and standardize questionnaires
  • control confounding factors via same training, same procedures, same technique and instrument
16
Q

Systematic Error (bias)

A
  • measurements consistently depart from the true value
  • bias either negative or positive
  • always under/overestimate due to technique or tool

ex - under/overestimating daily intake - average will not be true value
ex - if scale isn’t calibrated

17
Q

Types of Systematic Error (3)

A

⇒ person-specific bias (individual level) - someone reports their dietary intake or weight in a biased way
⇒ constant additive error - something (e.g., the tool) that shifts the entire population by the same amount
⇒ intake-related bias - everyone underestimates, but some more than others
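The three bias types can be made concrete with a short simulation (a hypothetical Python/NumPy sketch; the intake values and bias sizes are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
true_intake = rng.normal(2000, 300, 5)   # true kcal/day for 5 subjects

# Person-specific bias: one individual consistently under-reports.
person_bias = np.zeros(5)
person_bias[2] = -400                    # subject 3 under-reports by 400 kcal
reported_person = true_intake + person_bias

# Constant additive error: the tool shifts everyone by the same amount.
reported_constant = true_intake - 150    # e.g., an uncalibrated instrument

# Intake-related bias: everyone under-reports, heavier intakes more so.
reported_related = true_intake * 0.85    # under-reporting grows with intake

for name, rep in [("person-specific", reported_person),
                  ("constant additive", reported_constant),
                  ("intake-related", reported_related)]:
    print(f"{name:18s} mean error: {np.mean(rep - true_intake):+.0f} kcal")
```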

18
Q

Confounding Factor

A
  • special type of bias (influences validity)
  • a characteristic or variable that is distributed differently in the study and control groups, and affects the outcome being assessed
19
Q

Ways to Control Confounding Factors

A
  • randomization, stratification, recruitment, restriction, matching
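A brief sketch (hypothetical Python/NumPy example with invented subjects) of two of these controls, randomization and stratification.

```python
import numpy as np

rng = np.random.default_rng(1)
subjects = np.arange(20)
sex = rng.choice(["F", "M"], size=20)   # a suspected confounder

# Randomization: random assignment balances confounders between groups on average.
shuffled = rng.permutation(subjects)
treatment, control = shuffled[:10], shuffled[10:]

# Stratification: analyze within levels of the confounder so the comparison
# is made among similar subjects.
for stratum in ["F", "M"]:
    in_stratum = subjects[sex == stratum]
    print(f"stratum {stratum}: {len(in_stratum)} subjects")
```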
20
Q

Measurement Error Sources

A
  • respondent error ⇒ compliance, random or systematic
  • interviewer bias ⇒ may work harder on some days (e.g., Monday), may not ‘click’ with some subjects, may unintentionally promote “healthy eating”
  • respondent memory lapse ⇒ use memory aids; longer recall periods worsen lapses; minor items (condiments, etc.) not remembered
  • incorrect estimation of portion size ⇒ asking for an average rather than estimating exact amounts
21
Q

Additional sources of error

A
  • account for supplement use?
  • bioavailability of nutrients?
  • processing losses/gains?
  • foods not present in databases
  • honest vs desirable responses
  • food served vs food consumed
22
Q

Coding and Computer Errors

A
  • poor description of foods
  • inaccurate conversions from estimates
  • may not have appropriate tables for food composition data
  • systematic mistakes when entering data