Threats to Internal Validity Flashcards
What are the three major threats to internal validity?
chance
bias
confounding
What is chance?
random error
-inherent in all measurements
less random error=good
more random error=bad
What is the statistical value we use to report chance?
p-value
Describe the p-value and its relevance to chance.
the probability of obtaining results at least as extreme as those observed, assuming the null hypothesis is true
-p-value > 0.05: fail to reject the null hypothesis
-p-value < 0.05: reject the null hypothesis
roughly, the probability that the results are due to chance rather than a real treatment effect
Differentiate between p-value of >0.05 and <0.05.
> 0.05: fail to reject the null hypothesis
-no statistically significant difference between the groups' results
< 0.05: reject the null hypothesis
-there is a statistically significant difference between the groups' results
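A minimal Python sketch (hypothetical, simulated blood-pressure data) of how a p-value from a two-group comparison is read against the 0.05 threshold:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical outcome measurements for two study groups
control = rng.normal(loc=120, scale=15, size=50)    # e.g., systolic BP
treatment = rng.normal(loc=112, scale=15, size=50)  # a true -8 effect built in

# Two-sample t-test: null hypothesis = no difference between group means
t_stat, p_value = stats.ttest_ind(treatment, control)

if p_value < 0.05:
    print(f"p = {p_value:.4f}: reject the null hypothesis "
          "(statistically significant difference between groups)")
else:
    print(f"p = {p_value:.4f}: fail to reject the null hypothesis "
          "(no statistically significant difference detected)")
```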
How do we deal with chance?
increase the sample size
recognize the extent (through stats) and interpret the results accordingly
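A minimal numpy sketch (simulated data, assumed true values) showing why a larger sample size reduces random error: the spread of the estimated mean around the true value shrinks as n grows.

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean = 100       # assumed true population value
population_sd = 20

# Repeat the "study" many times at each sample size and look at the
# variability (random error) of the estimated mean
for n in (10, 100, 1000):
    sample_means = [rng.normal(true_mean, population_sd, n).mean()
                    for _ in range(2000)]
    print(f"n={n:5d}  spread of estimates (SD of sample means) = "
          f"{np.std(sample_means):.2f}")

# The spread shrinks roughly with 1/sqrt(n):
# more subjects -> less random error (chance) in the estimate
```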
True or false: statistical significance=clinical significance
false
What is confounding?
when some factor(s) other than the intervention or exposure under study influence the outcome
True or false: we don't always measure or even recognize a confounder
true
How do we deal with confounding?
randomization (gold standard)
-ensure groups are similar in all aspects (known and unknown factors)
stratification
matching (often done by sex and age)
statistical models
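A minimal simulation sketch (hypothetical age confounder) illustrating why randomization is the gold standard: random assignment tends to balance known and unknown factors across groups, whereas self-selection does not.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500

# Hypothetical confounder: age influences the outcome regardless of treatment
age = rng.normal(60, 10, n)

# Randomly assign subjects to treatment (1) or control (0)
group = rng.integers(0, 2, n)

# With randomization, the confounder looks similar in both arms
print("mean age, randomized treatment:", age[group == 1].mean().round(1))
print("mean age, randomized control:  ", age[group == 0].mean().round(1))

# In a non-randomized ("self-selected") design where older subjects are
# more likely to choose treatment, the arms differ -> confounding
chooses_treatment = (rng.random(n) < (age - 40) / 40).astype(int)
print("mean age, self-selected treatment:",
      age[chooses_treatment == 1].mean().round(1))
print("mean age, self-selected control:  ",
      age[chooses_treatment == 0].mean().round(1))
```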
Why is the table of baseline characteristics so important?
gives insight into whether the groups are similar at baseline and whether the results will be applicable to other populations
What is bias?
problems with the way a study was designed, conducted, or analyzed that leads to incorrect results or conclusions
-usually because the two groups were treated differently somehow
Differentiate selection bias and information bias.
selection bias:
-systematic error (or differences) in how the study subjects were selected or who participated
information bias:
-problems with measuring, collecting or classifying information
What does selection bias primarily affect?
external validity
What are some examples of selection bias?
self-selection/volunteer bias
-people who volunteer are different from those who don't
healthy worker (adherer) bias
-employed individuals are usually healthier
attrition bias (lost to follow-up)
-a concern if there are differences between those who are lost and those who weren't
Differentiate between outcome errors and exposure errors.
outcome errors:
-RCT and observational studies
-problems with measuring tools
-problems with actual measurements
exposure errors:
-observational studies
-problems with how subjects are categorized
-problems with measuring tools
When is information bias a concern?
when the likelihood of being misclassified is unequal between groups
What are some examples of information bias?
recall bias
-individuals remember things differently
-subjects with the outcome (especially a negative one) are more likely to remember the exposure or intervention
interviewer bias
-interviewer asks about exposure/outcome differently
-leading, probing or influencing questions
-multiple interviewers
surveillance/detection bias
-one study group followed more closely than the other
How do we minimize bias?
recognize and acknowledge it
-it can never be completely eliminated, so try to minimize it
clear definition of study and sample population
treat all groups the same except for the intervention
standardized measurements
collect information from all subjects in the same way
BLINDING
What is publication bias?
authors and journals tend to publish positive findings (especially with drug trials)
-may lead readers to believe there is an association when that is not the case