Research Test Flashcards
Who observed that most treatment decisions are not based on a systematic review of the evidence?
Archie Cochrane
The term “Evidence-Based Medicine” was introduced in
1992
Purpose of “Evidence-Based Medicine”:
Shift decision-making from “intuition, unsystematic clinical experience, and pathophysiologic rationale” to increase use of scientific, clinically relevant research
Decision-making is the bringing together of what three components?
- clinical expertise
- best research evidence
- patient values and preferences
Benefits of Evidence-Based Practice?
- avoid biases from clinical experience
- uses vast amount of literature
- efficient use of resources
- improved clinical care
- builds confidence in treatment
- stop ineffective practices
- builds consistency within/across professions
- promotes inquiry, continual improvements
Downsides of EBP?
- may reduce treatment options (lack of funding for research)
- challenging to study complex situations and interventions
- concerns about undermining naturopathic philosophy (individualized treatments)
- doesn’t capture significance
- gold standard studies are expensive and don’t always exist
- reduced emphasis on professional judgement
Doing EBPs: 5 A’s
Ask, acquire, appraise, apply, assess
ASK: formulate an answerable research question
ACQUIRE: find the best available evidence
APPRAISE: critically appraise/evaluate the evidence
APPLY: apply the evidence by integrating with clinical expertise and patient’s values
ASSESS: evaluate performance
Essential to understand and critically evaluate research to apply it properly. Conclusions from research studies may not always reflect the truth
Critically appraise
Presentation in the media aimed at generating attention and interest rather than accuracy
T or F: all research is open to bias
TRUE
The scientific method:
- observation/question
- research topic area
- hypothesis
- test with experiment
- analyze data
- report conclusion
a measurement of the size and direction of the relationship between 2 or more variables.
correlation
height and weight, taller people tend to be heavier
positive correlation
mountain altitude and temperature, as you climb higher it gets colder
negative correlation
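The positive and negative correlation cards above can be illustrated with a short sketch. This is an illustrative addition, not from the course material: the height/weight and altitude/temperature numbers are invented to mimic the card examples, and `pearson` is a hypothetical helper implementing the standard Pearson correlation formula.

```python
# Illustrative sketch: Pearson correlation measures the size and
# direction of a linear relationship between two variables.
# All data below are made-up numbers mimicking the flashcard examples.

def pearson(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

heights = [150, 160, 170, 180, 190]     # cm
weights = [55, 62, 70, 80, 90]          # kg: rises with height
altitude = [0, 1000, 2000, 3000, 4000]  # m
temps = [20, 14, 8, 2, -4]              # °C: falls with altitude

print(round(pearson(heights, weights), 2))  # strong positive (near +1)
print(round(pearson(altitude, temps), 2))   # strong negative (near -1)
```

A value near +1 matches the height/weight card (positive correlation) and a value near -1 matches the altitude/temperature card (negative correlation).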
margarine consumption and divorce rates
RANDOM CHANCE
studies show that people who have more birthdays live longer
Reverse Causality
A relationship where one variable (independent variable) CAUSES (is responsible for the occurrence) the other (dependent variable)
Causation
decapitation causes death
causation
T or F: Generally, it is very difficult to prove a causal relationship
TRUE
days with higher ice cream sales have more cases of drowning. What is the confounding factor?
warmer weather and swimming is the confounding factor
An additional variable causes the change in the dependent variable
Confounding factors
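The ice cream/drowning card can be simulated: a confounder (warm weather) drives both variables, producing a strong association with no causal link between them. This sketch and all its numbers are invented for illustration; `pearson` is a hypothetical helper.

```python
# Illustrative sketch: a confounder (temperature) influences both
# ice cream sales and drownings, creating a non-causal association.
# All numbers are invented for demonstration.
import random

random.seed(0)
temps = [random.uniform(10, 35) for _ in range(200)]           # daily °C
ice_cream = [2 * t + random.gauss(0, 5) for t in temps]        # sales rise with heat
drownings = [0.1 * t + random.gauss(0, 0.5) for t in temps]    # swimming rises with heat

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# strong association despite no causal link between the two
print(round(pearson(ice_cream, drownings), 2))
```

Neither variable causes the other, yet the correlation is strong, because temperature (the confounder) drives both.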
Not all associations are causal: Associations may APPEAR causal due to:
- confounding factors
- chance
- BIAS
Anything that systematically influences the conclusion or distorts comparisons
BIAS
Systematic differences between groups
selection bias
what is selection bias likely due to?
Likely due to inadequate randomization
Systematic differences in the care provided apart from the intervention being assessed
Performance Bias
Ex. Participants in the treatment group spend 10 hours with the researchers, and the control group spends 1 hour
Spending time with the researcher is therapeutic
Systematic difference in withdrawals from the trial
Attrition Bias
Ex. Participants who have a negative reaction (or no benefit) from the study treatment drop out more often than the people who find the treatment helpful
Inflates the positive result
Attrition Bias
Systematic differences in outcome assessment
Detection Bias
Ex. Study of the effects of working with radioactive material on skin cancer risk. More cases of skin cancer were discovered in patients who reported working with radioactive material.
Looking “harder” for one outcome than another
Detection Bias
A researcher genuinely believes that the study drug will help psoriasis. If they know who is receiving the real drug, they may underestimate lesion severity when measuring the psoriasis skin lesions
Detection bias
what strategy helps to eliminate detection bias?
Blinding
- the researcher should not know who is receiving the drug or the placebo
When participants are aware of being observed, they alter their behaviour
Observation Bias
Ex. DIET DIARY
studies with negative findings are less likely to be submitted and published
Publication Bias
When asked about things in the past, may have difficulty remembering and respond in an inaccurate way
Recall Bias
Principles of Causation:
- Temporality
- Strength
- Dose-response
- Reversibility
- Consistency
- Biological plausibility
- Specificity
- Analogy
The cause came before the effect
Temporality
what are some study types that are limited in ability to detect temporality?
cross-sectional and case-control
Stronger association is better evidence of cause/effect relationship
Strength of association
Varying amounts of the cause result in varying amounts of the effect
Dose-response Relationship
the number of cigarettes smoked per day and lung cancer risk. What is the risk of confounding here?
Dose-response Relationship
RISK OF CONFOUNDING: heavy smokers more likely to consume more alcohol
The association between the cause and the effect is reversible
Reversibility
Ex. people who quit smoking have a lower risk of cancer. What is the possible confounding here?
Reversibility
confounding: people who quit may start other healthy lifestyle behaviours too!
Several studies conducted at different times, in different settings and with different kinds of patients all come to the same conclusions
Consistency
If the relationship between cause and effect is consistent with our current knowledge of the mechanisms of disease
Biological Plausibility
Challenges: homeopathy and energy medicine
When Biological Plausibility is present, does it strengthen the case for cause and effect?
YES!
One cause → one effect (A only causes B) =stronger evidence
Specificity
Vitamin c deficiency → scurvy
Specificity
What is an example where Specificity is weak evidence against the cause?
smoking causes lung cancer, but also bronchitis, periodontal disease, and more (one cause, many effects)
The cause-and-effect relationship is strengthened if there are examples of well-established causes that are analogous to the one in question
Analogy
Ex. if we know a virus can cause chronic, degenerative CNS disease (Subacute Sclerosing Panencephalitis) it is easier to accept that another virus might cause degeneration of the immunologic system (e.g. HIV and AIDS)
Analogy is (strong/weak) evidence for cause
WEAK!
what is near the top of the hierarchy?
meta-analysis
Order of hierarchy?
top to bottom
Clinical practice guidelines
Meta-analysis
RCT
Cohort
Case-control
Case report
Animal and lab studies
Do something to the patient, observe what happens
Experimental/intervention studies
does the treatment change the likelihood of the outcome?
Randomized controlled trial
- defined population (inclusion/exclusion criteria)
- 2+ groups: treatment and comparison
- is prospective
Key Features of RCT
Randomized: Equal chance of being assigned to the intervention or control group - balances baseline characteristics (sex, family history, age, etc.)
Control group: accounts for natural course of illness, placebo effect, confounding factors
May have blinding: minimize expectation effect
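The randomization feature above can be sketched in a few lines. This is an illustrative addition with invented participant data, not from the course material: random allocation gives each participant an equal chance of either group, so baseline characteristics like age balance out on average.

```python
# Illustrative sketch: simple 1:1 randomization into treatment vs.
# control, showing that baseline characteristics (here, age) end up
# similar in both groups. Participant data are invented.
import random

random.seed(1)
participants = [{"id": i, "age": random.randint(20, 70)} for i in range(100)]

treatment, control = [], []
for p in participants:
    (treatment if random.random() < 0.5 else control).append(p)

def mean_age(group):
    return sum(p["age"] for p in group) / len(group)

# group sizes and mean ages should come out similar (balanced baseline)
print(len(treatment), len(control))
print(round(mean_age(treatment), 1), round(mean_age(control), 1))
```

With larger samples the balance gets tighter, which is why randomization is central to controlling confounding in RCTs.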
RCT Use:
Best Design for confirming cause/effect
Cross-over:
everyone gets intervention AND comparison
Where would cross-over not be best?
Acute or self-limited conditions - these would resolve after medication A, leaving nothing to treat in the second period
Only works for conditions that respond temporarily to medication
Intervention study where participants are given the option between arms
Preference (controlled, not randomized)
(ex. Cancer survivors: pick MBT or Tai Chi)
intervention study where everyone gets the intervention (knows it), and assesses changes before and after the intervention
Open-label, Pre/Post
OBSERVATIONAL STUDIES
Exposure NOT controlled by the researcher
They ask: Is there a relationship between a risk factor (or health factor) and an outcome (harm or benefit)
Ex. Is a high intake of blueberries associated with a lower risk of cancer? Is increased stress associated with an increased risk of a heart attack?
OBSERVATIONAL STUDY
Types of observation studies:
- cohort
- case-control
- cross-sectional
highest quality observational study
cohort
COHORT STUDY
- Recruit the cohort (outcome is NOT present)
- Assess risk/health factors (create a comparison group)
- Follow over time
- See who develops the outcome
- “longitudinal” “prospective”
Compare INCIDENCE
People without CVD.
High saturated fat diet, low saturated fat diet. Who developed CVD?
cohort
Case-Control
The outcome is PRESENT at the beginning of the study
RETROSPECTIVE
Looks backward in time for exposure (how much meat did you eat 10 years ago?)
Find people with AND without CVD. Ask them to think about the past. High or low saturated fat diet. Is there a difference?
Case-control
Case-control strengths
- can look at rare outcomes
- faster (no waiting times, minimal loss of participants)
Case-control weaknesses
Assignment to comparison group is NOT random
- there could be differences (confounding factors)
Hard to assess temporality (ex. recall bias)
CROSS-SECTIONAL STUDIES
The outcome is PRESENT at the beginning of the study
- assess exposure and outcome at ONE time point
Ex. Patients with CVD and healthy controls, ask about CURRENT meat intake
Find people with AND without CVD
Ask about saturated fat in diet
Is there a difference?
Cross-sectional
Strengths of observational studies
- can study any question (don’t have to purposefully deprive pregnant women of B12, can look at people who are already doing this)
- can be less expensive or faster
CASE Reports, Case Series
- Report previously undocumented events (success, adverse reaction)
- May lead to further action
- Real patients and real clinical approaches
- BUT concerns about bias and generalizability
Preclinical Studies:
outside the body: cell lines, organs
In vitro:
in the living body, often in a non-disease model: healthy humans to study pharmacokinetics (absorption, elimination), or animal models
In vivo:
Types of Synthesis Research
Narrative
Systematic
Meta-analysis
Narrative Reviews
- Researcher combines some of the research on a topic
- reports on the collection of evidence
- often does NOT describe how they searched and how they decided to include certain studies
- HIGH risk of bias - results often consistent with their hypothesis
Systematic reviews
Explicit and rigorous methods to:
1. identify (2+ databases, specific inclusion/exclusion criteria)
2. critically appraise
3. synthesize (combine)
Scientific investigation with pre-planned methodology
Enormous effort to minimize bias
Meta-Analyses:
Statistically combine the results of studies in a systematic review
Goes one step further - combines the data
Visual representation of the studies (Forest Plot)
Ex. 5 studies with 20 participants → 1 study with 100 participants
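One common way meta-analyses "statistically combine" study results is inverse-variance (fixed-effect) weighting, where more precise studies count more. This is an illustrative sketch, not from the course material: the effect sizes and standard errors are invented, and `fixed_effect` is a hypothetical helper.

```python
# Illustrative sketch: fixed-effect meta-analysis via inverse-variance
# weighting. Pooling several small studies yields one estimate with a
# smaller standard error than any single study. Data are invented.

def fixed_effect(effects, std_errs):
    weights = [1 / se ** 2 for se in std_errs]  # more precise = more weight
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    return pooled, pooled_se

# five small studies -> one pooled estimate with a tighter standard error
effects = [0.30, 0.45, 0.25, 0.50, 0.35]
ses = [0.20, 0.25, 0.18, 0.30, 0.22]
pooled, se = fixed_effect(effects, ses)
print(round(pooled, 3), round(se, 3))
```

The pooled standard error is smaller than any individual study's, which is the statistical payoff of combining 5 studies of 20 participants into the equivalent of one study of 100.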
WHOLE PRACTICE RESEARCH NEED:
- Do the results of RCTs apply to real clinical practice of naturopathic medicine?
- Issue: RCTs often use one intervention to treat one disease in a uniform patient population
- Naturopathic medicine: often complex interventions, prescribed in an individualized way, to patients with complex health conditions