Research Test Flashcards

1
Q

Who observed that most treatment decisions are not based on a systematic review of the evidence?

A

Archie Cochrane

2
Q

The term “Evidence-Based Medicine” was introduced in

A

1992

3
Q

Purpose of “Evidence-Based Medicine”:

A

Shift decision-making from “intuition, unsystematic clinical experience, and pathophysiologic rationale” to increase use of scientific, clinically relevant research

4
Q

Decision-making is the bringing together of what three components?

A
  1. clinical expertise
  2. best research evidence
  3. patient values and preferences
5
Q

Benefits of Evidence-Based Practice?

A
  1. avoid biases from clinical experience
  2. uses the vast amount of literature
  3. efficient use of resources
  4. improved clinical care
  5. builds confidence in treatment
  6. stop ineffective practices
  7. builds consistency within/across professions
  8. promotes inquiry, continual improvements
6
Q

Downsides of EBP?

A
  1. may reduce treatment options (lack of funding for research)
  2. challenging to study complex situations and interventions
  3. concerns about undermining naturopathic philosophy (individualized treatments)
  4. doesn’t capture significance
  5. gold-standard studies are expensive and don’t always exist
  6. reduced emphasis on professional judgement
7
Q

Doing EBPs: 5 A’s

A

Ask, acquire, appraise, apply, assess
  1. ASK: formulate an answerable research question
  2. ACQUIRE: find the best available evidence
  3. APPRAISE: critically appraise/evaluate the evidence
  4. APPLY: apply the evidence by integrating it with clinical expertise and the patient’s values
  5. ASSESS: evaluate performance

8
Q

It is essential to understand and critically evaluate research in order to apply it properly; conclusions from research studies may or may not reflect the truth

A

Critically appraise

Presentation in the media aimed at generating attention and interest rather than accuracy

9
Q

T or F: all research is open to bias

A

TRUE

10
Q

The scientific method:

A
  1. observation/question
  2. research topic area
  3. hypothesis
  4. test with experiment
  5. analyze data
  6. report conclusion
11
Q

a measurement of the size and direction of the relationship between 2 or more variables.

A

correlation

12
Q

height and weight, taller people tend to be heavier

A

positive correlation

13
Q

mountain altitude and temperature, as you climb higher it gets colder

A

negative correlation

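The size-and-direction idea in the two cards above can be sketched numerically. Below is a minimal Pearson correlation in Python; the height/weight and altitude/temperature numbers are invented for illustration:

```python
def pearson_r(xs, ys):
    # Pearson correlation: sign gives direction, magnitude gives strength
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: taller people tend to be heavier -> r close to +1
heights = [150, 160, 165, 170, 180, 190]
weights = [50, 58, 63, 68, 80, 90]
print(pearson_r(heights, weights))   # strongly positive

# Hypothetical data: higher altitude, colder temperature -> r close to -1
altitudes = [0, 500, 1000, 1500, 2000]
temps = [25, 22, 18, 15, 12]
print(pearson_r(altitudes, temps))   # strongly negative
```

A value near 0 would indicate little or no linear relationship between the two variables.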
14
Q

margarine consumption and divorce rates

A

RANDOM CHANCE

15
Q

studies show that people who have more birthdays live longer

A

Reverse Causality

16
Q

A relationship where one variable (independent variable) CAUSES (is responsible for the occurrence) the other (dependent variable)

A

Causation

17
Q

decapitation causes death

A

causation

18
Q

T or F: Generally, it is very difficult to prove a causal relationship

A

TRUE

19
Q

days with higher ice cream sales have more cases of drowning. What is the confounding factor?

A

warmer weather and swimming is the confounding factor

20
Q

An additional variable causes the change in the dependent variable

A

Confounding factors

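The ice-cream/drowning cards above can be simulated: a confounder (warm weather) drives both variables, producing a strong correlation with no causal link between them. A sketch in Python with invented coefficients and noise; the partial-correlation step controls for the confounder and the relationship largely disappears:

```python
import random

random.seed(0)

def pearson_r(xs, ys):
    # Pearson correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Simulate 365 days: temperature (the confounder) drives BOTH variables
temps = [random.gauss(15, 10) for _ in range(365)]
sales = [2.0 * t + random.gauss(0, 5) for t in temps]       # ice cream sales
drownings = [0.5 * t + random.gauss(0, 3) for t in temps]   # drowning cases

# Sales and drownings are strongly correlated despite no causal link
r_sd = pearson_r(sales, drownings)

# Partial correlation, controlling for temperature, is near zero
r_st = pearson_r(sales, temps)
r_dt = pearson_r(drownings, temps)
partial = (r_sd - r_st * r_dt) / (((1 - r_st**2) * (1 - r_dt**2)) ** 0.5)

print(f"raw r = {r_sd:.2f}, partial r (controlling temp) = {partial:.2f}")
```

Real studies use the same logic: adjusting for a suspected confounder and watching whether the association survives.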
21
Q

Not all associations are causal: Associations may APPEAR causal due to:

A
  1. confounding factors
  2. chance
  3. BIAS
22
Q

Anything that systematically influences the conclusion or distorts comparisons

A

BIAS

23
Q

Systematic differences between groups

A

selection bias

24
Q

what is selection bias likely due to?

A

Likely due to inadequate randomization

25
Systematic differences in the care provided apart from the intervention being assessed
Performance Bias
Ex. Participants in the treatment group spend 10 hours with the researchers, and the control group spends 1 hour; spending time with the researcher is therapeutic
26
Systematic difference in withdrawals from the trial
Attrition Bias
27
Ex. Participants who have a negative reaction (or no benefit) from the study treatment drop out more often than the people who find the treatment helpful; this inflates the positive result
Attrition Bias
28
Systematic differences in outcome assessment
Detection Bias
29
Ex. Study of the effects of working with radioactive material on skin cancer risk. More cases of skin cancer were discovered in patients who reported working with radioactive material. Looking “harder” for one outcome than another
Detection Bias
30
A researcher genuinely believes that the study drug will help psoriasis. If they know who is receiving the real drug, they may underestimate the size of the psoriasis skin lesions when measuring
Detection bias
31
what strategy helps to eliminate detection bias?
Blinding - the researcher should not know who is receiving the drug or the placebo
32
When participants are aware of being observed, they alter their behaviour
Observation Bias Ex. DIET DIARY
33
studies with negative findings are less likely to be submitted and published
Publication Bias
34
When asked about things in the past, may have difficulty remembering and respond in an inaccurate way
Recall Bias
35
Principles of Causation:
  1. Temporality
  2. Strength
  3. Dose-response
  4. Reversibility
  5. Consistency
  6. Biological plausibility
  7. Specificity
  8. Analogy
36
The cause came before the effect
Temporality
37
what are some study types that are limited in ability to detect temporality?
cross-sectional and case-control
38
Stronger association is better evidence of cause/effect relationship
Strength of association
39
Varying amounts of the cause result in varying amounts of the effect
Dose-response Relationship
40
a number of cigarettes smoked per day and lung cancer risk. What is the risk of confounding here?
Dose-response Relationship
RISK OF CONFOUNDING: heavy smokers are more likely to consume more alcohol
41
The association between the cause and the effect is reversible
Reversibility
42
Ex. people who quit smoking have a lower risk of cancer. What is the possible confounding here?
Reversibility confounding: people who quit may start other healthy lifestyle behaviours too!
43
Several studies conducted at different times, in different settings and with different kinds of patients all come to the same conclusions
Consistency
44
If the relationship between cause and effect is consistent with our current knowledge of the mechanisms of disease
Biological Plausibility
Challenges: homeopathy and energy medicine
45
When biological plausibility is present, does it strengthen the case for cause and effect?
YES!
46
One cause → one effect (A only causes B) =stronger evidence
Specificity
47
Vitamin c deficiency → scurvy
Specificity
48
What is an example where Specificity is weak evidence against the cause?
Smoking causes cancer, bronchitis, and periodontal disease (one cause with many effects)
49
The cause-and-effect relationship is strengthened if there are examples of well-established causes that are analogous to the one in question
Analogy Ex. if we know a virus can cause chronic, degenerative CNS disease (Subacute Sclerosing Panencephalitis) it is easier to accept that another virus might cause degeneration of the immunologic system (e.g. HIV and AIDS)
50
Analogy is (strong/weak) evidence for cause
WEAK!
51
what is near the top of the hierarchy?
meta-analysis
52
Order of hierarchy?
Top to bottom:
  1. Clinical practice guidelines
  2. Meta-analysis
  3. RCT
  4. Cohort
  5. Case-control
  6. Case report
  7. Animal and lab studies
53
Do something to the patient, observe what happens
Experimental/intervention studies
Does the treatment change the likelihood of the outcome?
54
Randomized controlled trial
- defined population (inclusion/exclusion criteria)
- 2+ groups: treatment and comparison
- is prospective
55
Key Features of RCT
Randomized: equal chance of being assigned to the intervention or control group
- balances baseline characteristics (sex, family history, age)
Control group: accounts for natural course of illness, placebo effect, confounding factors
May have blinding: minimizes expectation effect
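The "equal chance" idea can be sketched as simple random allocation, which balances baseline characteristics on average across the two arms. The participant IDs and seed below are hypothetical:

```python
import random

def randomize(participants, seed=42):
    # Shuffle so every participant has an equal chance of either arm;
    # baseline characteristics (sex, family history, age) balance on average
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

treatment, control = randomize(["P01", "P02", "P03", "P04", "P05", "P06"])
print("treatment:", treatment)
print("control:", control)
```

Real trials use concealed allocation schedules rather than an ad-hoc shuffle, but the principle is the same: chance, not the researcher, decides group membership.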
56
RCT Use:
**Best Design for confirming cause/effect**
57
Cross-over:
everyone gets intervention AND comparison
58
Where would cross-over not be best?
- Acute conditions: would be resolved after medication A
- Self-limited conditions
Cross-over only works for conditions that respond temporarily to medication
59
Intervention study where participants are given the option between arms
Preference (controlled, not randomized) (ex. Cancer survivors: pick MBT or Tai Chi)
60
intervention study where everyone gets the intervention (knows it), and assesses changes before and after the intervention
Open-label, Pre/Post
61
OBSERVATIONAL STUDIES
Exposure is NOT controlled by the researcher
They ask: is there a relationship between a risk factor (or health factor) and an outcome (harm or benefit)?
62
Ex. Is a high intake of blueberries associated with a lower risk of cancer? Is increased stress associated with an increased risk of a heart attack?
OBSERVATIONAL STUDY
63
Types of observation studies:
  1. cohort
  2. case-control
  3. cross-sectional
64
highest quality observational study
cohort
65
COHORT STUDY
- Recruit the cohort (outcome is NOT present)
- Assess risk/health factors (create a comparison group)
- Follow over time
- See who develops the outcome
- “longitudinal”, “prospective”
- Compare INCIDENCE
66
People without CVD. High saturated fat diet, low saturated fat diet. Who developed CVD?
cohort
67
Case-Control
The outcome is PRESENT at the beginning of the study
RETROSPECTIVE: looks backward in time for exposure (how much meat did you eat 10 years ago?)
68
Find people with AND without CVD. Ask them to think about the past: high or low saturated fat diet? Is there a difference?
Case-control
69
Case-control strengths
- can look at rare outcomes
- faster (no waiting times, minimal loss of participants)
70
Case-control weaknesses
- Assignment to comparison group is NOT random - there could be differences (confounding factors)
- Hard to assess temporality (ex. recall bias)
71
CROSS-SECTIONAL STUDIES
The outcome is PRESENT at the beginning of the study
- assess exposure and outcome at ONE time point
Ex. Patients with CVD and healthy controls, ask about CURRENT meat intake
72
Find people with AND without CVD. Ask about saturated fat in diet. Is there a difference?
Cross-sectional
73
Strengths of observational studies
- can study any question (don’t have to purposefully deprive pregnant women of B12; can look at people who are already doing this)
- can be less expensive or faster
74
CASE Reports, Case Series
- Report previously undocumented events (success, adverse reaction)
- May lead to further action
- Real patients and real clinical approaches
- BUT concerns about bias and generalizability
75
Preclinical Studies:
76
outside the body: cell lines, organs
In vitro:
77
in the non-disease model: healthy human to study pharmacokinetics (absorption, elimination), animal models
In vivo:
78
Types of Synthesis Research
  1. Narrative
  2. Systematic
  3. Meta-analysis
79
Narrative Reviews
- Researcher combines some of the research on a topic
- Reports on the collection of evidence
- Often does NOT describe how they searched and how they decided to include certain studies
- HIGH risk of bias - results often consistent with their hypothesis
80
Systematic reviews
Explicit and rigorous methods to:
  1. identify (2+ databases, specific inclusion/exclusion criteria)
  2. critically appraise
  3. synthesize (combine)
A scientific investigation with pre-planned methodology; enormous effort to minimize bias
81
Meta-Analyses:
Statistically combine the results of studies in a systematic review
- Goes one step further: combines the data
- Visual representation of the studies (Forest Plot)
- Ex. 5 studies with 20 participants each → 1 study with 100 participants
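The combining step can be illustrated with a minimal fixed-effect (inverse-variance) pooling sketch, one common way meta-analyses weight studies. The five effect sizes and variances below are invented for illustration:

```python
def fixed_effect_pool(effects, variances):
    # Inverse-variance weighting: more precise studies get more weight
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# Hypothetical: 5 small trials reporting a mean difference and its variance
effects = [0.30, 0.10, 0.25, 0.15, 0.20]
variances = [0.04, 0.05, 0.04, 0.06, 0.05]
pooled, var = fixed_effect_pool(effects, variances)
print(f"pooled effect = {pooled:.3f}, 95% CI = ±{1.96 * var ** 0.5:.3f}")
```

The pooled variance is smaller than any single study's variance, which is why combining five small trials behaves like one larger, more precise study.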
82
WHOLE PRACTICE RESEARCH NEED:
- Do the results of RCTs apply to real clinical practice of naturopathic medicine?
- Issue: RCTs often use one intervention to treat one disease in a uniform patient population
- Naturopathic medicine: often complex interventions, prescribed in an individualized way, to patients with complex health conditions