EBP EXAM 2:2 Flashcards

1
Q

Internal Validity

A

Did the INDEPENDENT VARIABLE cause the EFFECT?

Or was the effect caused by confounding or extraneous variables? In other words, did the study measure what it intended to measure?

2
Q

External Validity

A

Generalizability of the findings to other people with the same demographics, ailment, etc.

3
Q

When appraising QUANTITATIVE research, what will the APPRAISER consider?

A

Challenges in study design, complexity, practical and ethical considerations, study quality, internal validity, study relevance, etc.

4
Q

What is the FIRST question to consider when appraising QUANTITATIVE research?

A

Was the design appropriate?

5
Q

What is a systematic review?

A

The collection, rigorous analysis, and appraisal of the literature in a given area

6
Q

What is the difference between a systematic review and a meta-analysis?

A

A meta-analysis is like a systematic review, but it goes a step further and statistically combines (pools) the numerical results from the included studies.
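
For illustration only (not part of the original card): a minimal sketch of one common way a meta-analysis pools numerical results, assuming a fixed-effect, inverse-variance weighting; the studies and numbers below are invented.

    import math

    # (effect size, standard error) for three imagined studies
    studies = [(0.40, 0.15), (0.25, 0.20), (0.55, 0.10)]

    weights = [1 / se**2 for _, se in studies]        # inverse-variance weights
    pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))           # standard error of the pooled estimate

    print(f"Pooled effect size: {pooled:.2f} (SE {pooled_se:.2f})")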

7
Q

Describe an RCT

A
  1. People are randomly assigned to an intervention group or a control group
  2. Both groups are pretested on the dependent variable (D.V.)
  3. The treatment is provided to the intervention group, while no treatment or an alternative treatment is given to the control group
  4. The results of the two groups are compared (a minimal simulation sketch follows)
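
A rough simulation sketch of the four steps above (not from the card; the outcome scores and the ~5-point treatment effect are invented assumptions):

    import random
    import statistics

    random.seed(1)

    participants = list(range(40))
    random.shuffle(participants)                      # 1. random assignment
    intervention, control = participants[:20], participants[20:]

    # 2. a pretest on the dependent variable would happen here (omitted for brevity)

    def outcome(in_intervention):
        # 3. imagined treatment effect: the intervention adds ~5 points to the outcome
        return random.gauss(50, 10) + (5 if in_intervention else 0)

    intervention_scores = [outcome(True) for _ in intervention]
    control_scores = [outcome(False) for _ in control]

    # 4. compare results (group means only; a real study would also test significance)
    print("Intervention mean:", round(statistics.mean(intervention_scores), 1))
    print("Control mean:     ", round(statistics.mean(control_scores), 1))
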
8
Q

Describe a COHORT study

A

Naturally occurring groups (cohorts), one with the independent variable (exposure) and one without, are followed over time to see whether they do or do not develop the outcome (dependent variable).

9
Q

Cohort vs. case-control studies

A

Case-control studies find people who already have the outcome (dependent variable) first, then look back to see whether the independent variable (exposure) is related.

10
Q

List common biases in systematic reviews and methods researchers can use to address them.

A
  • Publication bias: a study is more likely to be published if it shows positive, statistically significant results
  • Not all relevant studies are included
  • Only one reviewer or appraiser: having more than one person examine the strength of the evidence helps counter this and strengthens the appraisal of the systematic review
11
Q

List common biases in meta-analyses and methods researchers can use to address them.

A
  • Lack of universal definitions
  • Lack of consistent measures for outcomes

12
Q

List common biases in RCTs and methods researchers can use to address them.

A

• Researchers and/or participants know or guess the treatment group
o Ways to avoid: conceal randomization, masking/blinding
• Dropouts/attrition
o Ways to avoid: intention-to-treat analysis (see the sketch after this list)
• Co-intervention
o Ways to avoid: have good exclusion criteria
• Contamination (the control group gets the treatment inadvertently)
o Ways to avoid: implement controls in the study
• Different therapists (they might do things differently, have better skills, or not give the same kind of therapy)
o Ways to avoid: training, or use only one therapist
• Site of intervention (home vs. SNF)
o Ways to avoid: have consistent sites
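
Since intention-to-treat (ITT) analysis is named above as a way to handle dropouts and crossovers, here is a rough sketch of the idea with invented participants and scores: everyone is analyzed in the group they were randomized to, regardless of what they actually received.

    from statistics import mean

    participants = [
        # (assigned group, group actually received, outcome score) -- invented data
        ("intervention", "intervention", 68),
        ("intervention", "control",      55),   # crossed over, but still counted as intervention in ITT
        ("intervention", "intervention", 71),
        ("control",      "control",      54),
        ("control",      "control",      58),
        ("control",      "intervention", 66),   # contamination, but still counted as control in ITT
    ]

    itt_intervention = [score for assigned, _, score in participants if assigned == "intervention"]
    itt_control      = [score for assigned, _, score in participants if assigned == "control"]

    print("ITT intervention mean:", round(mean(itt_intervention), 1))
    print("ITT control mean:     ", round(mean(itt_control), 1))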

13
Q

Name some additional considerations when appraising the quality of RCTs.

A
  • Inappropriate generalization (the study is too controlled; not every client is the same, which affects the ability to generalize)
  • Lack of longitudinal follow-up (a study may have good findings, but we do not know how long the effects last after the intervention is over)
  • Feasibility (e.g., it may not be feasible to do therapy 4 hrs/day, 5x/week)
  • Not appropriate for all questions
14
Q

List common biases in cohort studies and methods researchers can use to address them.

A

· Selection bias (groups are naturally occurring, not randomly assigned)
· Inconsistent or inaccurate data collection
· Attrition (there will be more dropouts over time if the study is long term)
· Insufficient length of time (was the study long enough to see a difference?)

15
Q

List common biases in case-control studies and methods researchers can use to address them.

A

· Recall/memory bias

16
Q

In addition to design considerations, what other aspects of studies need to be considered when appraising internal validity?

A

· Was the statistical approach appropriate to answer the question?
· What were the results? Are the findings statistically significant? (probability; see the sketch after this list)
· Were the conclusions supported by the study findings?
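
As a rough illustration of checking statistical significance (the question above), a sketch using a two-sample t-test; it assumes SciPy is available, and the scores are invented.

    from scipy import stats

    intervention_scores = [62, 70, 65, 68, 74, 61, 69, 66]
    control_scores      = [58, 60, 63, 55, 61, 59, 64, 57]

    result = stats.ttest_ind(intervention_scores, control_scores)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
    # A p-value below the chosen alpha (commonly 0.05) is conventionally
    # read as "statistically significant."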

17
Q

What questions should guide the appraisal of a study’s external validity?

A

· Was the sampling plan appropriate?
· Was the sample size adequate?
· Was nonresponse/dropout a problem?

18
Q

What questions should guide the appraisal of a study’s impact/clinical utility?

A

· Are the findings clinically significant?
· How large were the treatment effects or effect size? (see the sketch below)
· Does the evidence pertain to my clinical situation?
· Can the therapeutic intervention be implemented in my clinical setting?
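
To illustrate "how large were the treatment effects," a sketch computing Cohen's d from two hypothetical group summaries (all numbers invented; Cohen's d is one common effect-size measure, not necessarily the one a given study reports).

    import math

    mean_tx, sd_tx, n_tx = 62.0, 10.0, 30     # treatment group (invented)
    mean_ct, sd_ct, n_ct = 55.0, 11.0, 30     # control group (invented)

    # pooled standard deviation across the two groups
    sp = math.sqrt(((n_tx - 1) * sd_tx**2 + (n_ct - 1) * sd_ct**2) / (n_tx + n_ct - 2))
    d = (mean_tx - mean_ct) / sp
    print(f"Cohen's d = {d:.2f}")             # ~0.67 here, a moderate-to-large effect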

19
Q

What are some take-home messages from this presentation to keep in mind as you appraise your articles?

A

· Rigorous studies, free of bias, are the best evidence
· Few studies will meet all of the standards
· Many studies will almost, but not quite, answer the PICO question
· Research studies will usually not replicate one another exactly
· Consider whether the evidence is strong, moderate, or weak