Ch 6 Flashcards

1
Q

Purpose of evaluating the evidence

A

Important in becoming a successful evidence-based practitioner

◦ Need to differentiate which studies warrant CHANGING YOUR PRACTICE

◦ Which are valid and clinically useful

2
Q

Qualitative Methods Appraisal – Key Questions

A

1. Was the sample appropriate?
2. Were the data collected appropriately?
3. Were the data analyzed appropriately?
4. Can I transfer the results of this study to my own setting?
5. Does the study adequately address potential ethical issues?
6. Overall: is what the researchers did clear?

3
Q

A combination of two different research designs can be very informative.

A

Mixed-methods

4
Q

True/False: The most appropriate design to be used in a specific study depends more on the research question and the knowledge required than on a priori ideas of best methods.

A

True

5
Q

Ethnography, phenomenology, grounded theory

A

Study design

6
Q

Participant observation, interviews, focus groups, and review of documents or other material

A

Method

7
Q

Continues until sampling redundancy or theoretical saturation of the data is achieved

A

Sampling

8
Q

Research Design Issues in Qualitative Research

A

Study design
Methods
Sampling
Data collection

9
Q

Critical Appraisal – Qualitative Studies

A

◦ Levels of evidence do not apply

◦ Use critical review forms for analysis – for example, the McMaster Qualitative Review Forms (Appendix C in the book)

10
Q

The “Classic” Levels of Evidence for Treatment Effectiveness

A

Level 1 - Systematic reviews, randomized controlled trials
Level 2 - Cohort studies
Level 3 - Case control studies
Level 4 - Case series
Level 5 - Expert opinion, bench research, or theoretical principles

11
Q

Certain consistencies are evident across different types of questions and their associated levels of evidence:

A
1. A systematic review of high-quality studies always provides the highest level of rigor.
2. An individual study using the optimal design for that type of clinical question is considered level 1.
3. Prospective data collection indicates higher study quality than retrospective data collection.
4. Expert opinion, bench research, and conceptual frameworks/theories/first principles are always considered the lowest (level 5) evidence.

12
Q

Classic evidence rating system

A

Grades A through D

13
Q

Consistent level 1 studies supporting a given conclusion.

A

Grade A recommendation

14
Q

Consistent level 2 or 3 studies supporting a given conclusion

A

Grade B recommendation

15
Q

Level 4 studies or extrapolations from level 2 or 3 studies.

A

Grade C recommendation

16
Q

Only level 5 evidence, or whenever the available studies are inconsistent and inconclusive

A

Grade D recommendation
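
Taken together, cards 12 through 16 describe a simple decision rule for turning the classic levels of evidence into recommendation grades. The Python sketch below is illustrative only; the function and parameter names are assumptions, not from the source:

def grade_recommendation(levels, consistent=True, extrapolated=False):
    """Map the best available evidence level (1-5, per the classic
    hierarchy) to a classic A-D recommendation grade."""
    if not consistent:
        return "D"                  # inconsistent or inconclusive studies
    best = min(levels)              # lower number = higher level of evidence
    if best == 1:
        return "A"                  # consistent level 1 studies
    if best in (2, 3):
        # Extrapolations from level 2 or 3 studies drop to grade C
        return "C" if extrapolated else "B"
    if best == 4:
        return "C"                  # level 4 studies (case series)
    return "D"                      # only level 5 evidence

# Example: consistent level 2 cohort studies -> "B"
print(grade_recommendation([2, 2, 3]))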

17
Q

Classifies the quality of evidence into one of four levels: high, moderate, low, and very low. Some of the organisations using this system combine the low and very low categories. Evidence based on randomised controlled trials starts as high-quality evidence but may be downgraded for compromised quality, including study limitations, inconsistency of results (between studies), indirectness of evidence, imprecision (in effect estimates), and reporting bias.

A

GRADE system
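
The GRADE logic on the card above can be sketched the same way: randomised-trial evidence starts at "high" and is stepped down one level for each serious quality concern. The names below are illustrative assumptions, not part of any official GRADE tool:

QUALITY_LEVELS = ["very low", "low", "moderate", "high"]

# The five downgrading factors listed on the card
DOWNGRADE_FACTORS = {
    "study limitations",
    "inconsistency of results",
    "indirectness of evidence",
    "imprecision",
    "reporting bias",
}

def grade_quality(concerns):
    """RCT-based evidence starts as 'high' and is downgraded one
    level for each serious concern that applies."""
    level = QUALITY_LEVELS.index("high")
    for factor in concerns:
        if factor in DOWNGRADE_FACTORS and level > 0:
            level -= 1
    return QUALITY_LEVELS[level]

# Example: randomised trials with serious imprecision -> "moderate"
print(grade_quality(["imprecision"]))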

18
Q

◦ Sample size
◦ Allocation of treatment
◦ Blinding (to treatment allocation and/or outcomes)
◦ Outcome ascertainment
◦ Follow-up
◦ Statistical analysis

A

Sampling of subjects

19
Q

Critical appraisals tools

A

◦ Quick scales or classification systems, or in some cases more detailed rating scales
◦ Range from very structured tools that contain specific questions and defined response categories to more open-ended scales where the assessor makes guided subjective judgments on the quality of aspects of study design, using a framework provided by the assessment tool
◦ Clinicians can select different critical appraisal instruments depending on the clinical question/study design, their familiarity with critical appraisal, personal preferences, and the nature of the ratio betw

20
Q

Types of critical appraisal tools

A

◦ Open-ended questions versus semi-structured (e.g., Law) versus structured (e.g., MacDermid)

◦ Interpretation of items: guided or open

21
Q

Learning Biostatistics and Critical Appraisal

A

Need a working knowledge of biostatistics to comprehend, analyze, and put into practice the evidence from clinical journals and other sources.
◦ Critical appraisal is most useful and most interesting when conducted in a group setting.
◦ Journal Club