Test 2 Flashcards

1
How must we consider research evidence?
in light of patient concerns & preferences AS FILTERED through clinical expertise
2
What is evidentialism?
a framework for understanding how RESEARCH EVIDENCE and PT concerns/history, cultural context & disease trajectory are used in the DECISION MAKING PROCESS by clinicians to determine a POC
3
Can experiences of the clinician and/or pt be considered evidence?
YES! internal evidence & pt preferences, respectively
4
What are the central themes of evidentialism?
clinicians must FIND & KNOW the information upon which decisions are made, and decisions must be based on CURRENT EVIDENCE
5
What is the first assumption of evidentialism?
only decisions that result from thoughtfully responsible behavior are justified
6
What is the second assumption of evidentialism?
external evidence that supports a conclusion about pt care MAY NOT BE SUFFICIENT to make that conclusion clinically significant
7
What is the third assumption of evidentialism?
as the available external evidence changes, the decision maker's responses should change too
8
What is the fourth assumption of evidentialism?
decisions made from a set of external evidence depend on VALIDITY, RELIABILITY AND APPLICABILITY of the evidence AND on the clinician's clinical wisdom
9
what is clinical wisdom?
the clinician's grasp on/understanding of the evidence and how it fits into their practice
10
What is the fifth assumption of evidentialism?
having a belief without supportive external evidence is UNJUSTIFIED.
11
What are the three skill sets that comprise the practitioner's expertise in making EBP decisions?
1. CLINICAL: knowledge, skills & experience related to direct practice with clients
2. TECHNICAL: knowledge, skills & experience related to formulating questions, conducting research & evaluating findings
3. ORGANIZATIONAL: knowledge, skills & experience related to teamwork & leadership
12
What is quality improvement?
process used to identify and resolve performance deficiencies…focuses on IMPROVING OUTCOMES in the delivery of care
13
What is the question QI asks?
how can we do better?
14
What is the relationship between QI, EBP and research?
1. research GENERATES evidence
2. EBP IMPLEMENTS evidence
3. QI EVALUATES how well the EBP is working
15
What is quality assurance?
planned, systematic activities that assure quality requirements of a product/service are fulfilled
-give confidence/guarantee
16
What is quality control?
observation techniques & activities used to fulfill quality requirements; evaluate and guide the process and correct if necessary
17
what is outcomes management?
technology of pt experience designed to help providers & pts make rational healthcare choices based on insight into the effect of choices on the patient's life
18
what basic question does outcomes management ask?
what is best for the patient, and how do we get there?
19
how do we support outcomes management?
emphasize practice standards that providers use to select interventions, measure pt functional status & well-being & clinical outcomes, aggregate data, analyze & disseminate outcomes
20
What type of data would the quality management department have?
incident reports (IRs), pt satisfaction scores, data collected for accrediting bodies
21
what data would the finance department have?
charges for tests/meds/equipment, pt days, readmission rates, pt demographics
22
what data does HR have?
staff turnover, staff education, contract labor use, provider skill level, staff ratios
23
What data does clinical systems have?
diagnostic test results & pharmacy data
24
what data does administration have?
pt complaints
25
what data does the EHR have?
pt level info captured through documentation of clinical care
26
What do we evaluate when looking at developing new measurement instruments?
the instrument's VALIDITY & RELIABILITY
27
What is validity?
is the instrument actually measuring what it is supposed to measure
28
What is content validity?
the minimum demonstration of validity needed (reflected through a panel of experts that review the instrument)
29
What is reliability?
does the instrument measure the construct consistently every time it is used?
30
what is Cronbach's alpha?
a measure of an instrument's internal consistency; a value of 0.80 or greater indicates the instrument should be reliable
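Note (illustration only, not from the course material): a minimal Python sketch of how Cronbach's alpha could be computed from a respondents-by-items score matrix; the survey scores below are made up.

import numpy as np

def cronbach_alpha(item_scores):
    # item_scores: rows = respondents, columns = survey items
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# hypothetical responses: 5 respondents x 4 items on a 1-5 scale
scores = np.array([[4, 5, 4, 5],
                   [3, 3, 4, 3],
                   [5, 5, 5, 4],
                   [2, 3, 2, 3],
                   [4, 4, 5, 4]])
print(round(cronbach_alpha(scores), 2))  # 0.80 or greater would suggest acceptable reliability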
31
What is nominal data?
data is sorted into categories, numbers are only used for labeling
32
what are examples of nominal data?
gender, presence or absence of a quality
33
what is the lowest level of data measurement?
nominal
34
what is ordinal data?
when data can be ranked in order, but the absolute difference between each level is not equal
35
what is an example of ordinal data?
likert scales
36
What are interval data?
numeric data with equal and consistent math values separating each measurement point; lacks an absolute zero
37
what is an example of interval data?
Fahrenheit temp scale
38
what is ratio data?
same data characteristics as interval-level data, but also has an absolute zero
39
what is an example of ratio data?
Kelvin temp scale *this type not usually used in healthcare*
40
How do we present QI in an understandable way to stakeholders?
1. scorecards | 2. dashboards
41
What are scorecards?
show how indicators compare against each other; allows for observation of intended & unintended consequences
42
What are dashboards?
compare hospital to hospital, indicators are focused on performance; shows a snapshot of how care provided feeds into the overall hospital performance
43
When do we need IRB approval?
- any study involving human subjects - any studies involving PHI (per HIPAA) - if it is possible that knowledge might be shared outside of the institution
44
what is beneficence?
the importance of DOING GOOD for patients
45
what is nonmaleficence?
addressing the importance of DOING NO HARM to patients
46
what is autonomy?
pt has right to make decisions about their health, lives, bodies
47
what is justice?
fairness; resources should be distributed fairly among people and without prejudice
48
what core ethical principles go with safety?
nonmaleficence
49
what core ethical principles go with effectiveness?
beneficence
50
what core ethical principles go with patient-centeredness?
autonomy
51
what core ethical principles go with timeliness?
beneficence & nonmaleficence
52
what core ethical principles go with equity?
justice
53
what core ethical principles go with efficiency?
beneficence & nonmaleficence
54
when can EBQI initiatives conflict with ethical principles?
1. attempts to improve quality for some pts may harm others 2. strategies to improve quality may be ineffective & waste resources 3. activities that are called QI may be more accurately described as clinical research
55
what is clinical research?
activities involving direct interaction by investigators with human subjects or material of human origin
56
what does clinical research generate?
knowledge on which practice should be based
57
is clinical research meant to be generalizable?
YES
58
what are evidence-based quality improvement (EBQI) initiatives?
systematic, evidence-based activities designed to immediately improve healthcare delivery in specific settings
59
are EBQI projects generalizable?
NO
60
what are the similarities between EBQI and clinical research?
- both have human participants - both use similar data collection procedures to eval outcomes - both may use same data analysis methods
61
Is participation optional/voluntary for clinical research?
YES, consent is needed
62
is participation voluntary/optional for EBP/EBPQI?
NO, all pts receive same evidence-based intervention as part of routine care
63
is the IRB involved in clinical research?
YES
64
is the IRB involved in EBPQI?
NO
65
what is the aim of clinical research?
to generalize findings to a population wider than the research subjects
66
what is the aim of EBP/EBPQI?
to improve care of patients in a specific organization/setting
67
who is the intended audience for clinical research?
people outside the organization to whom findings can be disseminated
68
who is the intended audience for EBP/EBPQI?
provide internal data to practitioners to guide further practice changes
69
is research an integral part of clinical practice?
no
70
does research have a risk for patients?
yes, HIGH risk
71
does EBPQI have a risk to the pts?
no, or very low risk
72
is EBPQI an integral part of clinical practice?
YES
73
for EBP, research or EBPQI to be worth doing...
it must have value to our patients and our practice!
74
how should inclusion and exclusion criteria be decided for recruiting study participants?
should be based on SCIENTIFIC RATIONALE, not convenience or vulnerability
75
how should we determine the pts involved in EBPQI?
should be determined by the population of pts served by the organization, NOT on the ability to generalize the outcomes
76
what is important about a risk-benefit ratio for all types of studies?
ratio should be FAVORABLE...want to minimize risks and maximize gains
77
what is important about independent review of research, ebp and ebpqi?
it is ethically required because of potential human rights violations and conflicts of interest
78
is it in the pt's rights to refuse ebp treatment?
YES; well-being of individual takes precedence
79
when do we need informed consent?
for clinical research studies, NOT FOR EBP OR EBPQI
80
controversy about ethical principles as applied to EBQI versus clinical research is seen most clearly in which of the domains of ethics?
independent review
81
What is a case study?
report that describes the history of a single patient (or small group of patients), usually in the form of a story
82
What do case studies focus on?
one aspect of a condition, usually something rare
83
Is there a control group for a case study?
NO, meaning there is no statistical validity
84
What are disadvantages to case studies?
lowest level of evidence, based on few clinical cases, unable to generalize to gen pop
85
What are advantages to a case study?
may alert HCPs to new issues and rare/adverse events, can assist in hypothesis generation
86
What is another term for a case study?
case report
87
what is a case control study?
a RETROSPECTIVE investigation that selects individuals with an outcome and looks back to identify the possible conditions or characteristics associated with outcome
88
what are two groups in case control studies?
CASE: have the condition; CONTROL: do not have the condition
89
What are the advantages to case control studies?
can be done quickly, efficient for conditions with rare outcomes
90
What are disadvantages to case control studies?
cannot directly identify absolute risk of a bad outcome
91
What is another name for a cohort study?
longitudinal study
92
What is a cohort study
study in which a sample exposed to a treatment or condition is followed over time for a presumed outcome
93
Is there always a comparison group in a cohort study?
NO, not always
94
Are cohort studies prospective or retrospective?
BOTH; prospective is followed into the future, retrospective is when past medical records are used to identify exposure factors
95
What is a famous example of a cohort study?
Framingham heart study
96
What are the strengths to cohort studies?
establish risk or causation directly, can identify multiple outcomes, can study rare exposure
97
What are weaknesses of cohort studies?
requires large sample size, not good for rare diseases, requires long period of time, can't study multiple exposures
98
What is the gold standard of study designs?
Randomized control trials (RCTs)
99
What is an RCT?
compares effectiveness of different interventions; the treatment group receives treatment under investigation and control group receives placebo or standard treatment, participants are randomly assigned to avoid bias
100
What is a crossover design?
group is given experimental intervention, then the comparison intervention (both groups receive both treatments)
101
What is a double blind RCT?
neither medical staff nor patient know which of the treatments/therapies the patient is receiving
102
What type of RCT is the most rigorous clinical research design?
double blind! reduces bias via randomization and controls for the placebo effect
103
What are advantages to RCT?
reduce bias
104
What are disadvantages of RCTs?
expensive (lots of time and money)
105
What are the types of research evidence reviews?
narrative reviews, integrative reviews, meta-analysis and systematic review
106
What is a narrative review?
a "snap shot" of a clinical problem or issue that is scientifically controversial
107
Is there a systematic approach to a narrative review?
NO, it is the author's point of view and then research that supports it
108
What is an integrative review?
summarizes other articles or previously researched studies (observational studies, qualitative, RCTs, clinical expert opinions, etc.)
109
What is the main goal of an integrative review?
identify the need for future research
110
Is there a summary statistic for an integrative review?
NO due to limitations in the studies found
111
What is the strongest type of evidence?
systematic review
112
What is a systematic review?
comprehensive survey of a topic, collects and critically analyzes multiple studies and summarizes existing clinical research
113
Can systematic reviews be generalized?
YES
114
What is the advantage of a systematic review?
cost effective and accurate!
115
what are disadvantages of systematic review?
time consuming
116
What is a meta-analysis?
method used to obtain an aggregated effect of summarized results of multiple studies
117
what does a meta-analysis create?
ends with ONE SUMMARY STATISTIC for all of the quantitative studies included
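Note (illustrative sketch, invented numbers): fixed-effect, inverse-variance weighting is one common way a meta-analysis pools individual study results into a single summary statistic, as in this minimal Python example.

# hypothetical study results: (effect size, standard error) for each included study
studies = [(0.30, 0.10), (0.45, 0.15), (0.25, 0.08)]
weights = [1 / se ** 2 for _, se in studies]              # inverse-variance weights
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5                     # standard error of the pooled effect
print(f"summary effect = {pooled:.2f}, "
      f"95% CI ({pooled - 1.96 * pooled_se:.2f}, {pooled + 1.96 * pooled_se:.2f})")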
118
What are advantages of meta-analyses?
very large sample size, increased accuracy
119
What are disadvantages of meta-analyses?
all research used must be quantitative, reliable and comparable to one another
120
What is a visual representation of the data summarized in a meta-analysis?
BLOBBOGRAM or forest plot
121
What does a circle in a forest plot represent?
measure of the effect of an individual study (size changes with increased contribution)
122
What does a horizontal line represent in a forest plot?
confidence interval
123
what does a diamond represent in a forest plot?
summary treatment effect of all studies (vertical point is result, horizontal points are CI for result)
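Note (illustration only, all numbers invented): a minimal matplotlib sketch that puts the three forest-plot elements together: circles for individual studies, horizontal CI lines, and a summary diamond.

import matplotlib.pyplot as plt

# hypothetical studies: (label, effect, CI low, CI high, marker size ~ study weight)
studies = [("Study A", 0.8, 0.5, 1.3, 60),
           ("Study B", 0.6, 0.4, 0.9, 120),
           ("Study C", 1.1, 0.7, 1.7, 80)]
summary = (0.75, 0.60, 0.95)  # pooled effect and its confidence interval

fig, ax = plt.subplots()
for i, (label, effect, lo, hi, size) in enumerate(studies):
    ax.plot([lo, hi], [i, i], color="black")              # horizontal line = CI
    ax.scatter(effect, i, s=size, color="black")          # circle = individual study effect
ax.plot([summary[1], summary[2]], [len(studies)] * 2, color="black")
ax.scatter(summary[0], len(studies), s=150, marker="D")   # diamond = summary treatment effect
ax.axvline(1.0, linestyle="--")                           # line of no effect for a ratio measure
ax.set_yticks(range(len(studies) + 1))
ax.set_yticklabels([s[0] for s in studies] + ["Summary"])
ax.set_xlabel("Odds ratio")
plt.show()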
124
What is a confidence interval?
a range inside which clinicians can be reasonably confident of the precision and repeatability of findings (usually 95-99%)
125
What is an odds ratio?
the odds of the outcome in the treatment group divided by the odds of the outcome in the control group
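Note (worked example with invented counts): how an odds ratio falls out of a simple 2x2 table in Python.

# hypothetical 2x2 table of outcomes
treated_with_outcome, treated_without_outcome = 20, 80
control_with_outcome, control_without_outcome = 40, 60

odds_treated = treated_with_outcome / treated_without_outcome   # 20/80 = 0.25
odds_control = control_with_outcome / control_without_outcome   # 40/60 ~ 0.67
odds_ratio = odds_treated / odds_control                        # ~ 0.375
print(f"odds ratio = {odds_ratio:.2f}")  # < 1 suggests lower odds of the outcome with treatment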
126
What is an intervention group?
group receiving the new treatment
127
what is the control group?
group that does not receive new treatment (receives standard or no treatment)
128
what is the hypothesis/aim/purpose?
a proposition/theory that is the starting point for investigation
129
what is the null hypothesis?
the hypothesis that there is NO significant difference between control and intervention group
130
what is the power analysis?
statistically how many people are needed to be in the study to make it trustworthy
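Note (illustrative sketch assuming the statsmodels library; the effect size, alpha, and power values are arbitrary): one way a power analysis might estimate the sample size needed per group for a two-group comparison.

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,  # anticipated medium effect
                                   alpha=0.05,       # significance level
                                   power=0.80)       # 80% chance of detecting the effect
print(round(n_per_group))  # roughly 64 per group under these assumptions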
131
what is validity?
that the instruments used are measuring what they are supposed to be measuring
132
what is reliability?
that the instruments used will perform the same way each time they are used under the same conditions
133
what are the two types of study validity?
internal and external validity
134
what is internal validity?
the extent to which it can be said that the independent variable (intervention) causes a change in the dependent variable (the outcome) and the results are NOT DUE TO OTHER FACTORS
135
what is external validity?
the ability to generalize the findings from a study to the larger population
136
what is instrument validity?
refers to whether the study tools are measuring what they are supposed to be measuring
137
what is study reliability?
refers to the repeatability of the study (if the study was repeated under the same circumstances, would the results be the same)
138
if a quantitative study is not repeatable...
it is not RELIABLE
139
what is instrument reliability?
refers to the consistency of the instruments...will they yield the same results under the same conditions?
140
what does the critical appraisal of quantitative studies focus on?
1. validity of the study 2. reliability of the study 3. applicability of the study
141
what is a study bias?
anything that distorts study findings in a systematic way arising from the methodology of the study; can compromise validity of findings
142
what are the 8 types of study bias?
1. selection bias 2. knowledge of who receives the intervention 3. gatekeeping 4. measurement bias 5. recall bias 6. information bias 7. loss to follow up (attrition) 8. contamination
143
what is selection bias?
can occur if participants are selectively assigned to a group
144
how can we reduce selection bias?
randomly assign participants to groups
145
what is bias related to knowledge of who receives the intervention
occurs if subjects, or those measuring outcomes, know subject group assignment
146
how do we reduce bias of knowledge who receives the intervention?
double blinding (both subject and those measuring outcomes do not know group assignment)
147
what is gatekeeping bias?
occurs in convenience sampling when pts are chosen because they are more likely to volunteer (sample may not be representative of the target population)
148
what is measurement bias?
can occur if instruments are incorrectly calibrated or if data collectors deviate from established data collection protocols
149
what is recall bias?
can occur when subjects are asked to recall past actions or events...subjects may give answers they think are socially acceptable
150
what is information bias?
can occur in longitudinal studies if participants know they are the subjects of the study, they may unintentionally act differently
151
what is attrition?
loss to follow up, can occur due to unforeseen side effects of intervention or burdensome data collection procedures
152
what is contamination?
can occur if intervention and control groups have interaction and information is shared (especially with educational studies)
153
what are confounding variables?
when relationship between two variables is actually due to a third unknown or unconsidered variable
154
what does reliability mean?
repeatability
155
what is the magnitude of effect?
degree of difference in effect between study groups
156
what is effect?
rate of occurrence in each group for the outcome of interest (how much does the treatment affect people)
157
what is a p value?
denotes statistical significance; the probability of obtaining results at least as extreme as those observed if the null hypothesis were true; smaller p = less likely the findings occurred by chance
158
what is a confidence interval?
range in which true effect lies within a given degree of certainty...a range in which clinicians can be reasonably confident that they will find a result when they implement study findings in their own practice
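Note (illustrative sketch assuming scipy/numpy; the outcome scores are invented): how a p value and a 95% confidence interval might be obtained when comparing an intervention group with a control group.

import numpy as np
from scipy import stats

intervention = [72, 75, 78, 80, 74, 77, 79, 76]   # hypothetical outcome scores
control      = [70, 68, 73, 71, 69, 72, 74, 70]

# p value from an independent-samples t-test (null hypothesis: no difference between groups)
result = stats.ttest_ind(intervention, control)
print(f"p value = {result.pvalue:.3f}")

# 95% CI for the difference in means, using the pooled standard deviation
n1, n2 = len(intervention), len(control)
diff = np.mean(intervention) - np.mean(control)
sp = np.sqrt(((n1 - 1) * np.var(intervention, ddof=1)
              + (n2 - 1) * np.var(control, ddof=1)) / (n1 + n2 - 2))
se = sp * np.sqrt(1 / n1 + 1 / n2)
t_crit = stats.t.ppf(0.975, n1 + n2 - 2)
print(f"95% CI = ({diff - t_crit * se:.2f}, {diff + t_crit * se:.2f})")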
159
what are the questions to ask to determine if the study was reliable?
1. do the numbers add up? 2. magnitude of effect 3. strength of association 4. clinical significance (are the findings clinically meaningful?) 5. precision of the measurement of effect
160
what does qualitative research address?
clinical questions that address human responses and meaning, patient preferences and values
161
what type of data is collected with qualitative research?
subjective
162
what are types of qualitative research?
1. ethnography 2. grounded theory 3. phenomenology 4. hermeneutics
163
what is ethnography?
the study of a social group's culture through combining participant observation, interviews and collection of artifacts; DESCRIPTIVE
164
does ethnography aim to create generalizable results?
NO, they aim to generate questions for research that will be followed up with different methodology
165
what are common terms used with ethnography?
fieldwork, participant observation, culture, key informant
166
what are examples of ethnographic groups?
adult patients in trauma center, kids with a disease, etc.
167
what is grounded theory?
tries to typify or categorize research, describes a process by which a group moves through experiences over time and addresses how people deal with life situations
168
what is empirical evidence?
information acquired by observation or experimentation
169
what is phenomenology?
the study of the essence/meaning through descriptions of lived experience
170
what does phenomenology address?
questions about how people perceive their experiences in a specific situation to better understand a phenomenon
171
what is phenomenological reduction?
process of reflection, imagination and intuition (when you take information and break it down)
172
what must researchers do in a phenomenological study?
1. introspection: recognize their own feelings 2. bracketing: suspend their own beliefs about the experience
173
what is hermeneutics?
philosophy/practice of interpretation; views human lived experiences via the interpreter's dialogical engagement
174
what is hermeneutics linked to?
phenomenology
175
why does hermeneutics have bias?
because you are internally interpreting someone else's experience and then writing about it
176
what are common data collection tools for qualitative research?
observation/fieldnotes, interviews/focus groups, narrative analysis, sampling strategies, data management, mixing methods
177
what is participant observation?
when the researcher both participates in and observes the setting...can range from complete observer to complete participant
178
what are fieldnotes?
highly detailed records of all that can be remembered of observations, researcher actions and researcher-participant interactions
179
what are analytic notes?
written by researchers about ideas for analysis, issues to pursue, people to contact (regarding the study itself, not what is being studied)
180
what are the three types of interview?
1. unstructured, open-ended 2. semistructured 3. structured, open-ended
181
what are unstructured, open-ended interviews?
allows informants the fullest range of freedom to describe their experiences
182
what are semistructured interviews?
use a flexible question structure that allows informant to answer in their own way
183
what are structured, open-ended interviews?
question format with little flexibility in the way the questions are to be asked, but informants can respond on their own terms
184
what is a focus group?
group interviews that generate data on designated topics through discussion and interaction
185
what is narrative analysis?
generates and interprets stories about life experiences that allows understanding of interview data
186
what is content analysis?
breaks down data by coding, comparing and categorizing information, then reconstituting it into a new form
187
what are the types of sampling strategies?
1. purposeful/purposive 2. theoretical 3. nominated/snowball 4. volunteer/convenience
188
what is purposeful/purposive sampling?
uses intentional selection of people in accordance with needs of the study
189
what is theoretical sampling?
form of purposeful sampling used in grounded theory
190
what is nominated/snowball sampling?
recruits participants with the help of people already involved in the study
191
what is volunteer/convenience sampling?
obtains participants by solicitation or advertising for volunteers who meet study criteria
192
what is qualitative data management?
uses systems that organize, catalog, code, store and retrieve data, guides how researchers approach analysis
193
what is computer-assisted qualitative data analysis?
area of technologic innovation that has resulted in word processing and software to support data management
194
what is qualitative data analysis?
variety of techniques that assist in movement back and forth between data and ideas throughout the research process to finally reach synthesis
195
who can do mixed methods studies?
only experienced researchers who truly understand the paradigms of both quantitative and qualitative research
196
what is mixed methods?
combines qualitative and quantitative methods; useful for focusing on research questions that call for understanding of real-life, multilevel, cultural perspectives on intervention success
197
how do we appraise ethnographic field studies?
time, place, social circumstance, language, intimacy, consensus
198
how do we appraise grounded theory?
fit, grab, work, modifiability
199
how do we appraise phenomenology?
orientation, strength, richness, depth
200
how do we appraise qualitative studies based on trustworthiness?
1. credibility 2. transferability 3. dependability 4. confirmability
201
what is credibility?
accuracy and validity assured by thorough documentation
202
credibility in qualitative studies is to ______ in quantitative studies
internal validity
203
what is transferability?
information that is sufficient for a research consumer to determine if findings are meaningful to other people in similar situations
204
transferability in qualitative studies is to _____ in quantitative studies
external validity
205
what is dependability?
a research process that is carefully documented to provide evidence of how conclusions were reached and whether a researcher might obtain similar findings in similar condition
206
dependability in qualitative studies is to _____ in quantitative
reliability
207
what is confirmability?
providing substantiation that findings and interpretations are grounded in the data
208
confirmability in qualitative research is to ____ in quantitative research
objectivity
209
what is a meta-synthesis?
comparative analysis of individual qualitative interpretations that synthesizes the findings of several studies
210
what are questions to ask when critically appraising qualitative studies?
1. validity/trustworthiness of results 2. implications of research 3. effect on reader 4. results of study 5. what is the study approach 6. significance/importance of study 7. sampling strategy 8. clear data collection procedures 9. description of data analysis procedures 10. presentation of findings 11. overall results 12. how do results help me in caring for my patients