HaDSoc Week 2 Flashcards

1
Q

What are the arguments for evidence based healthcare?

A

Ineffective and inappropriate interventions waste resources that could be used more effectively
Variations in treatment create inequities
Evidence shows that clinicians have often persisted in using interventions that are ineffective
Or failed to take up other interventions known to be effective
And tolerated huge variations in practice

2
Q

What are the reasons for variations in clinical practice or ineffective/inappropriate interventions?

A

Practices are influenced too much by:

  • professional opinion
  • clinical fashion
  • historical practice and precedent
  • organisational and social culture
3
Q

What is evidence based practice?

A

Involves the integration of individual clinical expertise and patient choice with the best available external clinical evidence from systematic research
Not ‘cookbook’ medicine

4
Q

Give an example where clinicians have persisted in using ineffective interventions

A

Prophylactic use of lidocaine during MI was shown to be more harmful than placebo
Anti-arrhythmic drugs were estimated to have caused 20,000-70,000 deaths per year in the USA

5
Q

Give an example of clinicians not doing things that are effective

A

Treatment of eclamptic seizures with magnesium sulphate (MgSO4) - used successfully in the USA for 60 years
Benefit clearly demonstrated
But by 1992 only 2% of UK clinicians were using it

6
Q

Who set out the principles of EBP?

A

Archie Cochrane

7
Q

Give an example that demonstrates why systematic reviews are useful

A

Corticosteroid treatment for women at risk of giving birth prematurely versus placebo - 7 RCTs
1972 - first RCT published, showing likely benefit
1979 - seven RCTs published - a meta-analysis at this point would have shown benefit
1989 - systematic review published, showing a 30-50% reduction in the likelihood of the baby dying
Tens of thousands of babies suffered, needed more expensive treatment or died because of the long gap between the first RCT and the systematic review
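
To make the 1979 point concrete, a fixed-effect (inverse-variance) meta-analysis simply pools the individual trial estimates, weighted by their precision, so an overall benefit can emerge even when each small trial is inconclusive on its own. A minimal sketch in Python - the trial numbers below are entirely hypothetical and are not the actual corticosteroid trial data:

```python
import math

# Hypothetical (events, total) for treatment and control arms of three small trials.
# Illustrative numbers only - not the real corticosteroid trial results.
trials = [(36, 532, 60, 538), (12, 131, 20, 137), (3, 67, 7, 59)]

weights, weighted_logs = [], []
for e1, n1, e0, n0 in trials:
    log_rr = math.log((e1 / n1) / (e0 / n0))   # log risk ratio for one trial
    var = 1/e1 - 1/n1 + 1/e0 - 1/n0            # approximate variance of the log risk ratio
    weights.append(1 / var)                    # inverse-variance weight
    weighted_logs.append(log_rr / var)         # weight * log risk ratio

pooled = sum(weighted_logs) / sum(weights)     # pooled log risk ratio
se = math.sqrt(1 / sum(weights))               # standard error of the pooled estimate
print(f"Pooled RR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96*se):.2f} to {math.exp(pooled + 1.96*se):.2f})")
```

A pooled risk ratio below 1 with a confidence interval excluding 1 is the kind of overall result a meta-analysis of the seven trials could have demonstrated a decade before the 1989 systematic review.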

8
Q

Why are systematic reviews needed?

A

Traditional “narrative” literature reviews may be biased and subjective
It is often not clear how the studies were identified for review
The quality of the studies reviewed is variable and sometimes poor
Systematic reviews address these problems and can help resolve clinical uncertainty
They can also highlight gaps in research or expose poor-quality research

9
Q

Why are systematic reviews useful to clinicians?

A

By appraising and integrating findings they offer both quality control and increased certainty
Offer authoritative, reliable, generalisable and up-to-date conclusions
Save clinicians from having to locate and appraise the studies for themselves
May reduce the delay between research discoveries and implementation
Help to prevent biased decisions (e.g. based on tradition or what the clinician thinks is best)
Can relatively easily be converted into guidelines and recommendations

10
Q

Should doctors just accept the findings of a systematic review?

A

No - they need to be able to appraise them to be satisfied about the quality of the evidence

11
Q

How can a doctor assess the quality of evidence?

A

Using a critical appraisal tool/instrument

Suggests things to look for and the questions to ask of research articles

12
Q

Where can systematic reviews be found?

A

Reputable, peer-reviewed journals e.g. The Lancet, BMJ
EBP-specific journals - focus on critical appraisal and systematic reviews
Cochrane Library
Centre for Reviews and Dissemination
NIHR Dissemination Centre
NIHR Health Technology Assessment programme

13
Q

What do we mean by practical criticisms of EBM?

A

Critique around whether it is actually possible to use EBP

14
Q

What are some practical criticisms of EBP?

A

May be impossible to create and maintain systematic reviews across all specialities
May be challenging and expensive to disseminate and implement findings
RCTs are not always feasible, necessary or desirable - e.g. if something has already been shown to be effective in practice, do we really need an RCT? There are also ethical considerations: people may not want to join an RCT if they know they might not receive the active treatment
Choice of outcomes is often very biomedical - for complex interventions it can be hard to define a clear primary outcome, so the trial may never get done and the intervention is never available to be picked up by NICE and funded
Requires good faith on the part of pharmaceutical companies - publication bias means results showing that something doesn't work or has negative effects are less likely to be published, yet clinicians need to know all the relevant data that is out there

15
Q

What do we mean by philosophical criticisms of EBP?

A

Critique about how desirable EBP is

16
Q

What are some philosophical criticisms of EBP?

A

Doesn't align with most doctors' modes of reasoning - there is a tension between what works on a population basis and what is appropriate for a particular individual
Aggregate, population-level outcomes don't mean an intervention will work for an individual - to what extent are you confident it will work for this patient, and how will you explain that to them?
Potential of EBM to create unreflective rule followers who apply guidelines in an unquestioning, uncritical way - e.g. guidelines based on single-morbidity patients cannot necessarily be applied to complex, multimorbid patients
Might be understood as a means of legitimising rationing, which could undermine trust in the doctor-patient relationship and ultimately in the NHS
We don't have a good handle on when professional responsibility/autonomy should take precedence over what the evidence says

17
Q

What are some of the problems with getting evidence into practice?

A

Doctors don't know about the evidence - dissemination is ineffective, doctors are not incentivised to keep up to date or don't have the time (unintentional non-adherence)
Doctors know the evidence but don't use it - habit, organisational culture, professional judgement (intentional non-adherence)
Organisational systems cannot support innovation - managers lack the clout to change things, or there are not enough people with the appropriate skills, or not enough time, equipment or drugs
Commissioning decisions reflect different priorities - resource allocation; if the patient wants something else, how do you decide whether their choice is acceptable?
Resources not available - money, people, clinic time

18
Q

Why do we need social research?

A

So we can be more confident in answering questions about social life, e.g. if women were told about the dangers of smoking during pregnancy, would they stop smoking?

19
Q

Why do doctors need to know more about social research methods?

A

Policies and practices are based on social research e.g. NICE
Doctors have a responsibility to assess, appraise and use this research
Need to integrate and critically evaluate multiple resources

20
Q

What are the two main methods of social research?

A

Qualitative

Quantitative

21
Q

What is quantitative research?

A

Collection of numerical data
Begins with an idea/hypothesis
By deduction allows conclusion to be drawn
Strengths are reliability and repeatability

22
Q

Which research designs are considered quantitative?

A
Experimental (RCT)
Case control
Cohort
Cross-sectional surveys
Secondary analysis of data from:
- official statistics e.g. the Census
- other national surveys e.g. charities, universities, think tanks, polls
- local and regional surveys e.g. NHS, universities, local councils
23
Q

What is a very common method utilised in quantitative research?

A

Questionnaires

24
Q

Give examples of what questionnaires can be used to measure

A

Exposure to risk factors, e.g. the effect of lifestyle and dietary factors on cancer
Knowledge and attitudes
Satisfaction with health services

25
Q

What two things should a questionnaire be?

A

Valid - measures what it is supposed to measure
Reliable - measures things consistently, so that differences in responses come from differences between participants, not from differences in how the questions were understood or interpreted
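
As one concrete and commonly used check of reliability, the internal consistency of a multi-item scale can be summarised with Cronbach's alpha. A minimal sketch in Python, using made-up responses to a hypothetical four-item scale:

```python
import numpy as np

# Hypothetical responses: rows = respondents, columns = items scored 1-5.
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

k = scores.shape[1]                          # number of items in the scale
item_vars = scores.var(axis=0, ddof=1)       # variance of each item across respondents
total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scale score
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")      # values closer to 1 suggest internal consistency
```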

26
Q

What is the difference between unpublished and published questionnaires?

A

Published questionnaires may already have been tested for validity and reliability

Unpublished questionnaires are developed in specific contexts and so still need to be tested for validity and reliability

27
Q

What types of questions should be included in a questionnaire?

A

Mainly closed questions
Offer an “other - please specify” option
Can include open-ended questions - these require instructions, take longer to complete and are more difficult to analyse (coding, more qualitative)

28
Q

How can questionnaires be administered?

A

On paper
Over the phone, on the internet or via an interviewer - if more than one person is asking the questions, how do you ensure they are all asked in the same way so as not to influence the answers?

29
Q

What are quantitative methods good at?

A

Describing
Measuring
Finding relationships between things
Allowing comparisons

30
Q

What are quantitative methods bad at?

A

May force people into inappropriate categories, undermining the quality of the data
Don't allow people to express things in the way that they want - e.g. for personal questions
May not access all important information
May not be effective in establishing causality

31
Q

What is the aim of qualitative research?

A

To make sense of phenomena in terms of the meanings people bring to them
Understanding people's perspectives
Emphasises the meanings, views and experiences of respondents
Analysis emphasises the researcher's interpretation rather than measurement
Provides insight into people's behaviour
E.g.
Why don't people give up smoking?
What is it like to live with rheumatoid arthritis?

32
Q

What methods are described as being qualitative?

A

Observation and ethnography
Interviews
Focus groups
Documentary and media analysis

33
Q

What is ethnography and what does it involve?

A

The study of human behaviour in its natural context

34
Q

What is the advantage of ethnography?

A

You can observe what people actually do rather than relying on what they tell you - this captures things people might not report because they are biased, unaware of them or feel they aren't worth mentioning - and you can also see the context in which behaviour occurs

35
Q

What are two forms of ethnographic study?

A

Participant observation - the observer is involved in the happenings, e.g. a clinician observing a ward that they work on
Non-participant observation - the observer stays in the background - unobtrusive

36
Q

What is a disadvantage of observation?

A

Labour intensive

37
Q

Describe interviews as a qualitative method

A

Semi-structured
Prompt guide
Clear agenda of topics but not followed rigidly
May seem conversational
Emphasis on participants giving their perspective - interviewer should facilitate this

38
Q

What is a disadvantage of interviews?

A

Time consuming

39
Q

Describe focus groups as a qualitative method

A

A group of people are asked about their perceptions, opinions, beliefs and attitudes towards a concept/idea/product

40
Q

What are the advantages of focus groups?

A

Quick for establishing parameters or assessing group-based, collective understanding of an issue
May encourage people to participate - if they feel supported by other group members

41
Q

What are the disadvantages of focus groups?

A

Not useful for exploring individual experience
Some topics are too sensitive
Deviant views may be inhibited
Difficult to arrange - have to consider the power dynamics and homogeneity of the group, and need a good facilitator who can encourage quiet people to talk and manage dominant ones

42
Q

Describe documentary and media analysis as a qualitative method

A

Independent evidence e.g. Medical records, patient diaries

Television, newspaper and media coverage

43
Q

What are the advantages of documentary and media analysis?

A

May provide historical context

Useful for subjects that are difficult to investigate

44
Q

What are the disadvantages of documentary and media analysis?

A

They are artful reconstructions of the events they describe - e.g. the author decided what to write and what to leave out

45
Q

How is qualitative data analysed?

A

Iterative process - labour intensive
Inductive approach, i.e. theory emerges from the data rather than a specific theory being tested
Have to read and re-read the data
Try to identify themes
Produce specifications for the themes - codes
Assign data to the themes
Constantly compare the data against the themes as the analysis develops
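
Purely to illustrate the “assign data to themes” step as a data structure - the real work is interpretive, done by reading and re-reading, not by keyword matching - here is a minimal Python sketch using hypothetical interview excerpts and a hypothetical coding frame:

```python
from collections import defaultdict

# Hypothetical coding frame (theme -> indicative keywords) and interview excerpts.
coding_frame = {
    "barriers": ["time", "cost", "stress"],
    "support": ["family", "nurse", "group"],
}
excerpts = [
    "I just don't have the time to get to the clinic",
    "My family keep me going when it gets hard",
]

themes = defaultdict(list)
for excerpt in excerpts:
    for theme, keywords in coding_frame.items():
        if any(word in excerpt.lower() for word in keywords):
            themes[theme].append(excerpt)    # assign the excerpt to the theme

for theme, coded in themes.items():
    print(theme, "->", coded)
```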

46
Q

What are qualitative methods good for?

A

Understanding perspective
Accessing information not revealed by quantitative methods
Explaining relationships between variables

47
Q

What are qualitative methods bad for?

A

Finding consistent relationships between variables

Generalisability - sample sizes are small and may not be representative of the entire population

48
Q

How can we assess the quality of qualitative data?

A

CASP - rigour, credibility, relevance
Transparency around sampling, methods and analysis is key
Good qualitative research leaves an audit trail - e.g. how things were done, who did them, etc.

49
Q

Compare and contrast qualitative and quantitative methods

A
Quantitative:
Numbers
Point of view of researcher - hypothesis, design of the study
Researcher distant
Theory testing (deductive)
Static
Structured
Generalisation (larger sample)
Macro
Behaviour
Causality?
Qualitative:
Words and artefacts
Points of view of participants
Researcher close
Theory emergent (inductive)
Process
Less structured
Contextual understanding
Micro
Meaning
Causality?
50
Q

What factors affect the choice of approach and study design?

A
  1. The topic and research question
  2. Expertise and preferences of the team
  3. Funding and time available
  4. Funders and/or audience
Different methods can be used in the same study, especially if they are complementary