Chapter 3 - Reviewing Literature Flashcards
What is Peer Review?
- Peer review is the evaluation of a paper by experts in that field
- No guarantee a published paper is trustworthy or worthwhile
What are the common flaws seen in published papers?
- lack of originality
- wrong design choice
- inadequate sample size
- unjustified conclusion
- conflict of interest
- poor writing
How to assess the quality of a paper?
- results (what did the study find?)
- validity (do the results match the conclusions?)
- applicability (will they help you with your own clients or patients?)
Define the IMRAD format
Introduction:
- why the authors decided to do this particular piece of research
Methods:
- how they did it, and how they chose to analyse their results
Results:
- what they found
Discussion:
- what they think the results mean
What general evaluation questions do you need to ask for all papers?
- Who wrote the paper?
- Is the title appropriate and illustrative? Is the abstract informative?
- What was the research design? Was it appropriate to the question?
- What was the research question, and why was the study needed?
- Do the results answer the question?
What are the major aspects you need to consider when reading an original (primary) research paper?
- the sample/participants
- the setting
- how data was collected
- how it was analysed
Define Sample and Setting
- A strong sample is vital for a successful research project.
- Participants in a study may differ from real-life patients (age, gender, co-morbidities etc.) so it is important the sample is close enough to your own patients or clients to make the results applicable.
Sample and Setting - What questions should you ask?
- Who was included in and excluded from the study?
- How were the participants recruited?
- Did the participants receive any special care apart from any intervention that is the focus of the study?
- Where did the study take place?
- Did the study have ethical clearance?
Define Data Collection
The research design will dictate how data should be collected (e.g. specific tools or instruments, or focus groups)
Data Collection - What questions should you ask?
- What data was collected?
- Would this be sufficient to answer the question?
- Who collected it and how? If a tool was used, was it validated and appropriate to the design?
- What outcome was measured or explored?
- How was bias avoided or minimised?
- How did the researchers strengthen the validity/reliability or credibility/dependability of their study?
Define Quantitative Data
Data in the form of numbers, produced through statistical analysis, whether descriptive or inferential
Quantitative Data - What questions should you ask?
- Was the study large enough, continued for long enough, and was follow‐up complete enough, to make the results credible?
- Was assessment ‘blind’ or ‘masked’ to avoid performance bias?
- Are the results statistically or clinically significant?
Define Qualitative Data
Analysing qualitative data is more subjective than analysing quantitative studies. You are looking for enough description of the analysis process to convince you the research is credible.
Qualitative Data - What questions should you ask?
- Has the phenomenon been described fully?
- Has the researcher examined their own assumptions/role in the research?
- Have the themes, structures or processes been explained, and are they clear and convincing?
- Has more than one person been involved in the coding process, to strengthen the analysis?
- Have validation techniques been used, such as triangulation, member checking, etc.?
- Are they supported by evidence in the form of participant quotations?
What is a systematic review?
- A systematic review is an overview of primary studies
- For EBP, the most important and useful review is the systematic review, with or without meta‐analysis