Week 12: Evaluating Research Reports Flashcards
Evidence-Based Practice (EBP): health care providers
- incorporate research findings into clinical judgement and treatment decisions
- evaluate research reports
- determine if findings provide sufficient evidence to support clinical practices
EBP: critical appraisal
- determine the scientific merit of a research report
- determine its applicability to clinical decision making
When evaluating any research report, what should you examine and determine?
- validity of design and analysis
- meaningfulness of findings
- relevance of results to practice
- it is not enough to answer yes or no; you need the rationale for and implications of each answer to evaluate the study’s value
Research validity: types of validity in explanatory designs
- statistical conclusion validity
- internal validity
- construct validity
- external validity
Statistical conclusion validity
- is there a relationship between the independent and dependent variables
- appropriate use of statistical procedures for analyzing data
Internal validity
is there evidence of a causal relationship between IV and DV
Construct validity
to what constructs can results be generalized
External validity
can the results be generalized to other persons, settings, or times
Threats to statistical conclusion validity: statistical power
- statistical power: the ability to detect real relationships between the IV and DV; power is strongly affected by sample size, so a low-power (often small) study may fail to identify a statistical relationship that actually exists (see the sketch below)
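A minimal sketch of how power depends on sample size, assuming Python with statsmodels (the effect size and group sizes below are made-up example values):

```python
# Power depends on sample size: the same effect is easier to detect with more subjects.
# Hypothetical values throughout; uses statsmodels' TTestIndPower.
from statsmodels.stats.power import TTestIndPower

power_calc = TTestIndPower()
for n_per_group in (10, 30, 100):
    power = power_calc.power(effect_size=0.5, nobs1=n_per_group, alpha=0.05, ratio=1.0)
    print(f"n = {n_per_group:>3} per group -> power = {power:.2f}")
# Power rises from roughly 0.2 toward 0.9+, showing why small samples may miss real effects.
```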
Threats to statistical conclusion validity: violated assumptions
- violated assumptions: most statistical tests are based on assumptions about the data; if those assumptions are not met, inferences may be erroneous (see the sketch below)
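A minimal sketch of checking assumptions before a two-group comparison, assuming Python with scipy (the data are hypothetical):

```python
# Checking common t-test assumptions before analysis (hypothetical data).
from scipy import stats

group_a = [12.1, 11.4, 13.0, 12.7, 11.9, 12.3]
group_b = [10.2, 10.8, 11.1, 10.5, 10.9, 10.4]

print(stats.shapiro(group_a))          # normality of group A (Shapiro-Wilk)
print(stats.shapiro(group_b))          # normality of group B
print(stats.levene(group_a, group_b))  # equality of variances (Levene)

# If assumptions are clearly violated, a nonparametric alternative may be safer.
print(stats.mannwhitneyu(group_a, group_b))
```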
Threats to statistical conclusion validity: variance and reliability
- statistical conclusions are threatened by extraneous factors that increase variability within the data:
- unreliable measurement
- failure to standardize the protocol
- environmental interference
- heterogeneity of subjects
Failure to use intention-to-treat analysis
- analysis should maintain the original randomization to groups
- analyze participants according to their original group assignment, even if they did not complete the assigned intervention (see the sketch below)
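A minimal sketch contrasting intention-to-treat analysis with an as-treated analysis, assuming Python with pandas (the data and column names are hypothetical):

```python
# Intention-to-treat: keep every participant in the group they were randomized to.
# DataFrame, column names, and values are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "assigned_group": ["treatment", "treatment", "control", "control"],
    "received_group": ["treatment", "control",   "control", "control"],  # one crossover
    "outcome_score":  [12.0, 9.0, 8.0, 7.5],
})

# ITT analysis: group by ORIGINAL assignment, preserving randomization.
itt_means = df.groupby("assigned_group")["outcome_score"].mean()

# As-treated analysis (for contrast): group by what was actually received,
# which can reintroduce the confounding that randomization removed.
as_treated_means = df.groupby("received_group")["outcome_score"].mean()

print(itt_means, as_treated_means, sep="\n\n")
```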
Evaluation process - introduction (what should the introduction have/do)
- establish the problem being investigated and its importance
- demonstrate that researchers have thoroughly synthesized literature
- provide rationale for pursuing this research
- contribution of this research
- purpose, aims, or hypotheses
Evaluation process: methods
- participants (target population & accessible population)
- recruitment
- inclusion/exclusion criteria
- sample size: authors should describe how it was estimated (related to statistical power; see the sketch below)
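A minimal sketch of estimating the required sample size from a target power, assuming Python with statsmodels (the effect size, alpha, and target power are example values; this is the inverse of the earlier power sketch):

```python
# Solve for the per-group sample size that reaches a target power.
# Effect size, alpha, and power are example values only.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"About {n_per_group:.0f} participants are needed per group")  # ~64 for these values
```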
Design
- appropriate to answer research question
- control for potential confounding variables
- rationale for choices of interventions and measurements
- adequacy of time frame of study
Evaluation process: data collection (methods)
- operational definition of variables
- measurement reliability (assessed within the study/based on prior research)
- measurement validity (based on prior research)
- data collection described clearly