Research Design Flashcards
Research
- Investigation through scientific method to establish facts
- Based on a hypothesis & intends to be generalised
- Controlled internally, relies on validity
Evaluation
- Use of a framework to determine value of a program/process
- Intent to improve and make recommendations
- Controlled externally, relies on feasibility to determine value
Scientific method
Systematic approach in research to identify problems, collect and analyse data, and develop theory
Evidence based practice
Integration of best available evidence into practice to improve patient care, build credibility and accountability
Research paradigm
- Philosophical model/framework to guide research questions, methods, data collection and analysis
Components of research paradigms
- Ontology: study of existence, provides world view to guide study
- Epistemology: study of knowledge, provides focus
- Methodology: framework for conducting study
3 Philosophical paradigms
Positivism
Interpretivism/constructivist
Critical approach
Positivism
- Explain truth through scientific method to assess for causal relationships (quantitative)
- Deductive: theory –> conclusion
- Reductionism/determinism: phenomena reduced to measurable variables; outcomes do not occur by chance
- Examples: descriptive (cross sectional etc), RCT
- Clear, quick analysis, generalisable, high rigour
- High cost, researcher bias, limited probing
Interpretivism/constructivist
- Descriptive, explores meaning
- Inductive: observation –> concepts/meaning
- Subjective: researcher interpretation, value in dialogue and social constructs
- Examples: phenomenology, descriptive, ethnography, grounded theory
- Low cost, complex phenomena, member checking
- Researcher bias, lack of generalisability, biased subjects, lack of research clarity
Critical approach
- Focus on society to critique and challenge power dynamics
- Goal is to encourage equality, change social structures
- Examples: emancipatory research ( benefit to disadvantaged), action research, feminist research
Quantitative design
- Positivist
- Control: use of comparison group to eliminate extraneous variables and threats to internal validity such as history, maturation and selection
- Randomisation: create similar groups to ensure changes are due to intervention
- Manipulation & blinding
- Quantitative designs use at least 1 of these; best to have all 3
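The randomisation step above can be sketched in Python — a minimal illustration of simple randomisation into two similar groups, not a clinical allocation tool (the participant IDs and seed are hypothetical):

```python
import random

def randomise(participants, seed=42):
    """Simple randomisation: shuffle participants, then split into
    two groups of similar size so any difference in outcomes can be
    attributed to the intervention rather than group composition."""
    rng = random.Random(seed)  # fixed seed only for reproducibility of the sketch
    shuffled = participants[:]
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]  # intervention group, control group

# Hypothetical participant IDs 1..20
intervention, control = randomise(list(range(1, 21)))
```

In a real RCT the allocation sequence would be concealed and often stratified or blocked; this sketch shows only the core idea that chance, not the researcher, assigns groups.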
Types of quantitative designs
Experimental and non-experimental
Experimental
- Manipulation of the IV to observe the effect on the DV
- Limits confounding factors, establishes causality BUT requires extensive review & prep, cost
Types of experimental designs
- RCT: causality through control, randomisation & manipulation (high internal validity)
- Quasi-experimental: manipulation but lacks either/both control & randomisation (weak causality)
Non-experimental/observational
- No IV manipulation; does not establish causality, only explores relationships between variables
- Low level of evidence/internal validity, high bias
Types of non-experimental designs
- Observational: explores relationships between variables when little is known
- Descriptive: measures variables of interest
- Cross sectional: frequency and characteristics of x in a population at a point in time
- Cohort studies: disease free population studies over time, with exposed and unexposed groups compared (prospective - defines sample & measures beforehand - or retrospective)
- Case-control: retrospective look back for explanatory factors to link exposure to outcome (compare cases & controls)
Types of qualitative designs
Descriptive
Phenomenology
Ethnography
Grounded theory
Action research
Systematic reviews
Descriptive (QL)
- Summary of events/experience with no theory/methodology
- Small data set, thematic analysis
Phenomenology
- explores thoughts, feelings & behaviours to understand meaning
- No causal inferences
Ethnography
- Study of culture from the insider's perspective (emic) rather than the outside observer's (etic); occurs in the field
- Traditional: single unfamiliar setting over time
- Focused: pre-identified topic with subcultural groups
- Auto: study of own culture
Grounded theory
- Inductively derived theory about a phenomenon, grounded in the collected data
Action research
- Research at the same time as action (change and improvement)
Systematic reviews
- Critical assessment and evaluation of research studies about a particular topic
Independent vs dependent variable
- I: factor influencing the outcome
- D: result or outcome being studied
Internal validity
Accuracy: whether the outcome is attributed to the cause (strength of causal relationship)
External validity
Extent to which the findings can be generalised beyond the sample
Construct validity
- Association between data and prediction of theoretical trait (construct)
Face validity
- Whether it appears to measure what it's supposed to (subjective measure)
Content validity
- Whether the measured content covers the full domain (subjective but relies on experts)
Sources of error in internal validity
- **History:** events during the study that affect the DV
- **Maturation:** changes within a person that affect the DV
- **Selection:** poor selection that results in a non-representative sample
- **Instrumentation:** measurement or observation inconsistencies
- **Testing effect:** prior exposure to the test improves later performance
- **Mortality:** people dropping out
- **Participant reaction bias:** participants change behaviour because they are being studied
- **Experimenter bias:** researcher expectations influence results
- **Confounding variables:** extraneous variables related to both the IV and DV
Reliability
- How consistent and reproducible the results are (trustworthiness of the measurement)
Measures of reliability
- Test-retest method: assess the correlation between use of the same instrument on the same sample
- Internal consistency: use of correlation coefficients to assess correlation between different items to measure the same thing
- Alternative forms: 2 similar forms of a test given to the same population to eliminate practice effects
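The first two measures above have standard computations: Pearson's correlation for test-retest reliability and Cronbach's alpha for internal consistency. A minimal sketch (the score data are hypothetical):

```python
def pearson_r(x, y):
    """Test-retest reliability: correlation between two administrations
    of the same instrument to the same sample."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def cronbach_alpha(items):
    """Internal consistency: `items` is a list of per-item score lists,
    one list per item, same respondents in the same order."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = sum(var(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return k / (k - 1) * (1 - item_vars / var(totals))

# Hypothetical scores from two administrations of one instrument
test1 = [10, 12, 14, 15, 18]
test2 = [11, 12, 13, 16, 18]
r = pearson_r(test1, test2)  # close to 1 → high test-retest reliability
```

As a rule of thumb, r or alpha above roughly 0.7–0.8 is usually read as acceptable reliability, though the threshold depends on the stakes of the measurement.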
Sources of reliability error
- True experimental variability: real differences
- Random fluctuations: mood, noise, fatigue
- Systematic error: confounding variables (subjective influence)
- Inter-observer error
Factors influencing reliability
- Length of test
- Objectivity of the assessment
- Method of estimating reliability
Rigour
- Reliability, validity and trustworthiness of research
4 components:
- Credibility/IV
- Transferability/EV
- Dependability/reliability
- Confirmability/objectivity
How to ensure credibility/IV
- Selection
- Triangulation
- Extensive data collection
- Member checking, reflection
How to ensure transferability/EV
- Appropriate sampling & description of sample and settings
- Strong internal validity
How to ensure dependability/reliability
- Audit trail
- External audit
- Instrument consistency assessment
How to ensure confirmability/objectivity
Strategies to limit bias
* Audit/peer review
* Triangulation
* Member checking & reflection
NHMRC levels of evidence
- Level 1: SR of RCT
- Level 2: RCT
- Level 3: cohort study, case-control
- Level 4: cross sectional, pre-post test