AOR 4 Evaluation & Research Flashcards
Both program evaluations and research require designing plans that include ____________________
selecting, adapting, or creating valid & reliable data collection instruments, developing sampling plans, collecting & managing data, analyzing collected data, interpreting results, applying findings, & communicating results
What is the primary focus of evaluations? Research?
Evaluations - determining if program objectives have been met/success of program
Research - identifying new knowledge or practices that answer questions/test hypotheses about health-related theory, behavior, or phenomenon
Delimitations & examples
decisions made by the evaluator or researcher to set the parameters & boundaries of a study (helps manage the scope of the study)
- why some literature is not reviewed
- why certain populations are not studied
- why certain methods are not used
Evaluation
series of steps that evaluators use to assess a process or program to provide evidence and feedback about the program
Limitations
phenomena evaluator/researcher cannot control that place restrictions on methods & conclusions
- e.g. time, nature of data collection, instruments, sample, analysis
Logic Model
depicts aspects of a program, including inputs, outputs, & outcomes
- provides a scaled-down, somewhat linear, visual depiction of a program
Research
organized process in which a researcher uses the scientific method to generate new knowledge
Reliability
Consistency, dependability, & stability of measurement process
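One common statistic for quantifying this consistency is Cronbach's alpha (internal-consistency reliability); a minimal sketch, using made-up item scores:
```python
# Minimal sketch: Cronbach's alpha as one measure of internal-consistency
# reliability. The item scores below are invented for illustration.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# 5 respondents answering a 4-item Likert scale (1-5)
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")  # higher values = more consistent items
```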
Validity
degree to which a test/assessment measures what it is intended to measure
Unit of Analysis
what or who is being studied or evaluated
Formative Evaluation
Process of assessing quality of a program during planning and implementation
- used by evaluators & researchers
Process Evaluation
measures quality of performance and delivery throughout program implementation
Summative Evaluation
evaluation that occurs after the program has ended
- Designed to produce data and information on the program’s efficacy or effectiveness during its implementation phase
- Provides data on the extent of the achievement of goals (impact & outcome)
Impact Evaluation
Immediate & observable effects of a program leading to desired outcomes
Outcome Evaluation
assesses whether there is also a demonstrable change in the targeted health outcome
- What changed about the public health problem
- Focuses on ultimate goal, product, or policy
- measured in health status, morbidity, mortality, etc.
Choosing what instrument to use to collect data is based on ______________
goal of data collection, population under investigation, & resources of those trying to collect data
- data collection instruments have both advantages & disadvantages
HES must follow _________________ when planning & conducting evaluation
federal & state laws and regulations, organizational & institutional policies, & professional standards
Data instruments (whether new, existing, adapted) should be tested for ________________
readability/literacy level
Why should data collecting instruments be tested?
To ensure validity of responses
Examples of instrument readability tools
SMOG and Flesch-Kincaid
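A minimal sketch of both grade-level formulas in Python; the syllable counter is a rough vowel-group heuristic (not a dictionary lookup), so treat the results as approximate:
```python
# Rough sketch of the Flesch-Kincaid Grade Level and SMOG formulas.
import re
from math import sqrt

def count_syllables(word: str) -> int:
    # Crude approximation: count groups of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = [count_syllables(w) for w in words]
    polysyllables = sum(1 for s in syllables if s >= 3)

    # Flesch-Kincaid Grade Level
    fk = 0.39 * (len(words) / sentences) + 11.8 * (sum(syllables) / len(words)) - 15.59
    # SMOG grade (formula is normed on a 30-sentence sample, scaled here)
    smog = 1.0430 * sqrt(polysyllables * (30 / sentences)) + 3.1291
    return {"flesch_kincaid_grade": round(fk, 1), "smog_grade": round(smog, 1)}

print(readability("Eat more vegetables. Walk every day. Ask your doctor about recommended screenings."))
```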
Advantages and Disadvantages of using existing data collection instruments
Advantages: previously tested for reliability, validity, direct comparison measures, reduced cost (compared to creating new instrument), & user familiarity
Disadvantage: potential for unreliable measures given different population demographics & situations
What should HES consider before using an existing data collection instrument?
- if item is appropriate for intended purpose
- if language appropriate for population
- whether the instrument has been tested with a sample from the intended population
- to whom you should give credit for using instrument
Difference between research and evaluation
Research: conducted with intent to generalize findings
Evaluation: conducted to determine whether a specific program was effective
The most rigorous evaluation or research design available should be used. However, what factors could require a less rigorous approach?
Ethics, cost, politics, & resources
Considerations for implementation
- find reliable, trustworthy, skilled people to COLLECT, ENTER, ANALYZE, & MANAGE data to ensure quality results
- define roles, responsibilities, & skills needed to collect QUALITATIVE data (focus groups, interviews, etc.), which may differ from those needed to collect QUANTITATIVE data (surveys)
- monitor data collection to ensure the process is implemented as planned & established time frames & objectives are maintained
- maintain integrity of data collected & ensure protocols address quality control measures (during both collection and entering of data)
Field procedures for data collection
- protocols for scheduling initial contacts with respondents
- Introducing instrument to respondent
- keeping track of individuals contacted
- following up with non-respondents when appropriate
All data must be carefully ____________ into a usable format
coded & organized
Research and Evaluation can help record ______________ & ____________
what changes have occurred & identify what led to those changes
Advantages & Disadvantages of online surveys
Advantages: cost-effectiveness, versatility, larger potential reach, convenience, anonymity of respondents, & significantly reduced risk of data entry errors during transcription
Disadvantages: lower response rates, language barriers, not feasible for all populations (lack of internet access), & risk of multiple responses from the same respondent
What should data management/analysis plan include?
- procedures for transferring data from instruments to data analysis software
- detail how data will be scored & coded
- how missing data will be managed
- how outliers will be handled
- scoring guide
- Data Screening - assessing accuracy of data entry (see the sketch below)
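A minimal sketch of several of these plan elements, assuming a hypothetical CSV export with a Likert-type item named q1 (file and column names are invented):
```python
# Hypothetical data-management sketch: coding, screening, missing data, outliers.
import pandas as pd

df = pd.read_csv("survey_responses.csv")   # assumed export from the instrument

# Scoring guide: code categorical responses to numeric scores
likert = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly agree": 5}
df["q1_score"] = df["q1"].map(likert)      # unmapped responses become NaN

# Data screening: flag values outside the allowed range (possible entry errors)
entry_errors = df[(df["q1_score"] < 1) | (df["q1_score"] > 5)]

# Missing data: document the rate, then apply the rule chosen in the plan
missing_rate = df["q1_score"].isna().mean()
df["q1_score"] = df["q1_score"].fillna(df["q1_score"].median())  # one possible rule

# Outliers: flag scores more than 3 standard deviations from the mean
z = (df["q1_score"] - df["q1_score"].mean()) / df["q1_score"].std()
outliers = df[z.abs() > 3]

print(f"{len(entry_errors)} entry errors, {missing_rate:.0%} missing, {len(outliers)} outliers")
```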
What can data screening tell us?
If statistical assumptions are met
Problematic Outliers
not representative of population
Beneficial Outliers
those representative of population
Multivariate Outliers
unusual combinations of scores on different variables
- hard to detect without statistical tests (see the sketch below)
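One such test is the Mahalanobis distance compared against a chi-square cutoff; a minimal sketch with simulated scores (all values are invented):
```python
# Sketch: flag multivariate outliers via Mahalanobis distance vs. a chi-square cutoff.
import numpy as np
from scipy.stats import chi2

def mahalanobis_outliers(X: np.ndarray, alpha: float = 0.001) -> np.ndarray:
    """Return a boolean mask marking rows flagged as multivariate outliers."""
    diff = X - X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared distances
    cutoff = chi2.ppf(1 - alpha, df=X.shape[1])           # one common cutoff
    return d2 > cutoff

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 3))   # 200 respondents, 3 standardized measures
data[0] = [4.0, -4.0, 4.0]         # an unusual combination of otherwise plausible scores
print(np.where(mahalanobis_outliers(data))[0])   # row 0 should be among the flagged indices
```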
Missing Data
observations that were intended to be made but were not
What should guide data analysis?
research/evaluation questions & level of measurement of data
Correlates (relationship/connection where something affects or is dependent on another) can be derived through ______________________
interpretation of data, reach & effectiveness, or size of effect (see the sketch below)
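A minimal sketch of two common ways such a relationship or size of effect is expressed (Pearson's r and Cohen's d); all values are invented:
```python
# Sketch: a correlation coefficient and a standardized effect size.
import numpy as np
from scipy import stats

# Correlation: sessions attended vs. knowledge score
attendance = np.array([1, 2, 2, 3, 4, 5, 5, 6])
knowledge = np.array([55, 60, 58, 65, 70, 72, 75, 80])
r, p = stats.pearsonr(attendance, knowledge)

# Effect size (Cohen's d): program group vs. comparison group, equal group sizes
program = np.array([72, 75, 70, 78, 74])
comparison = np.array([65, 68, 66, 70, 64])
pooled_sd = np.sqrt((program.var(ddof=1) + comparison.var(ddof=1)) / 2)
d = (program.mean() - comparison.mean()) / pooled_sd

print(f"r = {r:.2f} (p = {p:.3f}), Cohen's d = {d:.2f}")
```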
HES need to incorporate ___________________ into their evaluation & research findings & __________________ into decision making, policy development, & implementation of programs
evidence based practice approach; scientific evidence
What does an evidence-based practice approach look like?
- best available scientific evidence (from literature) & data are combined
- program planning frameworks are employed
- engage the community
- programmatic evaluation is used
- results are disseminated
How do HES plan for future programs/interventions?
interpretation of evidence to determine significance & draw relevant inferences
- data interpretation is stronger when key stakeholders are included & community perspectives are gathered during the evaluation process
Findings from a study must be analyzed based on specific ________________ of the study
Delimitations
e.g., narrowing the study by geographic location, time, & population traits
Findings from evaluation & research are subject to systematic error known as ____________
BIAS - from sampling, design, implementation, or analysis
Confounding Variables
extraneous variables outside scope of intervention that can impact the results
- variables that are not accounted for in the study design
What should be done before applying recommendations to programs or policies?
stakeholders should have the opportunity to review & discuss the research or evaluation findings