Evaluation 2 Flashcards
Step 5: Select Measurement Methods and Procedures
Purpose: To determine what to measure, how to measure it, and what data collection procedures to use.
Key Decisions to Make:
What to measure? → Indicators
How to collect data?
When to collect data?
Where/who to collect data from? → Sources
What to Measure – Indicators
Specific, observable, and measurable characteristics that show goal/objective achievement
Types:
Outcome indicators: Track progress toward outcome objectives
Process indicators: Track progress toward process objectives
Guided by: Evaluation purpose/questions & objectives
≥1 indicator per objective
Selection factors:
Importance
Accessibility
Reliability
Validity
Relevance (to audience, setting, objectives)
How to Collect the Data
Review existing data:
Records, meeting notes, tracking tools
Talk to people:
Interviews, focus groups, town halls
Obtain written responses:
Questionnaires, quizzes
Observe/monitor/track:
Staff/participant behavior, standardized tools, outcome measures
Method types: Quantitative and/or qualitative
When to Collect the Data
Timing options:
Before, during, after program
Consider:
Process vs outcome evaluation
Specific indicators
Single vs multiple timepoints
Program timeline & internal/external factors
Who/Where to Collect Data From
Sources:
Target population (primary & secondary audiences)
Program partners, staff
Internal records/tracking docs
Secondary sources
Considerations:
Recruitment methods
Sample size (see sketch after this list)
Representation
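Sample size is often set with a power analysis before recruitment. A minimal Python sketch using statsmodels, assuming a two-group comparison; the effect size, alpha, and power values are conventional placeholders, not figures from these notes:

```python
# Hypothetical power analysis for a two-group comparison.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,  # assumed "medium" standardized difference (Cohen's d)
    alpha=0.05,       # conventional significance level
    power=0.80,       # conventional target power
)
print(f"Participants needed per group: {n_per_group:.0f}")  # ~64
```

Replace the assumed effect size with an estimate from pilot data or the literature before using a result like this.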
Data Collection Matrix Includes (see sketch after this list):
Indicators (linked to objectives)
Data collection methods:
Existing data/documents
Direct conversation
Written responses
Observation/monitoring
Roles/responsibilities
Data sources
Collection timeline
Analysis plan
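The matrix can be drafted as a simple table. A minimal sketch using pandas; both rows are hypothetical examples, not indicators from the actual plan:

```python
# Hypothetical data collection matrix; every row here is illustrative.
import pandas as pd

matrix = pd.DataFrame(
    [
        ["PO: # of sessions delivered", "Existing data/documents",
         "Internal tracking docs", "Program staff", "Monthly", "Frequency"],
        ["OO: change in participant knowledge", "Written responses",
         "Target population", "Evaluator", "Pre/post program", "t-test"],
    ],
    columns=["Indicator", "Method", "Source", "Responsible", "Timeline", "Analysis"],
)
print(matrix.to_string(index=False))
```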
Step 6: Create the Evaluation Plan
Purpose: To document the decisions made in Steps 1-5 in an evaluation plan.
What is the Evaluation Plan?
A written document outlining how the evaluation will be conducted
Includes:
Program description
Purpose of evaluation
Data collection matrix
Budget
Plan for use of results
Ethical concerns + how they’ll be managed (e.g., informed consent)
Appendices may include:
Data collection tools
Timelines
Consent forms, questionnaires, protocols
Ethical Concerns in Evaluation
Informed consent
Confidentiality & data protection
Respect for participants (especially vulnerable populations)
Use of results (transparency & fairness)
Step 7: Collect the Data
Purpose: To collect the data needed to answer each evaluation question.
Collect the Data
Create standardized procedures
Train data collectors
Pilot test tools/processes
Collect data as planned
Refine based on ongoing feedback
Step 8: Process and Analyze Data
Purpose: To synthesize and analyze the data collected for the evaluation in order to:
Answer evaluation questions
Assess program quality & effectiveness
Steps:
Ensure quality: Check early responses, missing/invalid data, accuracy
Organize data: Prep variables (totals, categories, etc.)
Analyze data (see sketch after this list):
Qualitative: Content analysis
Quantitative:
Descriptive:
Categorical: Frequency
Continuous: Mean, Median, Range, SD
Inferential: t-test, chi-square, correlation, ANOVA, regression
Synthesize data:
Summarize & present
Interpret with partners
Compare with norms, population data, other programs
Check representativeness
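A minimal Python sketch of the quality-check and quantitative-analysis steps above; the dataset, variable names, and scores are invented for illustration, and pandas/scipy are assumed:

```python
# Hypothetical analysis walk-through: quality check -> descriptive -> inferential.
import pandas as pd
from scipy import stats

# Invented example data: a categorical variable and a continuous score by group.
df = pd.DataFrame({
    "attended": ["yes", "yes", "no", "yes", None, "no", "yes", "yes"],
    "group":    ["A",   "A",   "A",  "A",   "B",  "B",  "B",   "B"],
    "score":    [12.0,  15.0,  11.0, 14.0,  9.0,  8.0,  10.0,  7.0],
})

# Ensure quality: flag missing/invalid values before analysis.
print(df.isna().sum())

# Descriptive - categorical: frequency counts.
print(df["attended"].value_counts(dropna=False))

# Descriptive - continuous: mean, median, range, SD.
s = df["score"]
print(s.mean(), s.median(), s.max() - s.min(), s.std())

# Inferential: independent-samples t-test comparing groups A and B.
a = df.loc[df["group"] == "A", "score"]
b = df.loc[df["group"] == "B", "score"]
t, p = stats.ttest_ind(a, b)
print(f"t = {t:.2f}, p = {p:.3f}")
```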
Evaluation Plan – Examples
Process Objectives:
PO1: Falls Program
Participants – Quantitative → Frequency
Feedback – Qualitative → Content analysis
PO2: Flyer Distribution
Flyers – Quantitative → Frequency
Satisfaction – Quantitative → Mean
Outcome Objectives:
OO1: ↑ Awareness of fall risks
Survey → t-test (see sketch after this list)
OO3: ↑ Knowledge of home safety checklist
Survey → (likely t-test)
OO7: ↑ Physical activity
Survey/Accelerometers → (trend/repeated measures)
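OO1's pre/post survey comparison would typically use a paired t-test, since the same participants are measured at two timepoints. A minimal sketch with invented scores (nothing here comes from the actual program data):

```python
# Hypothetical pre/post awareness scores for the same participants (OO1).
from scipy import stats

pre  = [3, 4, 2, 5, 3, 4, 2, 3]  # invented baseline awareness scores
post = [5, 5, 4, 5, 4, 5, 3, 4]  # invented follow-up scores

t, p = stats.ttest_rel(pre, post)  # paired t-test: same people, two timepoints
print(f"t = {t:.2f}, p = {p:.3f}")
```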