Evaluation 2 Flashcards

1
Q

Step 5: Select Measurement Methods and Procedures

A

Purpose: To determine what to measure, how to measure it, and what data collection procedures to use.

2
Q

Key Decisions to Make:

A

What to measure? → Indicators
How to collect data?
When to collect data?
Where/who to collect data from? → Sources

3
Q

What to Measure – Indicators

A

Specific, observable, and measurable characteristics that show whether goals and objectives are being achieved

Types:
Outcome indicators: Track progress toward outcome objectives
Process indicators: Track progress toward process objectives
Guided by: Evaluation purpose/questions & objectives
≥1 indicator per objective
Selection factors:
Importance
Accessibility
Reliability
Validity
Relevance (to audience, setting, objectives)
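
A minimal sketch (in Python, with hypothetical indicator wording borrowed from the falls-program objectives that appear later in this deck) of the "≥1 indicator per objective" rule, with each indicator tagged as process or outcome:

```python
# Hypothetical objectives and indicators; shows >= 1 indicator per objective,
# each tagged as a process or outcome indicator.
indicators = {
    "OO1: Increase awareness of fall risks": [
        {"indicator": "Mean awareness score on post-program survey", "type": "outcome"},
    ],
    "PO2: Distribute program flyers": [
        {"indicator": "Number of flyers distributed", "type": "process"},
        {"indicator": "Mean satisfaction rating with flyer content", "type": "process"},
    ],
}

# Simple check that no objective is left without an indicator.
for objective, inds in indicators.items():
    assert len(inds) >= 1, f"No indicator defined for {objective}"
```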

4
Q

How to Collect the Data

A

Review existing data:
Records, meeting notes, tracking tools

Talk to people:
Interviews, focus groups, town halls

Obtain written responses:
Questionnaires, quizzes

Observe/monitor/track:
Staff/participant behavior, standardized tools, outcome measures

Method types: Quantitative and/or qualitative

5
Q

When to Collect the Data

A

Timing options:
Before, during, after program
Consider:
Process vs outcome evaluation
Specific indicators
Single vs multiple timepoints
Program timeline & internal/external factors

6
Q

Who/Where to Collect Data From

A

Sources:
Target population (primary & secondary audiences)
Program partners, staff
Internal records/tracking docs
Secondary sources
Considerations:
Recruitment methods
Sample size
Representation
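
For the sample-size consideration, a common starting point (an assumption here, not something specified in this deck) is the standard formula for estimating a proportion, n = z² · p(1−p) / e², with an optional finite-population correction. A minimal sketch:

```python
import math

def sample_size_proportion(margin_of_error=0.05, z=1.96, p=0.5, population=None):
    """n = z^2 * p * (1 - p) / e^2, with an optional finite-population correction."""
    n = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    if population is not None:
        n = n / (1 + (n - 1) / population)  # finite-population correction
    return math.ceil(n)

# Example: a hypothetical target population of 500 participants,
# 95% confidence (z = 1.96), p = 0.5 (most conservative), ±5% margin of error.
print(sample_size_proportion(margin_of_error=0.05, population=500))  # -> 218
```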

7
Q

Data Collection Matrix Includes:

A

Indicators (linked to objectives)
Data collection methods:
Existing data/documents
Direct conversation
Written responses
Observation/monitoring
Roles/responsibilities
Data sources
Collection timeline
Analysis plan

See diagram
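
As a rough illustration only (the matrix itself is usually laid out as a table), one row could be recorded as a simple structure; every value below is hypothetical:

```python
# One hypothetical data-collection-matrix row; keys mirror the elements listed above.
matrix_row = {
    "objective": "OO1: Increase awareness of fall risks",
    "indicator": "Change in mean awareness score",
    "method": "Written responses (pre/post survey)",
    "responsible": "Program evaluator",
    "data_source": "Program participants",
    "timeline": "Week 1 (pre) and final week (post)",
    "analysis_plan": "Paired t-test on pre/post scores",
}
```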

8
Q

Step 6: Create the Evaluation Plan

A

Purpose: To document the decisions made in Steps 1-5 in an evaluation plan.

9
Q

What is the Evaluation Plan?

A

A written document outlining how the evaluation will be conducted
Includes:
Program description
Purpose of evaluation
Data collection matrix
Budget
Plan for use of results
Ethical concerns + how they’ll be managed (e.g., informed consent)
Appendices may include:
Data collection tools
Timelines
Consent forms, questionnaires, protocols

10
Q

Ethical Concerns in Evaluation

A

Informed consent
Confidentiality & data protection
Respect for participants (especially vulnerable populations)
Use of results (transparency & fairness)
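
One practical way to support confidentiality and data protection is to de-identify the analysis file before it is shared. The sketch below assumes a pandas DataFrame with hypothetical columns and is an illustration only, not a complete data-protection procedure:

```python
import hashlib
import pandas as pd

# Hypothetical participant records; column names are assumptions for illustration.
df = pd.DataFrame({
    "name": ["A. Lee", "B. Singh"],
    "email": ["a@example.org", "b@example.org"],
    "awareness_score": [3, 5],
})

# Replace direct identifiers with a one-way hashed ID (pseudonymous, not fully
# anonymous), then drop the identifying columns from the analysis file.
df["participant_id"] = df["email"].apply(
    lambda e: hashlib.sha256(e.encode()).hexdigest()[:10]
)
deidentified = df.drop(columns=["name", "email"])
print(deidentified)
```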

11
Q

Step 7: Collect the Data

A

Purpose: To collect the data needed to answer each evaluation question.

12
Q

Collect the Data

A

Create standardized procedures
Train data collectors
Pilot test tools/processes
Collect data as planned
Refine based on ongoing feedback
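
A small, hedged example of the kind of check that supports pilot testing and ongoing refinement: screening early responses for missing or out-of-range values (the column names and the valid 1–5 range are assumptions):

```python
import pandas as pd

# Hypothetical pilot-test responses.
pilot = pd.DataFrame({
    "participant_id": [1, 2, 3, 4],
    "awareness_score": [4, None, 7, 3],   # assumed valid range: 1-5
    "sessions_attended": [10, 12, 11, None],
})

# Flag missing values and out-of-range scores before full data collection begins.
print("Missing values per column:")
print(pilot.isna().sum())
print("Out-of-range awareness scores:")
print(pilot.query("awareness_score < 1 or awareness_score > 5"))
```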

13
Q

Step 8: Process and Analyze Data

A

Purpose: To synthesize and analyze data collected for the evaluation.

14
Q

Purpose:

Answer evaluation questions
Assess program quality & effectiveness

A

Steps:

Ensure quality: Check early responses, missing/invalid data, accuracy
Organize data: Prep variables (totals, categories, etc.)

Analyze data:
Qualitative: Content analysis

Quantitative:
Descriptive:
Categorical: Frequency
Continuous: Mean, Median, Range, SD
Inferential: t-test, chi-square, correlation, ANOVA, regression

Synthesize data:
Summarize & present
Interpret with partners
Compare with norms, population data, other programs
Check representativeness
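
A minimal sketch of the descriptive and inferential analyses named above, using pandas and SciPy on hypothetical survey data (variable names and values are assumptions for illustration):

```python
import pandas as pd
from scipy import stats

# Hypothetical survey data.
df = pd.DataFrame({
    "group": ["program"] * 3 + ["comparison"] * 3,
    "sex": ["F", "M", "F", "F", "M", "M"],
    "activity_minutes": [150, 180, 200, 120, 110, 130],
})

# Descriptive statistics
print(df["sex"].value_counts())                                      # categorical -> frequency
print(df["activity_minutes"].agg(["mean", "median", "std"]))         # continuous -> mean, median, SD
print(df["activity_minutes"].max() - df["activity_minutes"].min())   # range

# Inferential statistics: independent-samples t-test comparing the two groups
program = df.loc[df["group"] == "program", "activity_minutes"]
comparison = df.loc[df["group"] == "comparison", "activity_minutes"]
t_stat, p_value = stats.ttest_ind(program, comparison)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```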

15
Q

Evaluation Plan – Examples

A

Process Objectives:

PO12: Falls Program
Participants – Quantitative → Frequency
Feedback – Qualitative → Content analysis
PO2: Flyer Distribution
Flyers – Quantitative → Frequency
Satisfaction – Quantitative → Mean
Outcome Objectives:

OO1: ↑ Awareness of fall risks
Survey → t-test
OO3: ↑ Knowledge of home safety checklist
Survey → (likely t-test)
OO7: ↑ Physical activity
Survey/Accelerometers → (trend/repeated measures)
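
For OO1, a minimal sketch of the survey → t-test analysis, assuming a simple pre/post design with hypothetical awareness scores:

```python
from scipy import stats

# Hypothetical pre/post awareness scores for the same participants (paired design).
pre  = [2, 3, 2, 4, 3, 2, 3, 4]
post = [4, 4, 3, 5, 4, 3, 4, 5]

# Paired t-test: did mean awareness of fall risks increase after the program?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```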
