Evaluation Flashcards
Most commonly cited reasons not to evaluate
Lack of budget and lack of time
How to evaluate with limited budget
Piggyback studies, secondary analysis, quick-tab polls, internet surveys, or intercept interviews
Practitioners’ readiness checklist for the evaluation process
- Understand communication theory, media effects theory, and audience effects
- Understand difference between outputs and outcomes
- Articulate SMART objectives
- Be numeric as well as rhetorical
What are the basic questions of evaluation?
- What is the extent and distribution of the target problem?
- Does the program conform with intended goals?
- What are projected or existing costs?
- Is the program reaching target populations?
- Are intervention efforts being conducted?
- Is the program effective in achieving goals?
- Can the results be explained by an alternate process?
- Is the program having unintended impacts?
- What are the costs?
- Is the program using resources efficiently?
Evaluation Research Steps
- Establish agreement on the uses and purposes of the evaluation
- Secure commitment to evaluate
- Develop consensus on using evaluation research
- Write objectives in measurable terms
- Select the most appropriate criteria for evaluation
- Determine best way to gather evidence
- Keep program records
- Use evaluation findings to manage the program
- Report results to management
- Add to professional knowledge
Watson’s Unified Evaluation Model
Four stages - inputs, outputs, impact, and effects
Elements of evaluation
- Preparation/inputs
- Implementation/outputs
- Impact/outcomes/effects
Preparation evaluation
Assesses the quality and adequacy of info used to develop strategy and tactics
Implementation evaluation
Monitors effort and progress as the program unfolds
Impact evaluation
Documents the consequences of the program and provides feedback on the extent to which objectives and goals were achieved
Preparation Criteria and Methods
- Information Base - adequacy of the background information
- Program Content - organization and appropriateness of program and message content
- Presentation Quality - packaging of info; technical and production values
Readability and listenability
The approximate ease with which printed material can be read and comprehended
ELF - Easy Listening Formula - scores correlate with readability scores
Gunning Fog Index
A method used for measuring readability. Measures difficulty based on average sentence length and the percentage of words with three or more syllables
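The two formulas above can be sketched in code. This is a minimal illustration, not from the source: the Fog calculation follows the standard published formula (0.4 times the sum of average sentence length and the percentage of three-plus-syllable words), while the syllable counter is a naive vowel-group heuristic that real readability tools replace with dictionaries or better rules.

```python
import re

def count_syllables(word):
    # Naive heuristic: count runs of consecutive vowels as syllables.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text):
    """Fog index = 0.4 * (avg sentence length + % of words with 3+ syllables)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if count_syllables(w) >= 3]
    avg_sentence_len = len(words) / len(sentences)
    pct_complex = 100 * len(complex_words) / len(words)
    return 0.4 * (avg_sentence_len + pct_complex)

def elf(sentence):
    """Fang's Easy Listening Formula for one sentence:
    count every syllable beyond the first in each word."""
    words = re.findall(r"[A-Za-z']+", sentence)
    return sum(count_syllables(w) - 1 for w in words)
```

A sentence of all one-syllable words scores an ELF of 0; longer words and longer sentences push both scores up, which is why the two measures correlate.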
Implementation Criteria and Models
Implementation measures cannot be substituted for measures of program impact:
1. Distribution - number of messages distributed
2. Placement - number of messages placed in media and content analysis
3. Potential audience - the number of people exposed to messages
4. Attentive Audience - number of people who attend to messages or attend events. Readership, listenership, viewership.
Equivalent Advertising Value
Calculates how much money an organization would have had to pay to secure the same space or time in the media
The calculation is flawed and misleading
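The AVE arithmetic itself is trivial, which is part of the criticism: it prices earned space at advertising rates and nothing more. A sketch with hypothetical figures (the outlet name and rates are invented for illustration):

```python
def ave(space_earned, ad_rate):
    """Equivalent Advertising Value: earned media space priced at the
    advertising rate for that space. Measures cost, not value."""
    return space_earned * ad_rate

# Hypothetical example: a 10-column-inch story in an outlet
# whose (casual, non-contract) rate is $250 per column inch.
print(ave(10, 250))   # prints 2500
```

Note that nothing in the calculation captures tone, relevance, positioning, or coverage of competitors, which is exactly the fallacy the next card lists.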
Fallacy of AVE
- Publicity can be irrelevant or in low-priority media
- Publicity can be neutral or negative
- Publicity can contain coverage of competitors
- Publicity can be poorly positioned or poorly presented
- Calculations are based on casual advertising rates
- Calculations measure cost, not value
Content Analysis elements
- Place or position
- Prominence
- Share of voice
- Issues or topics
- Messages
- Visuals
Impact criteria and models
Document the extent to which outcomes were achieved
Formative - baseline research findings gathered at the start of the program
Intermediate impact - during the program
Summative impact - after the program
What is knowledge gain?
The number of people who learn message content; measuring knowledge, awareness and understanding
What is opinion change?
The number of people who change or form opinions, e.g.:
- Criticizing to praising
- Negative to positive mentions
- Arguing to agreeing
What is attitude change?
Number of people who change or form attitudes
Higher-order program impact
Less subject to short-term change
What is behavior change?
Number of people who act in the desired fashion
Assessments include surveys, direct observation (attendance at meetings, events), indirect observation (agency records, library checkout records, by-products of behavior)
Social media measures are no different
Repeated behavior - number of people who continue or sustain the desired behavior
What is ethnography?
Systematically observing people in their natural settings
What is social and cultural change?
The ultimate summative evaluation is a program's contribution to positive social and cultural change
What is data reduction?
Distilling data to extract meaning.
Necessary whenever large amounts of data are collected
What is data display?
Displaying data by main categories, groupings, and statistics
Vital for helping researchers and practitioners interpret data
End point of evaluation
Learning what worked, what did not, and if not, why not.
NOT DATA
When evaluation fails…
- The theory behind the strategy was faulty
- Errors were made in preparing the program
- The evaluation failed to detect the program's impact
The benchmark model
Evaluation research can tell practitioners where they started, where they want to end up, and how best to get there.