11 Evaluation Flashcards

1
Q

What is evaluation?

A

A systematic process that assesses the value or worth of something, with the intent that action be taken.

Evaluation is a systematic process that determines the merit or worth of programmes, products, processes, personnel, and/or policies.

“The application of systematic methods to address questions about program operations and results” (Public Health Ontario, 2012).

“The systematic examination and assessment of features of a programme or other intervention in order to produce knowledge that different stakeholders can use for a variety of purposes” (Rootman et al., 2001, p. 26)

2
Q

Research seeks to prove; evaluation seeks to improve. True or False?

A

True

3
Q

Difference between research and evaluation

A

Research is usually designed to provide results that go beyond an individual program or project and can be generalised to other populations, conditions, or times. This places additional requirements on research.

Evaluation on the other hand, usually focuses on a situation, such as collecting data about specific programmes, with no intent to generalise the results to other settings and situations.

In other words, research generalises, and evaluation particularises.

Both use the same methods, techniques and analyses to answer their questions.

4
Q

How does research differ from evaluation in terms of planning?

A

Research: Scientific method

Evaluation: Framework for program evaluation

5
Q

How does research differ from evaluation in terms of decision making?

A

Research: investigator controlled
Evaluation: stakeholder controlled (collaborative)

6
Q

How does research differ from evaluation in terms of standards?

A

Research:
Validity (quantitative)
Internal (accuracy, precision).
External (generalisability of results).

Evaluation:
Program evaluation standards:
Utility - Discusses use, usefulness, influence, and misuse. The evaluation should be guided by the information needs of its users.
Feasibility - Discusses the effects of contexts, cultures, costs, politics, power, available resources, and other factors on evaluation. The evaluation should be carried out in a realistic, thoughtful, tactful and cost-effective manner.
Propriety - Discusses the moral, ethical, and legal concerns related to evaluation quality. The evaluation should be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation as well as those affected by its results.
Accuracy - Discusses reliability, validity, and reduction of error and bias. The evaluation should reveal and convey technically adequate information about the features that determine the value of the programme being evaluated.

7
Q

How does research differ from evaluation in terms of questions?

A

Research:
Facts
Descriptions.
Associations.
Effects.

Evaluation:
Values
Merit (i.e., quality).
Worth (i.e., value).
Significance (i.e., importance).

8
Q

The thing being evaluated (program, product, process, personnel, or policy) is referred to as what?

A

The evaluand.

9
Q

Skills needed for conducting an evaluation

A

Communication skills
Team skills
Organisational skills
Interpersonal skills
Knowledge of research methods, e.g. qualitative and quantitative
How to search for unintended effects and side effects
How to determine values within different points of view
How to deal with controversial issues and values
How to synthesise facts and values

10
Q

Why should we evaluate programs?

A

For learning:
To provide information about the program
To compare different program types
To improve a program.

For accountability:
To measure the program’s effectiveness or contribution
To demonstrate the program’s value
To meet funding requirements.

11
Q

Ethics in evaluation

A

Guiding Principles have been developed by the American Evaluation Association (AEA) to provide direction to the professional evaluator on ethical and appropriate ways to conduct an evaluation (Fitzpatrick et al., 2011).

The Guiding Principles are grouped under five broad headings of: systematic inquiry; competence; integrity/honesty; respect for people; and responsibilities for general and public welfare.

12
Q

Standards in evaluation

A

Program Evaluation Standards have been developed by the Joint Committee on Standards for Educational Evaluation (US).

They include 30 standards, grouped under five key areas of (1) utility, (2) feasibility, (3) propriety, (4) accuracy and (5) evaluation accountability, to assist evaluators and consumers in judging the quality of an evaluation (Fitzpatrick et al., 2011).

Evaluation Accountability Standards: these encourage adequate documentation of evaluations and a metaevaluative perspective focused on improvement and accountability for evaluation processes and products.

13
Q

Internal evaluation

A

Resides in the organisation.

+ves:
More access to data and resources
Knows the program and setting well

14
Q

External evaluation

A

Consultants conduct the evaluation, mostly initiated by program funders.
+ves: experts, objective

15
Q

What are the 3 types of evaluations?

A

Needs assessment (aka formative evaluation)
Process evaluation
Outcome/Impact evaluation (aka summative)

16
Q

Describe a needs assessment

A

Purpose: To assess needs and determine the concept and design of a program.

Project has not been implemented. Proposal may exist.

Questions answered:
Is a project/response needed?
Who needs the response?
How should it be carried out?

17
Q

Describe a process evaluation

A

Purpose: To improve the operation of an existing program by identifying strengths and challenges.

Occurs when implementing the program

Questions answered:
To what extent are planned activities actually realised?
How well are the services provided?

18
Q

Describe outcome/impact evaluation

A

Purpose: To assess the short- and long-term outcomes of a program.

Occurs at the end of the project or at a specified point in time.

Questions answered:
What short- and long-term outcomes are observed?
What do the outcomes mean?
Does the project make a difference?

19
Q

Formative vs summative in evaluation

A

In evaluation, formative and summative refer to the decisions or judgements made with the results of the evaluation.

Formative – the focus of the evaluation is on providing information for program improvement (often done internally within an organisation)

Summative – provides information to assist in considering a program’s adoption, continuation or expansion.

Summative evaluations are typically commissioned by the funder of a program to inform decision-making around its continued funding.

Both are essential: formative evaluation during the developmental stages of a program, and summative evaluation once the program has stabilised, to judge its final worth and future.

20
Q

Health promotion evaluation has three key elements/principles:

A

Participatory – should involve the community as much as possible.

Evaluation should be introduced early, and in all stages of a health promotion program/initiative.

Findings should be conveyed as much as possible to all stakeholders in meaningful ways.

21
Q

What are two frameworks that can be used when undertaking an evaluation?

A

Public Health Ontario Evaluation Framework
Government of Western Australia Research and Evaluation Framework

22
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 1 – Clarify the program

A

Ensure your program has clearly defined goal(s), populations of interest, outcomes, strategies, activities, outputs, and indicators.
This is often accomplished using a logic model/program logic.
A logic model is a diagrammatic representation of a program.
Logic models are very common in evaluation.
The logic model follows a logical sequence of what your program is trying to achieve. It shows the relationships among:
WHAT: what does the program do? (inputs, activities, outputs)
WHO: who are the recipients of the program?
WHY: what outcomes are to be achieved?

[You will need a summary of your program that includes:
1. program goal(s)
2. population(s) of interest
3. outcome objectives
4. strategies, activities, and assigned resources
5. process objectives or outputs]
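To make the WHAT/WHO/WHY structure concrete, below is a minimal sketch of a logic model captured as a plain Python data structure. The community-kitchen program and all of its components are hypothetical illustrations, not taken from Public Health Ontario (2012).

    # A minimal sketch (hypothetical example) of a program logic model
    # captured as a plain Python data structure.
    logic_model = {
        # WHAT the program does
        "inputs": ["funding", "program staff", "community kitchen space"],
        "activities": ["weekly cooking classes", "bulk-buying club"],
        "outputs": ["24 classes delivered", "80 parents enrolled"],
        # WHO the recipients are
        "population_of_interest": ["parents of young children"],
        # WHY: the outcomes to be achieved
        "short_term_outcomes": ["increased food-preparation skills"],
        "long_term_outcomes": ["increased access to affordable, nutritious food"],
    }

    # Print the logical sequence the model represents.
    for component, items in logic_model.items():
        print(f"{component}: {', '.join(items)}")

In practice a logic model is usually drawn as a diagram; a structure like this is simply one way to keep its elements explicit while planning the evaluation.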

23
Q

What are the four levels of stakeholder involvement in evaluations?

A

Core: closely involved in the program, or will be closely linked to the implementation of the evaluation. Examples: the program lead or the evaluator.

Involved: will be frequently consulted about the evaluation, or part of the planning process. Examples: program staff who may collect data from program participants, decision makers who will use the evaluation findings, or program participants.

Supportive: provide some form of support for the evaluation, such as facilitating access to data or sharing their expertise on evaluation methods.

Peripheral: need to be kept informed. Example: the organisation lead.

24
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 2: Engage stakeholders

A

Define your stakeholders, understand their interests and expectations, and engage them in a review of objectives.

What do they want to know from the evaluation?

How do they expect the evaluation to be conducted?

This will help you develop your evaluation questions.

Levels of involvement: Core – Involved – Supportive – Peripheral
Sectors to consider: private, government, health-related, non-health services, community/grass roots

25
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 3: Assess resources

A

Clarify staff time, money and other resources available for evaluation. This will inform your evaluation design decisions.

Consider:
Funds
Staff and volunteer time and interests
Timeline
Equipment and tools
Support of partners

26
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 4: Organise and select evaluation questions

A

Main tasks:
1. Select and refine evaluation questions.
2. Determine appropriate evaluation approaches (e.g. process or outcome).

Select your key evaluation questions:
Think about:
Stage of your program
What type of decisions need to be made with the evaluation data
Stakeholder interests
Resources

Example questions

Process evaluation
Was the program carried out as designed?
Did we reach the desired target group?
How many participants attended each session?
What were the implementation facilitators and barriers?

Outcome or impact evaluation
Has there been an increase in physical activity levels?
Are participants more ready to quit smoking?
Have participants increased their knowledge of healthy eating?

27
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 5: Determine methods of measurement and procedures

A

What will you measure? – indicators

When will you collect data during your intervention/program? – before/after/both

How will you collect data? - qualitative/quantitative/both

Who will you collect data from? – which groups of people/sub-populations.

Indicator examples:
If your program outcome is:
By the end of the first year of the program, 80% of participating parents will have increased access to affordable, nutritious food through participation in the community kitchen program.
Your outcome indicator/s may be:
% of participating parents who agree that they have increased access to affordable, nutritious food
# of participating parents reporting decreased reliance on local food bank

(An indicator is a marker. It can be compared to a road sign which shows whether you are on the right road, how far you have travelled and how far you have to travel to reach your destination. Indicators show progress and help measure change)
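As a minimal sketch, here is how the two outcome indicators above could be computed once survey data are collected (Python). The response records and field names are hypothetical; a real evaluation would load responses from the collected data.

    # A minimal sketch of computing the two outcome indicators above from
    # survey responses. Records and field names are hypothetical.
    responses = [  # one record per participating parent
        {"increased_access": True, "less_food_bank_reliance": True},
        {"increased_access": True, "less_food_bank_reliance": False},
        {"increased_access": False, "less_food_bank_reliance": False},
        {"increased_access": True, "less_food_bank_reliance": True},
    ]

    n = len(responses)
    pct_increased_access = 100 * sum(r["increased_access"] for r in responses) / n
    n_less_reliant = sum(r["less_food_bank_reliance"] for r in responses)

    print(f"% of parents reporting increased access: {pct_increased_access:.0f}%")
    print(f"# of parents reporting decreased food-bank reliance: {n_less_reliant}")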

28
Q

There are three main considerations for selecting indicators:

A

Validity – How well does the indicator actually measure what it should? (e.g. self-reported weight vs. actual weight)

Reliability – will it give consistent measurement over time?
If you ask the same question at different times, will they respond in the same way?
Does everyone understand the question in the same way? Is there too much room for interpreting the question?
Can emotions or other circumstances change respondents’ answers from day to day?

Accessibility - What are the barriers to obtaining data on your outcome indicators?

Examples:
There is a limited sample of parents willing to complete the survey
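As a minimal sketch of the reliability consideration, the check below computes how often respondents give the same answer to one survey question at two time points (test-retest agreement). The responses are hypothetical.

    # A minimal sketch of a test-retest reliability check for one question:
    # the same respondents answer at two time points, and we compute the
    # proportion who answer the same way. Data are hypothetical.
    time1 = ["yes", "no", "yes", "yes", "no", "yes"]
    time2 = ["yes", "no", "no", "yes", "no", "yes"]

    agreement = sum(a == b for a, b in zip(time1, time2)) / len(time1)
    print(f"Test-retest agreement: {agreement:.0%}")  # 83%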

29
Q

Ethical issues still need to be considered. For example:

A

Informed consent
Purpose/goal/objectives of the evaluation
Evaluation method
Potential risks
Potential benefits
Anonymity, confidentiality and security of data.
How the participants will be informed of the results.

30
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 6: Develop evaluation plan

A

Identify specific tasks, roles, resource allocations and deadlines for the evaluation.
Your plan may include: evaluation questions, your program logic model (if relevant), indicators, methods, data sources, timelines, roles and responsibilities, and how data will be analysed.

31
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 7: Collect data

A

Collect credible evidence to answer each evaluation question.
Results and recommendations depend on data quality.

Develop data collection tools (survey, interview guide, etc.) and procedures.
Pilot test tools to ensure validity.

32
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 8 – Process data and analyse results

A

Purpose: to enter data, check quality and consistency, and analyse the data to identify evaluation results.

Employ your data analysis strategies here (e.g. coding, quantitative analysis, thematic analysis).
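For illustration, a minimal sketch of one quantitative strategy: comparing the same participants' scores before and after the program with a paired t-test. The scores are hypothetical and SciPy is assumed to be available; qualitative data would instead be coded and analysed thematically.

    # A minimal sketch of one quantitative analysis: a paired t-test on the
    # same participants' scores before and after the program. Scores are
    # hypothetical; assumes SciPy is installed.
    from scipy import stats

    before = [3, 4, 2, 5, 3, 4, 2, 3]  # e.g. healthy-eating knowledge scores
    after = [5, 5, 4, 6, 4, 5, 3, 5]   # same participants, post-program

    t_stat, p_value = stats.ttest_rel(after, before)
    mean_change = sum(a - b for a, b in zip(after, before)) / len(before)

    print(f"Mean change: {mean_change:.2f} (t = {t_stat:.2f}, p = {p_value:.3f})")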

33
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 9 – Interpret and disseminate results

A

Interpret results against the original evaluation questions, and engage stakeholders so that they can help identify recommendations.

Make recommendations for action to address the findings, use the information to create materials that communicate them, and develop an action plan as a group.

Presentation of findings can be through written reports, PowerPoint presentations, videos, etc.
Make results widely available.

34
Q

10 steps to evaluating a health promotion program (from Public Health Ontario, 2012)

Step 10 - Apply evaluation findings

A
35
Q

Consider stakeholders from which sectors?

A

Private sector
Government sector
Health-related sector
Non-health services sector
Community / grass roots sector