Case 6: Evaluation of policy measures: how to go about it? Flashcards
How can we identify which goals need to be met?
When interventions are developed, a needs assessment is often carried out to identify which goals need to be met → these goals need to be clearly defined
What research designs & data collection techniques are used to evaluate policy?
- RCTs
- Quasi-experimental studies
- Survey studies
- Interviews
- Etc.
What practical issues do policy evaluations need to take into consideration?
- Resource scarcity
- Organisational complexity
- Stakeholder participation
- Transparency
- Context
- Skill of the researchers involved
What is evaluation?
- Process of analysing policies & the context in which they occur to determine whether changes need to be made in implementation
- To assess intended & unintended consequences of programs & policies
What are the methods of evaluation?
- The strongest approaches use mixed methods, as this provides a more comprehensive evaluation
Why is evaluation relevant?
- forms basis for making choices when resources are limited
- helps in making midcourse corrections & improving programs
What is a policy evaluation?
Assesses the operation & impact of (public) policies & action programmes that have been introduced
What is the difference between research & evaluation?
Research is a systematic process for generating new knowledge and relating it to existing knowledge in order to improve understanding of the natural and social world → this differs from evaluation, even though evaluation uses research methods
What is evidence based policy?
A movement within public policy to give evidence greater weight in the shaping of policy decisions → integrates experience, judgement & expertise
What is tool 45 of the EU?
- What is an evaluation
- Principles of objectivity & independence
Explain box 1 (what is an evaluation) of tool 45
Evaluation → evidence-based judgement of the extent to which an existing intervention is:
- effective in fulfilling expectations & meeting objectives
- efficient in terms of cost-effectiveness & the proportionality of actual costs to benefits
- relevant to current & emerging needs
- coherent internally & externally (with other EU interventions/international agreements)
- has EU added value (produces results beyond what would have been achieved by MS acting alone)
What is the aim of evaluation?
Aims to draw conclusions about the causal effects of the (EU) intervention on the actual outcomes/results
What is relevant about tool 45?
Goes beyond factual assessment of what has happened & considers:
- why something has happened
- how much change can be attributed to EU intervention
- to what extent this change meets original expectations/projections
What is tool 46?
Guidance on how to design an evaluation (its purpose, scope, intervention logic, questions, points of comparison & methods)
What is important according to tool 46 when designing an evaluation?
- Clarify purpose of evaluation
- Define the scope
- Explain intervention logic
- Draft good evaluation questions
- Identify appropriate points of comparison
- Consider appropriate data collection & analytical methods
Explain “clarify the purpose of the evaluation” when designing an evaluation from tool 46
Deciding & clearly describing what the evaluation will deliver & how its findings will be used.
Explain “define the scope” when designing an evaluation
- setting out clearly what will be evaluated.
- Can be in terms of interventions, time period, geographical coverage, particular effects, etc
- Should understand what will be covered by evaluation & what won’t & why.
Explain “explain the intervention logic” when designing an evaluation
- summarising how the intervention was expected to work (i.e. at the time of adoption by the Commission or later by the co-legislators, or at the time of implementation), including the underlying assumptions.
- The intervention logic can draw from any prior impact assessment or other documentation such as the explanatory memorandum, which justified the initial policy action.
Explain “draft good evaluation questions” when designing an evaluation
- They should address the five evaluation criteria and any other aspect as relevant, also considering the feedback on the ‘call for evidence’ to the extent possible.
- Questions should cover all issues that are known to be of interest to the stakeholders.
Explain “consider appropriate data collection & analytical methods” when designing an evaluation
It’s important that the evaluation is set up to collect and analyse a range of different data, using the appropriate data and methodologies to fill existing data gaps and to robustly answer the evaluation questions
What is tool 47?
(incomplete)
- Instrumental
- Conceptual
- Symbolic
Explain box 2 of tool 45
- box 2 = principles of objectivity and independence
- DGs carrying out evaluations should respect the principles of objectivity & independence