Evaluation Flashcards

1
Q

Evaluation

A

Describing the object of interest so its worth and merit can be judged

2
Q

Worth

A

Whether or not a program is needed

3
Q

Merit

A

Whether or not a program is good

4
Q

5 things that evaluations help planners with

A

Make decisions based on systematically collected info
See if goals and objectives have been met
See if it is being implemented as intended
See if changes need to be made along the way
Develop cost effective strategies

5
Q

2 critical purposes of evaluation

A

Assessing and improving quality

Determining program effectiveness

6
Q

6 reasons stakeholders want evaluation

A
Determine achievement of objectives
Improve implementation
Make people accountable for their roles
Raise community support
Add to scientific knowledge and literature
Inform policy decisions
7
Q

When should the evaluation begin

A

When goals and objectives are developed

8
Q

Who does the evaluation

A

Collaborative effort among stakeholders

9
Q

5 threats to good evaluations

A
Fear of funding cuts if results aren't good
Political skewing
Not reporting appropriate data
Political ramifications of truthful findings
Implications for agencies
10
Q

Evaluation during initial planning

A

Needs assessment with market evaluation

11
Q

Evaluation during development

A

Formative evaluation with market testing

12
Q

Formative evaluation

A

Save money by figuring out what people will actually use

13
Q

Evaluation during early implementation

A

Implementation evaluation

14
Q

Implementation evaluation

A

Extent to which program conforms to original plan

15
Q

Evaluation during routine operation

A

Process evaluation

16
Q

Process evaluation

A

Appraisal of program delivery and usage under normal operation

17
Q

Evaluation during stable operation

A

Outcome evaluation

18
Q

Outcome evaluation

A

Appraisal of impact on clients, in relation to level of participation and baseline characteristics, to determine whether long-term objectives were met

19
Q

10 steps in evaluation

A
Clarify your program
Engage stakeholders
Assess resources
Design evaluation
Determine methods of measurement and procedures
Develop workplan, budget and timeline
Data collection
Data analysis
Interpret results
Take action
20
Q

3 issues with asking participants if they benefited from the program

A

Retrospective reporting bias
Assumption that change could only occur with intervention
Social desirability bias

21
Q

Standards of acceptability

A

Minimum levels of effectiveness and benefits used to judge value

22
Q

Formative Evaluation

A

Relates to quality assessment and program improvement

23
Q

Summative Evaluation

A

Determining effectiveness

24
Q

Difference between formative and process evaluation

A

Formative focuses on improving quality during implementation while process measures degree to which program was successfully implemented

25
Q

2 parts of summative evaluation

A

Impact evaluation
Outcome evaluation

26
Q

Impact evaluation

A

Focus on intermediary measures such as change in behaviour or attitude

27
Q

6 steps in CDC framework

A

Engage stakeholders
Describe the program
Focus the evaluation design
Gather credible evidence
Justify conclusions
Ensure use and share lessons learned

28
Q

4 CDC standards of evaluation

A

Utility: information needs of users are satisfied
Feasibility: realistic and affordable
Propriety: ethical
Accuracy

29
Q

Baseline data

A

Data reflecting the initial status of the population

30
Q

Designs

A

Summative evaluation

31
Q

Focus of formative evaluations

A

Quality of program content and implementation

32
Q

When does formative evaluation occur

A

From inception of the program through implementation

33
Q

Cost identification analysis

A

Compares the costs of different interventions available for a program

34
Q

Cost effectiveness analysis

A

Quantifies the effects of a program in monetary terms

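The cost comparison behind these analyses can be sketched in a few lines. This is a minimal illustration, not a prescribed method; the intervention names and all figures are invented for the example.

```python
# Hypothetical cost comparison for two smoking-cessation interventions.
# All names and numbers are invented for illustration.
interventions = {
    "group classes": {"cost": 12000, "quitters": 40},
    "one-on-one counselling": {"cost": 30000, "quitters": 75},
}

for name, data in interventions.items():
    # Cost per unit of effect (here, cost per participant who quit)
    ratio = data["cost"] / data["quitters"]
    print(f"{name}: ${ratio:.0f} per quitter")
```

Note that the cheaper program overall (group classes, $300 per quitter) is also the more cost-effective one here, but that need not hold in general, which is why the per-unit ratio is computed rather than comparing total costs.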
35
Q

Multiplicity

A

Multiple-component programs cater more effectively to varied needs

36
Q

Adjustment

A

Planners make necessary changes based on feedback

37
Q

6 components of process evaluation

A

Fidelity
Dose
Recruitment
Reach
Response
Context

38
Q

Fidelity

A

Programs are implemented as intended

39
Q

Dose

A

Number of program units delivered

40
Q

Recruitment

A

Degree to which the population is recruited for participation

41
Q

Reach

A

Proportion of the population given the opportunity to participate

42
Q

Response

A

Proportion of the population actually participating

43
Q

Context

A

External factors that may influence results

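Reach and response, as defined in the cards above, are simple proportions of the priority population. A minimal sketch, with all counts invented for illustration:

```python
# Hypothetical process-evaluation figures; all numbers invented for illustration.
priority_population = 500   # people the program is meant to serve
offered = 400               # people given the opportunity to participate
participated = 120          # people who actually took part

# Reach: proportion of the population given the opportunity to participate
reach = offered / priority_population
# Response: proportion of the population actually participating
response = participated / priority_population

print(f"Reach: {reach:.0%}, Response: {response:.0%}")
```

Here reach is 80% but response is only 24%, which is exactly the kind of gap a process evaluation is meant to surface during routine operation.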
44
Q

Pre testing

A

Testing components of a program and collecting baseline data

45
Q

Quantitative method

A

Deductive in nature and produces numeric data

46
Q

Qualitative method

A

Inductive in nature and produces narrative data

47
Q

McLeroy model 1

A

Qualitative methods are used to help develop quantitative methods

48
Q

McLeroy model 2

A

Qualitative results are used to help interpret a quantitative evaluation

49
Q

McLeroy model 3

A

Quantitative results are used to help interpret qualitative results

50
Q

McLeroy model 4

A

Qualitative and quantitative methods are used equally

51
Q

Posttest

A

Measurement after completion of the program

52
Q

Quasi experimental design

A

Provides interpretable and supportive evidence of program effectiveness but cannot control for confounding factors

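One common way to analyze a quasi-experimental design with a nonequivalent comparison group is to subtract the comparison group's pre/post change from the program group's change (a difference-in-differences). This is a hedged sketch of that arithmetic only; all scores are invented for illustration.

```python
# Hypothetical pre/post knowledge scores for a quasi-experimental design
# with a nonequivalent comparison group. All numbers invented for illustration.
program_pre, program_post = 52.0, 67.0
comparison_pre, comparison_post = 51.0, 58.0

program_change = program_post - program_pre            # change in program group
comparison_change = comparison_post - comparison_pre   # change without the program

# Difference-in-differences: change in the program group beyond the change
# observed in the comparison group. This partially guards against confounding,
# but without randomization it cannot rule confounding out.
did = program_change - comparison_change
print(did)
```

The 8-point adjusted change is smaller than the raw 15-point gain in the program group, illustrating why the comparison group matters for internal validity.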
53
Q

Non experimental design

A

Does not use comparison or control groups and has little control over confounding factors

54
Q

Internal validity

A

Degree to which the change that was measured can be attributed to the program

55
Q

External validity

A

Extent to which the program can be expected to produce similar effects in other populations