Module 04 Flashcards

Chapters 12 and 13

1
Q

What is a process evaluation?

A

A process evaluation examines how a program operates and delivers its services to clients.

2
Q

When should process evaluations occur?

A

Process evaluations should occur before or at the same time as outcome evaluations.

3
Q

What are the two major categories of program processes?

A
  • Client service delivery system
  • Administrative support systems
4
Q

What is the primary aim of a process evaluation?

A

To monitor a program’s services and assess client satisfaction.

5
Q

True or False: Process evaluations focus solely on program outcomes.

A

False

6
Q

What are the three main purposes of conducting a process evaluation?

A
  • Improving a program’s operations
  • Generating knowledge
  • Estimating cost-efficiency
7
Q

How can process evaluations improve program operations?

A

By fine-tuning the services delivered to clients and ensuring administrative support is effective.

8
Q

What does generating knowledge through process evaluations typically inform?

A

Decisions about the further development of the program’s services.

9
Q

What are the eight questions that can focus a process evaluation?

A
  • What is the program’s background?
  • What is the program’s client profile?
  • What is the program’s staff profile?
  • What is the amount of service provided to clients?
  • What are the program’s interventions and activities?
  • What administrative supports are in place?
  • How satisfied are the program’s stakeholders?
  • How efficient is the program?
10
Q

Fill in the blank: Process evaluations are sometimes referred to as ______.

A

[formative evaluations]

11
Q

What is a common challenge in describing program processes?

A

Achieving precision and using consistent language across different disciplines.

12
Q

What does monitoring interventions and activities imply?

A

That there are labels and definitions for what is done with clients, which improves communication.

13
Q

What is meant by the term ‘black box’ in relation to client service delivery?

A

It reflects the notion that clients enter and exit a program while what actually happens to them in between (the program’s processes) remains unclear.

14
Q

What is the significance of a program’s client profile?

A

It informs how the processes within the program are operationalized and monitored.

15
Q

What type of data is important to gather in a client profile?

A
  • Age
  • Gender
  • Income
  • Education
  • Race
  • Socioeconomic status
16
Q

What can the program’s history reveal?

A

Critical insights into its day-to-day operations and the political and social context it operates within.

17
Q

What role does stakeholder satisfaction play in a process evaluation?

A

It helps assess the effectiveness and reception of the program’s services.

18
Q

What is the significance of worker qualifications in a program?

A

Monitoring worker qualifications helps establish minimum-level qualifications for job advertisements and provides insight into the effectiveness of service delivery.

19
Q

How do educational backgrounds influence program approaches?

A

Different educational backgrounds can lead to varying philosophical approaches in programs addressing the same social issue, such as teenage pregnancy.

20
Q

What sociodemographic data is typically used to describe workers?

A

Age, gender, and marital status are commonly used sociodemographic features.

21
Q

In what ways can worker time be categorized?

A

Worker time can be divided into face-to-face contact, telephone contact, report writing, advocacy, supervision, and consultation.

22
Q

What is the importance of tracking service intensity?

A

Tracking service intensity helps determine how much time is spent on various client interactions and can guide program decisions.
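
As a minimal sketch of how logged worker time might be summed by activity category to gauge service intensity (the log layout, names, and numbers are illustrative assumptions, not taken from the text):

```python
from collections import defaultdict

# Hypothetical time-log entries: (worker, activity category, minutes spent).
# The workers, categories, and numbers are invented; only the idea of
# summing logged time per category comes from the card above.
time_log = [
    ("Ana", "face-to-face contact", 50),
    ("Ana", "report writing", 20),
    ("Ben", "telephone contact", 15),
    ("Ben", "face-to-face contact", 40),
    ("Ana", "supervision", 30),
]

# Sum minutes per activity category across all workers.
minutes_by_activity = defaultdict(int)
for worker, activity, minutes in time_log:
    minutes_by_activity[activity] += minutes

total_minutes = sum(minutes_by_activity.values())
for activity, minutes in sorted(minutes_by_activity.items()):
    share = minutes / total_minutes
    print(f"{activity}: {minutes} min ({share:.0%} of logged time)")
```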

23
Q

True or False: Programs always have clear-cut intake and termination dates.

A

False

24
Q

What is a process evaluation?

A

A process evaluation assesses whether actual services delivered match the original design and logic model of a program.

25
Q

What are administrative supports in a program?

A

Administrative supports include fixed conditions of employment and operations designed to assist workers in delivering client services.

26
Q

What factors can affect worker effectiveness in a program?

A

Worker pay, caseloads, availability of support staff, and working conditions can significantly impact effectiveness.

27
Q

Fill in the blank: The primary objective of a literacy program visit is to _______.

A

[increase literacy skills of children]

28
Q

What are the two main objectives of the Rural Family Literacy Program?

A
  • To increase literacy skills of children
  • To increase parents’ abilities to assist their children in developing literacy skills
29
Q

What is the role of stakeholder satisfaction in process evaluation?

A

Stakeholder satisfaction assesses how well the program meets the needs and expectations of those involved.

30
Q

What can be inferred from a program’s efficiency?

A

Efficiency reflects the amount of resources expended versus the achievement of program objectives.

31
Q

How can data from home visits influence program decisions?

A

Data can provide insights into the effectiveness of interventions and help adjust strategies to enhance client engagement.

32
Q

What is one example of how administrative decisions can change service delivery?

A

Switching from group care settings to providing interventions in less intrusive environments, like clients’ homes.

33
Q

True or False: Individual client success rates should be used to evaluate social workers.

A

False

34
Q

What data collection method was used by workers in the Rural Family Literacy Program?

A

Workers completed a Daily Family Visit Log to monitor fidelity and effectiveness of their visits.

35
Q

What impacts can high caseloads have on social workers?

A

High caseloads can lead to decreased effectiveness in responding to clients’ needs.

36
Q

What key aspect must be monitored to assess program fidelity?

A

The actual services delivered must align with the original program design and objectives.

37
Q

What is the purpose of stakeholder satisfaction in program evaluation?

A

Stakeholder satisfaction assesses the program’s services from the stakeholders’ perspective.

It is crucial for understanding how well the program meets the needs of its clients.

38
Q

What is a common method for collecting client satisfaction data?

A

Client satisfaction surveys conducted at program termination.

These surveys can provide valuable insights into client experiences and perceptions.

39
Q

True or False: Client satisfaction data is relevant to outcome evaluations.

A

False.

Client satisfaction is specifically relevant to process evaluations.

40
Q

What are the four issues to address when developing data-collection instruments for process evaluation?

A
  • Easy to use
  • Flow with a program’s operations
  • Designed with user input
  • Comprehensive data-collection system

These criteria ensure the effectiveness and efficiency of data collection.

41
Q

Fill in the blank: Data-collection instruments for process evaluation should be _______.

A

easy to use.

42
Q

Why is it important for data-collection instruments to flow with a program’s operations?

A

They need to fit within the context of the program and provide useful data for improving client service delivery.

This alignment ensures that data collection does not disrupt the program’s workflow.

43
Q

What should be included in a client intake form?

A
  • Client characteristics
  • Reasons for referral
  • Service history

This information is essential for both case-level and program-level evaluations.

44
Q

What is the role of user input in developing data-collection instruments?

A

User input, especially from line-level workers, ensures the relevance and accuracy of the data collected.

Workers are more likely to record accurate data if they see the relevance of the instruments.

45
Q

What are the three considerations for developing a data-collection system for process evaluation?

A
  • Determining the number of cases to include
  • Determining times to collect data
  • Selecting a data-collection method

These considerations help in structuring the evaluation process effectively.

46
Q

How can a program ensure random sampling for client satisfaction surveys?

A

By randomly selecting clients to participate as close to the intake meeting as possible.

This method helps in generalizing results to all clients within the program.
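
A minimal sketch of one way such a random draw could be made, assuming client IDs are simply listed as they are captured at intake (the IDs, sample size, and seed are invented for illustration):

```python
import random

# Hypothetical list of client IDs recorded at or near the intake meeting.
client_ids = [f"C{n:03d}" for n in range(1, 201)]  # 200 invented clients

SAMPLE_SIZE = 40  # illustrative; in practice set by program resources

# Simple random sample: every client has an equal chance of being chosen.
rng = random.Random(2024)  # fixed seed only so the draw can be reproduced
selected = rng.sample(client_ids, SAMPLE_SIZE)

print(sorted(selected))
```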

47
Q

What is a potential issue with lengthy client intake forms?

A

They may become overly detailed and burdensome for both clients and workers.

Short and long forms can be developed to balance the depth of data collected.

48
Q

What is meant by ‘case-level interventions’ in the context of data collection?

A

Specific actions taken by workers to address individual client needs.

These interventions can be tracked to assess client progress.

49
Q

What are some examples of data that might be collected at different times during the evaluation process?

A
  • Basic client demographics at intake
  • Sensitive information after rapport is established

Collecting data at appropriate times can improve the quality of the information gathered.

50
Q

Why is it important to monitor process data for reliability and validity?

A

To ensure that the data collected accurately reflects the program’s operations and client experiences.

Poor data quality can lead to incorrect conclusions about program effectiveness.

51
Q

Fill in the blank: Data collection should balance _______ and _______.

A

breadth; depth.

52
Q

What type of data should be collected from all clients for case-level evaluations?

A

Intake forms and assessment data.

These forms are critical for planning client treatment interventions.

53
Q

What is the primary purpose of recording workers’ activities?

A

To document and evaluate worker-client interactions and interventions

This process is often time-consuming and may raise questions about the reliability of self-reported data.

54
Q

What are some fixed client characteristics that can be recorded?

A
  • Race
  • Gender
  • Service history
  • Problem history
55
Q

Why might the reliability of data collected by workers be questioned?

A

Because the data is often self-reported by the workers themselves.

56
Q

What is a recommended practice for supervisors regarding worker-client interactions?

A

Supervisors should observe worker-client interactions to assess the reliability of self-reports.

57
Q

What is interrater reliability?

A

The extent of agreement between the workers’ perceptions and the supervisors’ perceptions of worker-client interactions.
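
One simple way to quantify this agreement is percent agreement between the two sets of ratings (chance-corrected statistics such as Cohen's kappa are often reported as well). The sketch below uses made-up ratings and is only an illustration of the idea:

```python
# Hypothetical ratings of the same ten worker-client interactions,
# each coded on a 1-5 scale by the worker and by the supervisor.
worker_ratings     = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
supervisor_ratings = [4, 4, 3, 4, 2, 5, 3, 3, 4, 5]

# Percent agreement: share of interactions rated identically by both raters.
matches = sum(w == s for w, s in zip(worker_ratings, supervisor_ratings))
percent_agreement = matches / len(worker_ratings)

print(f"Agreement on {matches} of {len(worker_ratings)} "
      f"interactions ({percent_agreement:.0%})")
```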

58
Q

What issue can arise when workers administer client satisfaction questionnaires?

A

Social desirability bias, leading to less honest ratings from clients.

59
Q

What should be done to ensure client responses to satisfaction questions are confidential?

A

Clients should be informed that their responses are confidential and will be pooled with others’ responses.

60
Q

What is the importance of summarizing process evaluation data?

A

To ensure that the conclusions drawn from the evaluation are directly based on the gathered data.

61
Q

What can happen if data is collected inconsistently?

A

It will be difficult to summarize and may produce inaccurate information.

62
Q

What is one potential outcome of a backlog in data summarization?

A

It may indicate that too much data is being collected.

63
Q

How does the timing of data recording impact the evaluation process?

A

Initial diligence may wane, leading to incomplete data that misrepresents the program’s effectiveness.

64
Q

What key questions can be asked regarding outcome data collected?

A
  • Is the time spent with a family related to success?
  • What interventions are associated with successful outcomes?
65
Q

Why is it important to communicate findings from process evaluations?

A

To ensure results are used for program improvement and to foster discussions among workers.

66
Q

What is a program audit sheet?

A

A tool that lists all necessary data to be recorded for each client.

67
Q

What should be included in program development meetings following a process evaluation?

A

Data summaries and discussions on individual worker feedback.

68
Q

What is the black box in the context of program evaluations?

A

The unknown aspects of how workers function on a day-to-day basis.

69
Q

Fill in the blank: The final step in a process evaluation is the _______.

A

[dissemination and communication of findings]

70
Q

What components should a process evaluation consider?

A
  • Preparation
  • Intake
  • Screening
  • Assessment
  • Termination
  • Follow-up
71
Q

What type of data can process evaluations provide regarding interventions?

A

Clues about which interventions work with specific client problems.

72
Q

True or False: Process evaluations only focus on client outcomes.

A

False. They also focus on the inner workings of a program.

73
Q

What is the role of program administrators in process evaluations?

A

To review client records and ensure necessary data is captured.

74
Q

What should be done if clients are hesitant to provide honest feedback?

A

Use a neutral person to administer satisfaction questionnaires.

75
Q

What is the significance of using visual data in evaluations?

A

It provides a forum for discussion and helps illustrate program effectiveness.

76
Q

What should be tracked to assess client participation in counseling sessions?

A

Attendance and quality of participation.

77
Q

What documents are typically involved in the intake process?

A
  • Screening Form
  • Intake Form
  • Social History
  • Data Entry Forms
78
Q

What is the purpose of the Individual Rehabilitation Plan (IRP)?

A

To assist clients in developing a plan to meet individual and program objectives.

79
Q

What is the purpose of process evaluations?

A

To improve services to clients

Process evaluations focus on collecting data to make informed decisions regarding program operations.

80
Q

What must program staff decide when designing a process evaluation?

A

They must decide on the following:
  • Questions to ask
  • Data collection methods
  • Responsibilities for monitoring data
  • Data analysis methods
  • Dissemination of results

These decisions are crucial for effective evaluation of program processes.

81
Q

What does IRP stand for?

A

Individual Rehabilitation Plan

An IRP outlines the goals and objectives for an individual in a rehabilitation program.

82
Q

What is the first step for a client seeking job placement?

A

Client meets with a job placement counselor

This meeting is to identify available job slots that fit with the client’s training.

83
Q

What document is used for job placement referrals?

A

Job Placement Referral Form

This form is essential for tracking job placement efforts.

84
Q

What is the objective of the IRP that clients must achieve?

A

Clients must demonstrate readiness to function independently in the community

This involves planning for termination from programs like Safe Haven.

85
Q

What is involved in the follow-up process after a client achieves their IRP objective?

A

Case manager makes telephone contacts at agreed-on times

Follow-up is critical for ensuring continued support and assessment.

86
Q

What happens during the exit phase for a client?

A

Follow-up contacts end by mutual agreement

This indicates a formal closure of the case.

87
Q

What forms are used during the client termination process?

A

The following forms are used:
  • Victims to Victors Termination Form
  • Safe Haven Termination Form
  • Data Entry Form

These forms help document the termination process and client outcomes.

88
Q

Fill in the blank: Data is collected on many program dimensions to make _______ decisions about a program’s operations.

A

informed

Informed decisions are based on accurate and relevant data collected through evaluations.

89
Q

What is the main purpose of outcome evaluations?

A

To demonstrate the nature of client change after receiving services.

Outcome evaluations help determine if clients show improvement due to program interventions.

90
Q

What must we have a clear sense of when conducting an outcome evaluation?

A

What expected changes (the program’s outcomes) we hope to see.

Clear objectives are crucial for assessing the effectiveness of a program.

91
Q

How many major steps are involved in conducting an outcome evaluation?

A

Six major steps.

These steps are illustrated in the referenced figure in Chapter 3.

92
Q

What are the five purposes of conducting outcome evaluations?

A
  • Demonstrate the nature of client change
  • Provide feedback to stakeholders
  • Select the best evidence-based interventions
  • Provide accountability
  • Generate knowledge for the profession

These purposes guide the evaluation process and enhance program effectiveness.

93
Q

True or False: An outcome evaluation can indicate how well a program is working.

A

False.

An outcome evaluation indicates if program objectives are met but does not explain how or why.

94
Q

What does an outcome evaluation primarily evaluate?

A

Whether the program is meeting its program objectives.

This assessment focuses on specific outcomes related to client change.

95
Q

Fill in the blank: A program outcome evaluation is designed for a __________ program.

A

[specific]

Each evaluation is tailored to the unique context of the program being assessed.

96
Q

What is a critical aspect of an outcome evaluation regarding client objectives?

A

An outcome evaluation assesses only one small component of a complex social problem at a time.

This limitation emphasizes the complexity of social issues addressed by programs.

97
Q

What is an example of a question that an outcome evaluation can answer regarding client outcomes?

A

Was the client outcome achieved?

This question assesses the effectiveness of the program in achieving its goals.

98
Q

What type of data is collected to determine if the program caused changes in clients?

A

Data collected through more complex evaluation designs.

Such designs help establish causality between program interventions and client outcomes.

99
Q

What can complicate the follow-up data collection for outcome evaluations?

A

Clients receiving services from other programs during the follow-up period.

This overlap can obscure the evaluation of the original program’s effectiveness.

100
Q

What aspect of client change might be assessed in an outcome evaluation?

A

The longevity of changes made by clients.

This assessment determines if positive changes are maintained over time.

101
Q

What is a potential limitation of evaluating a single program objective at a time?

A

Limited knowledge gained from evaluating only one component.

Cumulative evaluations over time provide more confidence in the results.

102
Q

What should be done if an evaluation of a program’s objectives turns out to be poor?

A

Investigate why this is so by doing a process evaluation.

Process evaluations can help understand the underlying issues affecting program outcomes.

103
Q

What does the term ‘evidence-based interventions’ refer to?

A

Interventions selected based on their proven effectiveness in creating positive client change.

These interventions are crucial for achieving program objectives.

104
Q

True or False: Outcome evaluations provide a complete picture of a program’s efficiency.

A

False.

Outcome evaluations focus on effectiveness, not efficiency.

105
Q

What is the first step in conducting an outcome evaluation?

A

Engage Stakeholders

Engaging stakeholders means bringing all relevant parties into the evaluation process.

106
Q

Why is it critical to clearly specify a program’s objectives?

A

It defines how we understand our overall program in concrete terms

Clear objectives lead to better program evaluation and effectiveness.

107
Q

What is the second step in conducting an outcome evaluation?

A

Describe the Program

This includes using theory of change and logic models.

108
Q

What should follow-up data collection intervals ideally be?

A

3, 6, or 12 months after clients exit a program

This timing allows for assessing the sustainability of program effects.

109
Q

True or False: Performance objectives and client outcome objectives are the same.

A

False

Performance objectives focus on outputs, while client outcome objectives focus on actual client changes.

110
Q

What are the two steps involved in focusing the evaluation?

A

Selecting program objectives and measuring program objectives

Both steps are key to ensure the evaluation is targeted and meaningful.

111
Q

Fill in the blank: The difficulty in measuring changes in a client’s self-esteem may lead programs to focus on _______.

A

performance objectives

This can misguide the evaluation process.

112
Q

What are the three levels of outcomes identified in the evaluation process?

A
  • Initial Outcomes
  • Intermediate Outcomes
  • Long-Term Outcomes

Each level helps to categorize the objectives for better assessment.

113
Q

What is the third step in conducting an outcome evaluation?

A

Focus the Evaluation

This requires careful selection of objectives and measurement criteria.

114
Q

What does gathering credible evidence for an outcome evaluation include?

A
  • Selecting samples or data sources
  • Pilot-testing measuring instruments
  • Administering measuring instruments

These steps ensure the reliability and validity of the evaluation.

115
Q

How can we measure program objectives effectively?

A

Using standardized measuring instruments with high validity and reliability

This ensures accurate assessment of outcomes.

116
Q

What is a potential challenge in collecting follow-up data?

A

Difficulty in locating clients after they leave a program

This can be particularly challenging with transient or underserved populations.

117
Q

What is the significance of using a logic model in program evaluation?

A

It helps clearly display program objectives and their relationships

Logic models guide the evaluation process by providing a visual representation of program components.

118
Q

What is an example of an initial outcome indicator for outpatient mental health services?

A

Number of consumers who received outpatient services during the quarter

This measures initial awareness and access to services.

119
Q

What is the importance of client follow-up data?

A

It assesses the longevity of changes made by clients

Follow-up data helps determine if the program effects are sustained over time.

120
Q

What is the purpose of assessing consumers every 6 months?

A

To evaluate improvements in well-being/life satisfaction

121
Q

What is Indicator 10?

A

Number and percentage of consumers who report an increase in well-being (life satisfaction)
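
A rough sketch of how an indicator like this might be tabulated from paired assessment scores (the consumer IDs, scores, and the "higher score = increase" rule are assumptions for illustration):

```python
# Hypothetical paired well-being scores: (previous assessment, current assessment).
assessments = {
    "consumer_01": (22, 27),
    "consumer_02": (30, 29),
    "consumer_03": (18, 24),
    "consumer_04": (25, 25),
    "consumer_05": (20, 26),
}

# Indicator: number and percentage of consumers whose current score is
# higher than their previous score (a simple "increase" rule, assumed here).
improved = [cid for cid, (prev, curr) in assessments.items() if curr > prev]
count = len(improved)
percentage = count / len(assessments)

print(f"{count} of {len(assessments)} consumers ({percentage:.0%}) "
      f"report an increase in well-being")
```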

122
Q

What is the ideal sample size for each subgroup in an outcome evaluation?

A

At least thirty clients

123
Q

What is the main issue affecting sample size in outcome evaluations?

A

Whether program resources exist to collect data from all clients

124
Q

What is the goal of random sampling in program evaluations?

A

Each client has an equal chance of being included in the study

125
Q

True or False: Social workers should evaluate their own performance in outcome evaluations.

A

False

126
Q

What is a critical aspect of random selection in outcome evaluations?

A

The decision to include clients is made without bias

127
Q

Fill in the blank: Quality data collection requires several explicit ______ that need to be laid out and strictly followed.

A

[procedures]

128
Q

What should be done when clients decline to participate in an evaluation?

A

Explore the reasons for their refusal

129
Q

What is the purpose of pilot-testing measuring instruments?

A

To ascertain whether the instrument produces the desired data

130
Q

What should be done if a self-report measuring instrument is used?

A

Check the accuracy of the data using multiple data sources

131
Q

What is the first step in administering measuring instruments?

A

Decide which questions the outcome evaluation will answer

132
Q

What is the significance of aggregating data in outcome evaluations?

A

To provide an overview of client outcomes

133
Q

What must conclusions drawn from outcome evaluations directly come from?

A

The data gathered during the evaluation

134
Q

What are normative data useful for in outcome evaluations?

A

Interpreting client data when measurement occurs at program exit

135
Q

What should be included in reports of outcome data for stakeholders?

A

Concrete and objective results

136
Q

Fill in the blank: When analyzing data in subgroups, we can gain additional ______ for program decision-makers.

A

[information]

137
Q

What is the primary purpose of outcome evaluations?

A

To determine whether client changes have occurred as a result of intervention efforts.

Outcome evaluations provide valid and reliable data for decision-making.

138
Q

What percentage of families with toddlers showed improvement in problem-solving skills according to the analysis?

A

Seventy-five percent.

This contrasts with families that have teens, where almost no improvement was observed.
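
A minimal sketch of the subgroup tally behind a comparison like this, using invented records in which each family is flagged as improved or not:

```python
from collections import defaultdict

# Invented outcome records: (family type, improved in problem-solving skills?).
records = [
    ("toddlers", True), ("toddlers", True), ("toddlers", False), ("toddlers", True),
    ("teens", False), ("teens", False), ("teens", True), ("teens", False),
]

# Tally improvement separately for each subgroup.
counts = defaultdict(lambda: [0, 0])  # family type -> [improved, total]
for family_type, improved in records:
    counts[family_type][1] += 1
    if improved:
        counts[family_type][0] += 1

for family_type, (improved, total) in counts.items():
    print(f"{family_type}: {improved}/{total} improved ({improved / total:.0%})")
```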

139
Q

What should be measured to evaluate program outcomes realistically?

A

The amount of average improvement and the number of clients expected to show success.

This helps educate stakeholders about client populations.

140
Q

True or False: 100% success in deterring social issues like drug addiction is a realistic expectation for any program.

A

False.

In some cases, a 50/50 chance of improvement is expected.

141
Q

What is one method to report outcome data over time?

A

Presenting client outcomes from one year to the next to show program trends.

This method helps stakeholders understand program effectiveness.

142
Q

What types of stakeholders should receive outcome results?

A

Key stakeholders, including funders and policymakers.

Routine sharing of outcome data is essential for informed decision-making.

143
Q

Fill in the blank: The likelihood of having evaluation results used is increased when results are presented in a _______.

A

straightforward manner.

144
Q

What is a common obstacle to putting evaluation results into practice?

A

Failing to remember the law of parsimony when presenting the final report.

Reports should be clear and concise for the intended audience.

145
Q

What happens when evaluation results contradict strong predetermined beliefs?

A

There may be resistance to using the findings.

For example, social workers may believe their efforts are always helpful.

146
Q

What is the role of confidentiality in outcome evaluations?

A

To protect client identities by reporting data in aggregate forms.

Summarizing data helps avoid singling out any one client.

147
Q

What does program outcome assessment evaluate?

A

The degree to which the program is meeting its overall objectives.

This usually means assessing the effectiveness of interventions.

148
Q

When are outcome evaluations typically conducted?

A

Before or at the same time as efficiency evaluations.

Efficiency evaluations focus on the program’s operational effectiveness.

149
Q

What is the significance of analyzing data in subgroups during evaluations?

A

It provides important detail for program decision-makers.

This analysis helps identify program strengths and weaknesses.

150
Q

From chapter 12 slide, Resources are known as

A

Inputs

151
Q

From chapter 12 slide, Activities are known as

A

What the program does

152
Q

From chapter 12 slide, Outputs are known as

A

The services that are delivered

153
Q

From chapter 12 slide, Outcomes are

A

For clients

154
Q

From chapter 12 slide, Impacts are

A

For the community or society

155
Q

From chapter 12 slide, Process Evaluations are focused on what?

A

Examining the activities of a program

156
Q

From chapter 12 slide, What is identified during a needs assessment?

A

A needs assessment identifies the need for a program.

157
Q

From chapter 12 slide, What is the focus of a formative evaluation?

A

A formative evaluation focuses on the initial program and provides information to help “form” and stabilize the program.

158
Q

From chapter 12 slide, What elements are included in process evaluation?

A

  • Program description
  • Program monitoring
  • Quality assurance

159
Q

From chapter 12 slide, What is the purpose of outcome evaluation?

A

The purpose of outcome evaluation is to determine if the program works and if it impacts the target population’s problem as identified in the needs assessment.

160
Q

From chapter 12 slide, What is the primary purpose of a formative evaluation?

A

A formative evaluation is used to guide and direct programs, assess whether a new program was implemented as planned, and adjust and enhance interventions.

161
Q

From chapter 12 slide, When is process evaluation commonly employed?

A

Process evaluation is commonly employed by new programs.

162
Q

From chapter 12 slide, What does process evaluation assess?

A

Process evaluation assesses whether a new program was implemented as planned and what was learned during program implementation.

163
Q

From chapter 12 slide, At what stage of a program can process evaluation be employed?

A

Process evaluation can be employed at any time during a program’s developmental stage.

164
Q

From chapter 12 slide, What can process evaluation help determine regarding program failure?

A

Process evaluation helps determine whether the failure of the program was due to a poor program design or poor implementation.

165
Q

From chapter 12 slide, What elements can be included in a process evaluation?

A

Process evaluation can include program description, program monitoring, and quality assurance.

166
Q

From chapter 12 slide, What can process evaluations help fine-tune?

A

Process evaluations can fine-tune the service delivery process.

167
Q

From chapter 12 slide, How can process evaluations help identify interventions?

A

Process evaluations help identify which interventions work best for whom.

168
Q

From chapter 12 slide, What kind of profiles can process evaluations provide?

A

Process evaluations provide a clear client and staff profile.

169
Q

From chapter 12 slide, What aspect of a program does process evaluation assess regarding its implementation?

A

Process evaluation assesses program fidelity, ensuring the program is being implemented as intended.

170
Q

From chapter 12 slide, What is the purpose of program description in process evaluation?

A

Program description documents the operations of a program and provides necessary data to judge the intensity and reliability with which services are delivered.

171
Q

From chapter 12 slide, How is program monitoring important in process evaluation?

A

Program monitoring helps understand what happened in a program and to whom, ensuring that the program is serving those for whom it was designed, and it tracks progress toward meeting expectations.

172
Q

From chapter 12 slide, What types of data does program monitoring rely on?

A

Program monitoring relies heavily on data captured by agencies, including face-to-face and telephone interviews, surveys, key informant interviews, focus groups, organization record analysis, program documentation analysis, observations, and case studies.

173
Q

From chapter 12 slide, What is the purpose of quality assurance in process evaluation?

A

Quality assurance evaluates compliance with a set of standards, often focusing on the process of treatment, identifying and correcting deficiencies, and ensuring adherence to guidelines and accountability.

174
Q

From chapter 12 slide, How does quality assurance differ from program evaluation?

A

Quality assurance focuses on the process of treatment, rather than outcomes, and is often driven by legislative mandates to promote consistency and treatment fidelity.

175
Q

From chapter 12 slide, What is Step 1 in the process evaluation process?

A

Step 1 is deciding what questions to ask.

176
Q

From chapter 12 slide, What is Step 2 in the process evaluation process?

A

Step 2 is developing data collection instruments.

177
Q

From chapter 12 slide, What is Step 3 in the process evaluation process?

A

Step 3 is developing a data collection monitoring system.

178
Q

From chapter 12 slide, What is Step 4 in the process evaluation process?

A

Step 4 is scoring and analyzing data.

179
Q

From chapter 12 slide, What is Step 5 in the process evaluation process?

A

Step 5 is developing a feedback system.

180
Q

From chapter 12 slide,
What is Step 6 in the process evaluation process?

A

Step 6 is reporting and using the findings, which involves summarizing the results, interpreting the data, and sharing the findings with stakeholders to inform decisions and improvements for the program.

181
Q

From chapter 12 slide,
What are some key questions to ask when deciding what to evaluate in process evaluation?

A

Key questions include:
  • What is the program background?
  • What is the client background?
  • What is the staff profile?
  • How much service is provided to the client?
  • What are the program interventions and activities?
  • What administrative supports are in place?
  • What is stakeholder satisfaction?

182
Q

From chapter 12 slide, How do process evaluation questions differ from needs assessment and outcome questions?

A

Process evaluation questions focus on program implementation (e.g., service delivery, client background, staff profile), while needs assessment questions identify the need for a program, and outcome questions assess the effectiveness of the program in addressing the identified problem.

183
Q

From chapter 12 slide, Provide an example of a process-oriented question you could ask about your program.

A

An example of a process-oriented question could be: “How effectively are the interventions being delivered to the clients, and are they receiving the intended amount of service?”

184
Q

From chapter 12 slide, What should be considered when monitoring stakeholder participation in process evaluation?

A

Consider who is being monitored and how, particularly accounting for barriers to participation, especially for vulnerable or underprivileged groups.

185
Q

From chapter 12 slide, How are data collection procedures developed in process evaluation?

A

Data collection procedures are developed with input from key stakeholders, considering who has a say in what data is collected and how.

186
Q

From chapter 12 slide, What is the concept of saturation in data collection?

A

Saturation refers to determining how many people need to be engaged to answer evaluation questions, often achieved through periodic analysis during data collection.

187
Q

From chapter 12 slide, What is data fatigue, and why should it be considered in process evaluation?

A

Data fatigue occurs when participants are overtaxed with data collection or monitoring, which can lead to reduced participation or accuracy in responses.

188
Q

From chapter 12 slide, What should be considered when developing data collection instruments and monitoring systems?

A

Factors to consider include ensuring consistency in data collection, addressing sampling concerns, and maintaining reflexivity, or awareness of implicit biases that could affect the analysis.

189
Q

From chapter 12 slide, What is reflexivity in the context of analyzing data?

A

Reflexivity refers to being aware of implicit biases during data analysis and developing strategies to mitigate those biases.

190
Q

From chapter 12 slide, What factors should be considered when disseminating and communicating results?

A

Factors to consider include:
  • How to share the results
  • When to share the results
  • Where to share the results
  • With whom to share the results

191
Q

From chapter 12 slide, Why is it important to share process data?

A

Process data is only helpful if shared, as it enables stakeholders to understand and act on the findings.

192
Q

From Chapter 13 slide, What is the focus of outcome evaluations?

A

Outcome evaluations are focused on the outcomes, specifically what is expected to be seen as a result of the program’s activities.

193
Q

From Chapter 13 slide, What is one purpose of outcome evaluation?

A

One purpose of outcome evaluation is to improve program services to clients.

194
Q

From Chapter 13 slide, How does outcome evaluation assist in decision-making?

A

Outcome evaluation provides feedback for decision-making, helping to guide future program actions and adjustments.

195
Q

From Chapter 13 slide, What knowledge does outcome evaluation generate?

A

Outcome evaluation generates knowledge for the profession, contributing to broader insights and practices.

196
Q

From Chapter 13 slide, What is Step 1 in outcome evaluation?

A

Step 1 is operationalizing program objectives.

197
Q

From Chapter 13 slide, What is Step 2 in outcome evaluation?

A

Step 2 is operationalizing variables and stating the outcomes.

198
Q

From Chapter 13 slide, What is Step 3 in outcome evaluation?

A

Step 3 is designing a monitoring system.

199
Q

From Chapter 13 slide, What is Step 4 in outcome evaluation?

A

Step 4 is analyzing and displaying data.

200
Q

From Chapter 13 slide, What is Step 5 in outcome evaluation?

A

Step 5 is developing a feedback system.

201
Q

From Chapter 13 slide, What is Step 6 in outcome evaluation?

A

Step 6 is disseminating and communicating results.

202
Q

From Chapter 13 slide, What is a theoretical theory in the context of program evaluation?

A

A theoretical theory is a reasoned set of propositions, derived from and supported by established data, which serves to explain a group of phenomena.

203
Q

From Chapter 13 slide, What is a conceptual theory of change?

A

A conceptual theory of change is a representation of how you believe change will occur within your program.

204
Q

From Chapter 13 slide, What is an operational logic model?

A

An operational logic model is a systematic and visual way to present the perceived relationships among the resources available, the activities planned, and the changes or results hoped to be achieved.

205
Q

From Chapter 13 slide, How is evaluation ideally linked to practice?

A

Evaluation is ideally integrated with practice, so that the evaluation process informs and improves program implementation.

206
Q

From Chapter 13 slide, What is operationalization in the context of program evaluation?

A

Operationalization is the explicit specification of a program’s objectives in a way that makes measurement of each objective possible.

207
Q

From Chapter 13 slide, What is an operational definition?

A

An operational definition is a clear and specific description of an objective that allows for measurable assessment.

208
Q

From Chapter 13 slide, What is an indicator in program evaluation?

A

An indicator is a specific, observable, and measurable characteristic or change that shows the progress a program is making towards a specified outcome.

209
Q

From Chapter 13 slide, Why is sampling often necessary in program evaluation?

A

Sampling is often necessary because of the population’s size or because participation depends on participants’ choice; ideally, everyone would be included.

210
Q

From Chapter 13 slide, How does the timing of data collection affect the questions that can be answered?

A

The timing of data collection affects the types of questions that can be answered:
  • Pre- and post-program data are needed for questions on change.
  • Follow-up data is necessary for questions on sustained outcomes.
  • Longitudinal data is required for questions on causation.
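
A small numeric sketch of why timing matters, assuming hypothetical pre, post, and follow-up scores on a single outcome measure (all values invented):

```python
# Hypothetical scores on one outcome measure for three clients.
# Pre/post pairs speak to change; the follow-up column is what lets us
# ask whether that change was sustained after program exit.
scores = {
    "client_a": {"pre": 40, "post": 55, "follow_up": 53},
    "client_b": {"pre": 35, "post": 50, "follow_up": 41},
    "client_c": {"pre": 45, "post": 60, "follow_up": 59},
}

n = len(scores)
avg_change = sum(s["post"] - s["pre"] for s in scores.values()) / n
avg_sustained = sum(s["follow_up"] - s["pre"] for s in scores.values()) / n

print(f"Average pre-to-post change: {avg_change:.1f}")
print(f"Average change still present at follow-up: {avg_sustained:.1f}")
# Questions about causation would additionally require longitudinal data
# and a design that can rule out rival explanations.
```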

211
Q

From Chapter 13 slide, What are some factors to consider when deciding when data will be collected?

A

Factors to consider include:
  • What method of data collection will be used (e.g., phone, in person)
  • Who will collect the data and whether they are impartial

212
Q

From Chapter 13 slide, What is important when deciding how data will be collected?

A

It is important to determine the method of collection (e.g., phone, in person), who will collect the data, and whether the person collecting the data is impartial.

213
Q

From Chapter 13 slide, How does the timing of data collection affect the types of questions that can be answered?

A

The timing of data collection affects the types of questions that can be answered:
  • Questions on change require pre- and post-program data.
  • Questions on sustained outcomes require follow-up data.
  • Questions on causation require longitudinal data.

214
Q

From Chapter 13 slide, What is an example of a question on change?

A

“To what extent do participants in the Life Remodeled Youth Immersion Program learn about race relations in Phoenix during their participation in the program?” (Requires pre and post-program data)

215
Q

From Chapter 13 slide, What is an example of a question on sustained outcomes?

A

“To what extent do Atlantic Impact participants use the information learned in high school programming during their first year of college?” (Requires follow-up data)

216
Q

From Chapter 13 slide, What is an example of a question on causation?

A

“How does the experience of participating in Atlantic Impact inform students’ matriculation through higher education?” (Requires longitudinal data)

217
Q

What are the differences between process and outcome evaluations?

A

Process evaluations assess how services are delivered and whether they meet expectations, while outcome evaluations measure the effectiveness of the program in achieving long-term changes or improvements in clients’ conditions.

220
Q

How do process and outcome evaluations relate to a logic model?

A

In a logic model, process evaluations correspond to the activities and outputs columns, focusing on service delivery, while outcome evaluations relate to the outcomes and impacts columns, focusing on client changes and program impact.
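
As a small sketch, the mapping described above can be written out explicitly; the wording of each column is a paraphrase of the cards in this deck, not a fixed structure:

```python
# Logic-model columns paired with the evaluation focus described above.
logic_model = [
    ("inputs",     "resources",                     None),
    ("activities", "what the program does",         "process evaluation"),
    ("outputs",    "services that are delivered",   "process evaluation"),
    ("outcomes",   "changes for clients",           "outcome evaluation"),
    ("impacts",    "changes for community/society", "outcome evaluation"),
]

for column, meaning, evaluation in logic_model:
    focus = evaluation or "context for both evaluation types"
    print(f"{column:<10} -> {meaning:<30} [{focus}]")
```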