Study Group - Evaluation & Research Flashcards

1
Q

What are nonintervention costs of programs?

A

Resource costs associated with the program that are not part of delivering the intervention itself (e.g., participants' time & travel)

2
Q

What are the parameters for addressing utilization of findings?

A
  • Study design
  • Preparation
  • Possible feedback
  • Follow through
  • Information distribution
  • Any possible further uses of information
3
Q

Attributes for Evaluation Recommendations

A
  • Defensible
  • Timely
  • Realistic
  • Targeted
  • Simple
  • Specific
4
Q

What are the cost expenditure categories in CEA?

A
  1. Developmental
  2. Production
  3. Implementation
  4. Evaluation
5
Q

Cost-Minimization Analysis (CMA)

A

Type of CEA in which program A & program B have identical outcomes

6
Q

How does cost-effectiveness analysis help decision makers?

A

Allocation of limited resources while still achieving desired health benefits

7
Q

Where are evaluation indicators created from?

A

Logic Model

8
Q

What is Accuracy (evaluation standard)?

A

Provide accurate information for determining merits of program

9
Q

Construct Validity

A

Whether a specific measure of a concept is associated with one or more other measures in a way that is consistent with theoretically derived hypotheses

  • How accurately inferences about specific features of the program reflect the underlying constructs
  • Assumes the underlying theory is correct

10
Q

Meta-Analysis

A

Quantitative technique for combining results from multiple, different evaluations on the same topic

  • Can show whether findings hold across variations in populations, settings, programs, & outcomes
11
Q

What are intervention costs of programs?

A

All resources used in delivery of intervention

12
Q

Translational Research

A

Studying & understanding progression of “bench-to-bedside-to-population”

  • How scientific discoveries lead to efficacy & effectiveness studies, which in turn lead to dissemination into practice
13
Q

_____________ & _____________ aid in program cost effectiveness (in addition to types of analyses).

A
  1. Cost effectiveness ratio
  2. Value threshold
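
Illustrative addition (not from the deck): a minimal Python sketch of how a cost-effectiveness ratio can be computed and judged against a value threshold; all figures are hypothetical.

    # Hypothetical incremental cost-effectiveness ratio (ICER) for program A vs program B,
    # compared against a value threshold.
    cost_a, effect_a = 120_000.0, 40.0   # total cost ($) and health effect of program A
    cost_b, effect_b = 80_000.0, 30.0    # total cost ($) and health effect of program B

    icer = (cost_a - cost_b) / (effect_a - effect_b)   # extra cost per extra unit of effect
    value_threshold = 5_000.0                          # hypothetical benchmark ($ per unit of effect)

    print(f"ICER = ${icer:,.2f} per unit of effect; cost-effective: {icer <= value_threshold}")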
14
Q

Types of Evidence (most to least rigorous)

A
  1. Systematic reviews & meta-analysis
  2. Scientific literature
  3. Public health surveillance data
  4. Program evaluations
  5. Reports from community members & other stakeholders (e.g. needs assessment)
15
Q

What does Implementation Science identify?

A

Identifies factors, processes, & methods that increase likelihood of evidence-based interventions to be adopted & used to sustain improvement in population health

16
Q

What is external use?

A

Benefits decision makers & administrators not connected with the program, e.g., by considering the program in a different setting or deciding how to change a similar program that is not performing well

17
Q

How data is managed is dependent on what?

A
  • Type of data
  • How data is collected
  • How data is used throughout project lifecycle
18
Q

Ordinal measurement & give an example

A

Provides information based on order, sequence, or rank

  • scale from strongly disagree to strongly agree
19
Q

Impact Evaluation

A
  • Focuses on ultimate goal, product, or policy
  • Often measured in terms of HEALTH STATUS, MORBIDITY, & MORTALITY
20
Q

How long is data stored according to security rule?

A

5-10 years

21
Q

What is the difference between qualitative & quantitative data?

A

Qualitative - describes what is occurring or why it is occurring (non-numerically)

Quantitative - Numerical data that describes what is happening

22
Q

What does data screening (found in data analysis plan) allow/tell evaluator/researcher?

A
  • Assesses accuracy of data entry
  • How outliers & missing values will be handled
  • If statistical assumptions are met
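
Illustrative addition (not from the deck): a minimal Python sketch of a data-screening pass, assuming hypothetical survey scores; it flags missing values and z-score outliers for review.

    # Hypothetical data-screening pass: count missing values and flag z-score outliers.
    import statistics

    scores = [72, 75, None, 78, 74, 310, 71, 76]   # raw scores; None = missing, 310 is a likely entry error

    missing = [i for i, x in enumerate(scores) if x is None]
    observed = [x for x in scores if x is not None]
    mean, sd = statistics.mean(observed), statistics.stdev(observed)
    outliers = [x for x in observed if abs(x - mean) / sd > 2]   # |z| > 2 flagged for review

    print(f"missing at indices {missing}; flagged outliers: {outliers}")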
23
Q

In what processes can quantitative & qualitative data be useful?

A
  • Program planning
  • Implementation
  • Evaluation
24
Q

Guidelines for Developing Recommendations

A
  1. Invest time
  2. Start early
  3. Consider all issues as fair game
  4. Cast a wide net
  5. Work closely with decision makers & program staff
  6. Decide whether recommendations should be as general or specific as possible
  7. Consider program context
  8. Consider program closure
  9. Describe expected benefits/costs
  10. Decide whether change should be incremental vs fundamental
  11. Avoid recommending another evaluation
25
Q

What is feasibility (evaluation standard)?

A

Conduct evaluations that are VIABLE & REASONABLE

26
Q

What are delimitations?

A

Parameters or boundaries placed on study by researchers that help manage scope of study

27
Q

What is a purpose statement?

A
  • Tool to identify what is to be learned from evaluation and/or research
  • Serves to focus & steer collection & analysis of data
28
Q

Longitudinal Design

A

data about program collected at 2+ POINTS IN TIME

29
Q

Hawthorne Effect

A

When people in the intervention group change their behavior because they are aware of being observed or become sensitive to the repeated introduction/removal of the intervention

  • Threat to Internal Validity
30
Q

Implementation Evaluation

A

Comprehensive, retrospective determination of extent to which program was delivered as designed & whether variations may have held significant implications for effects of program

  • AKA Process Evaluation
31
Q

What is Instrumental Use?

A

Decision makers change program (expand to other sites, terminate, change how it is implemented) based on answers to evaluation questions

32
Q

Steps of Effective Evaluation

A
  1. Defining research population
  2. Identifying stakeholders & collaborators
  3. Defining evaluation objective
  4. Selecting research design that meets evaluation objective
  5. Selecting variables for measurement
  6. Selecting sampling procedure
  7. Implementing research plan
  8. Analyzing data
  9. Communicating findings
33
Q

Procedural Equity

A

Maximizing fairness in distribution of services across groups

34
Q

Descriptive Design vs Explanatory Design

A

Descriptive: DESCRIBE events, activities, or behavior that occurred (what went on in program)

Explanatory: EXPLAINS events, activities, or behavior that occurred (improve understanding)

35
Q

What is maturation?

A

before-after changes due to changes occurring inside people rather than program

36
Q

Substantive Equity

A

Minimizing disparities in distribution of health across groups or different populations

37
Q

What is a type V error?

A
  • Reporting that an intervention has a statistically significant effect when the effect is too small to matter
  • Of no practical significance to decision makers
38
Q

What are problematic outliers?

A

Outliers not representative of population

39
Q

Bivariable Analysis

A

Determines whether variables in database are correlated with each other

  • Compares 2+ groups to see whether a characteristic is similar/different
  • Find out whether program outcomes are significantly different between 2 groups OR within 1 group over time (impact evaluation)
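
Illustrative addition (not from the deck): a minimal Python sketch of a bivariable comparison of two groups with an independent-samples t-test; the scores are hypothetical and scipy is assumed to be installed.

    # Hypothetical bivariable analysis: are program outcomes different between two groups?
    from scipy import stats

    program_group = [8, 9, 7, 10, 9, 8, 9]       # outcome scores, intervention group
    comparison_group = [6, 7, 5, 7, 6, 8, 6]     # outcome scores, comparison group

    t_stat, p_value = stats.ttest_ind(program_group, comparison_group)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # p < 0.05 suggests a significant group difference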
40
Q

Ratio measurement & give an example

A

Common unit of measurement between each score & a true zero

  • height, weight, age, etc.
41
Q

What are the different types of evaluation designs?

A
  1. one group posttest only
  2. one group pre- & posttest
  3. Comparison group posttest only
  4. two group pre- & posttest
  5. one group time series
  6. Multi-group time series
  7. two group retrospective (case control)
  8. two group prospective (cohort)
  9. two group pre- & posttest with random assignment (RCT)
42
Q

Test-Retest

A

Same measurement administered at 2 points in time

43
Q

What should be tested/assessed for when considering using existing data collection instruments? Why?

A

Literacy/reading level (whether using or adapting the instrument), to ensure validity of responses

44
Q

What are intangible benefits?

A

Non-monetary, subjective, or difficult to measure gains attributable to program intervention

45
Q

Cost-Effectiveness Analysis (CEA)

A

Determines differences between 2 programs based on what it costs for delivery of the programs

  • Relationship b/w program cost (input) & impact (output)
46
Q

Internal Reliability/Consistency

A

Consistency across the multiple/all items of an instrument in measuring what it is meant to measure

47
Q

What analyses provide information on program cost effectiveness?

A
  1. Cost-benefit Analysis (CBA)
  2. Cost-minimization Analysis (CMA)
  3. Cost-utility Analysis (CUA)
  4. Sensitivity Analysis
48
Q

Evaluations should be __________________

A

Useful, feasible, ethical, accurate, & accountable

49
Q

History (threat to internal validity)

A

Before-after changes are due to other factors in the environment rather than the program

50
Q

IRB

A

Group of individuals that review potential research proposals that involve human subjects/participants

  • Approval must be granted prior to beginning data collection

Institutional Review Board

51
Q

Formative Evaluation

A

Conducted before program begins

  • designed to produce data & information used to improve program during developmental phase
  • Documents appropriateness & feasibility of program implementation
  • ensure fidelity of program
52
Q

Causal Inference

A

Intellectual discipline that considers assumptions, study designs, & estimation strategies

  • Allows researchers to draw conclusions based on data
53
Q

What data should be demonstrated/included when using it for policy analysis?

A
  • Burden of health of public
  • Priority over other issues
  • Pertinence at local level
  • Interventional benefits
  • Personalization of issue by using stories about how lives are impacted
  • Estimated intervention costs
54
Q

What are the characteristics of indicators to ensure credibility?

A
  1. Clearly linked to intervention outcome
  2. Presented in specific, measurable terms
  3. Appropriate for population being served
  4. Feasible given data collection, resources, & skills
  5. Valid & reliable to stakeholders
55
Q

Data collection must be _____________ by decision makers & stakeholders

A

Relevant

56
Q

What are the research/evaluation errors HES should be able to identify?

A
  1. Sampling errors
  2. Lack of precision
  3. Variability of measurement
  4. Selection bias
  5. Instrumental bias
  6. Internal threats to validity
57
Q

Embedded Design

A

Either qualitative or quantitative has priority or is more vital for answering main question of evaluation

58
Q

What is a type III error?

A

Rejecting a program as ineffective when the program was never implemented as intended or methodological flaws undermined program effectiveness

59
Q

Summative Evaluation

A

Evaluation occurs after program has ended

  • designed to produce data on program’s efficacy or effectiveness during implementation
  • Provides data on extent of achievement of goals regarding learning experience
60
Q

What should evaluator consider when choosing evaluation design?

A
  1. Causality
  2. Bias
  3. Retrospective vs Prospective
  4. Time span
  5. Finances
  6. Current political climate
  7. # of participants
  8. Type of data being collected
  9. Data analysis & skills
  10. Access to group to use for comparative purposes
  11. Possibility to distinguish b/w exposed & unexposed to program intervention
  12. Type of outcome being evaluated (unbound vs bound)
61
Q

Considerations for data collection implementation

A
  1. Find reliable, trustworthy, & skilled people to collect, enter, analyze, & manage data
  2. Define roles, responsibilities, & skills needed
  3. Monitor data collection
  4. Maintain integrity of data collected
  5. Ensure protocols address quality control measures
62
Q

Dissemination

A

spreading information widely

  • new research findings take an average of 17 years to be widely implemented in practice
63
Q

Interval measurement & give an example

A

Common unit of measurement with no true zero

  • Temperature
64
Q

What are the advantages to using existing data collection instruments?

A
  • Previously tested for reliability & validity
  • Direct comparison measures
  • Reduced cost
  • User familiarity
65
Q

Interrater Reliability

A

Correlation between different observers at same point in time
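
Illustrative addition (not from the deck): a minimal Python sketch quantifying agreement between two observers; Cohen's kappa is one common interrater statistic, and the ratings shown are hypothetical.

    # Hypothetical interrater check: two observers code 10 sessions as delivered (1) or not (0).
    rater_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    rater_2 = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]

    n = len(rater_1)
    observed = sum(a == b for a, b in zip(rater_1, rater_2)) / n      # observed agreement
    p1, p2 = sum(rater_1) / n, sum(rater_2) / n
    expected = p1 * p2 + (1 - p1) * (1 - p2)                          # agreement expected by chance
    kappa = (observed - expected) / (1 - expected)                    # Cohen's kappa
    print(f"observed agreement = {observed:.2f}, kappa = {kappa:.2f}")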

66
Q

SWOT Analysis

A

Assesses internal & external environment

67
Q

Convergent Design

A

basic steps in evaluation process implemented independently at same time

68
Q

What is missing data?

A

Observations that were intended to be made but were not

69
Q

What does HIPAA protect?

A

All information in health records, billing, & conversations among individuals & healthcare providers

70
Q

Continuous Quality Improvement (CQI)

A

Tool to reduce costs while improving quality of services

  • enhances organizational effectiveness
71
Q

What are direct costs of programs?

A

All goods, services, & other resources used to deliver intervention

72
Q

Internal Validity

A

Degree program caused change that was measured

  • Were changes in participants due to program or by chance?
73
Q

Population Effectiveness

A

Improving health of populations & communities through medical and/or non-medical services

74
Q

What are baseline indicators?

A

Value of indicator prior to implementation

75
Q

Sequential Design

A

Basic steps in evaluation process implemented sequentially (either qualitative or quantitative first)

76
Q

Why would HES make modifications to content, format, or presentation of question, questionnaire, or instrument?

A
  • Adapting to data needs
  • To have results that are more versatile & useful
77
Q

What does most appropriate data collection instrument depend on?

A
  • Intent of program
  • Intent of evaluation
  • Information being acquired
78
Q

What can cause lesser approach (compared to most rigorous available) to be used in either research or evaluation?

A
  1. Ethics
  2. Cost
  3. Politics
  4. Availability of resources
79
Q

What is a type IV error?

A
  • Evaluation is conducted for sake of evaluation
  • Questions are asked about program that no one cares about
  • Answers are of no interest to decision makers
80
Q

Steps for Conducting CEA

A
  1. Define problem & objectives
  2. Identify alternatives
  3. Describe production relationships
  4. Define perspective/viewpoint of CEA
  5. Identify, measure, & value cost
  6. Identify & measure effectiveness
  7. Discount future costs & effectiveness
  8. Conduct sensitivity analysis
  9. Address equity issues
  10. Use CEA results in decision making
81
Q

What are target indicators?

A

Expected value of indicator at a specific point in time

82
Q

Random Selection

A

Random identification from intended population of those who will be in program and/or evaluation

83
Q

Ethical Standards of Participant Data

A
  1. Respect for autonomy
  2. Social justice
  3. Promotion of good & avoidance of harm
  4. Have evaluation/research plan that protects privacy of participants
  5. Participant data must be stored, utilized, & disclosed ensuring protection of participant privacy
84
Q

Types of Program costs

A
  • Direct costs
  • Intervention costs
  • Indirect costs
  • Nonintervention costs
  • Cost savings vs future costs as result of program/implementation
85
Q

Multi-Phase Design

A

Evaluations divided into multiple parts/phases that are implemented over time

86
Q

What do performance measures require?

A
  1. Object
  2. Standard - accepted level of performance expected
  3. Indicator - determines whether performance standard is achieved
  4. Measure - quantitative representation of capacity, process, or outcome
87
Q

What is a type II error?

A

Inferring program has no impact when it does (occurs when sample size is too small)

88
Q

External Validity

A

Generalizability of results beyond participants

  • Would results be the same with different target population?
89
Q

Social Desirability Effect

A

Bias that occurs when people answer questions in a way they think will make them seem favorable to others

  • Threat to internal validity
90
Q

What is Conceptual Use?

A
  • Evaluations produce new information about what goes on in the program through answers to questions raised about a program
  • Reveals insights about program (what they think of the program, understanding the importance of program) in addressing underlying problem
91
Q

Systems-Analysis Evaluation Model

A

Uses instruments that serve to quantify program’s effects

92
Q

Standards/Steps of Evaluation

A
  1. Engage stakeholders
  2. Describe program
  3. Focus evaluation design
  4. Gather credible evidence
  5. Justify conclusions
  6. Ensure use & share lessons
93
Q

CBPR

A

Community-Based Participatory Research: research in which evaluators collaborate with community members

  • Improves likelihood of success & stronger impact with target population
94
Q

Data Management Plan

A
  1. Procedures for transferring data from instruments to data analysis software
  2. Scoring guide to tell researcher/evaluation team how to code variables
95
Q

Security Rule

A
  • Establishes rules for safeguarding information
  • Requires IRB
  • Guidance provided by Code of Federal Regulations
96
Q

What measurements are used in descriptive data?

A

Frequency, mean, median, mode

97
Q

Response Bias

A

Intentional or unconscious way individuals select responses

98
Q

Advantages of Meta-Analysis

A
  1. Ability to tell if results are more varied than expected
  2. Derived statistical testing of overall factors/effect size in related studies
  3. Potential generalization to population of studies
  4. Ability to control & use moderators to explain variations between studies
99
Q

Measurement vs Classification

A

Measurement - process of sorting & assigning #s to people in quantitative evaluations

Classification - assigning people into set of categories in qualitative evaluations

100
Q

What aspects are included in an effective evaluation report?

A
  • Timely provision
  • Effective & detailed summary of how stakeholders were involved
  • List of strengths & limitations/weaknesses of findings
101
Q

Evidence-based approach in findings & scientific evidence need to be incorporated into what areas of health programming?

A
  • Decision making
  • Policy development
  • Program implementation
102
Q

What is utility (evaluation standard)?

A

Ensure information needs of intended users are satisfied

103
Q

Unbounded vs Bounded Outcomes

A

Unbounded - outcomes that can exist before and/or after the program

Bounded - outcomes that can only occur once, at a particular time or tied to a specific event

104
Q

What are limitations?

A

Boundaries placed on study by factors or people other than researcher

105
Q

Allocative efficiency

A

Combining inputs to produce maximum health improvements given available resources

106
Q

Measurement tools must be ____________

A

Valid & Reliable

107
Q

Types of Designs (most to least rigorous)

A
  1. Systematic reviews
  2. RCT
  3. Cohort
  4. Case-control
  5. Case series, case reports
  6. Editorials, expert opinion
108
Q

Reliability

A

Whether results can be measured consistently (can be reproduced under similar circumstances)

109
Q

Mediation Analysis

A

Identification of pathway between health promotion program, its impact on hypothesized psychosocial mediators, & its effects on behavioral outcomes

110
Q

Informal Interviewing

A

Open-ended conversation with goal of understanding program from respondent's perspective

  • Continues until no new information is gathered & there is full understanding of the program
111
Q

Types of Program Benefits

A
  • Tangible benefits
  • Intangible benefits
  • Economic
  • Personal health
  • Social
112
Q

Discriminant Validity

A

Measures of different concepts are not (or are only weakly) correlated with each other

  • type of construct validity
113
Q

What is a sensitivity analysis?

A

systematic approach for determining whether CEA yields same results if different assumptions are made
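
Illustrative addition (not from the deck): a minimal Python sketch of a sensitivity analysis that recomputes a cost-effectiveness ratio under different cost assumptions; all figures are hypothetical.

    # Hypothetical sensitivity analysis: does the CEA conclusion change under low/base/high cost assumptions?
    effect_gain = 10.0                                                 # incremental health effect (assumed fixed)
    cost_assumptions = {"low": 30_000.0, "base": 40_000.0, "high": 60_000.0}
    value_threshold = 5_000.0                                          # hypothetical benchmark ($ per unit of effect)

    for scenario, incremental_cost in cost_assumptions.items():
        icer = incremental_cost / effect_gain
        print(f"{scenario}: ICER = {icer:,.0f} -> cost-effective: {icer <= value_threshold}")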

114
Q

What is construct confounding in regard to construct validity?

A

Failure to define all constructs may result in incomplete construct inferences or confusion among constructs

115
Q

What are evaluation questions designed to do?

A
  1. Designated boundaries for evaluation
  2. Determine what areas of program are the focus
116
Q

What is attrition?

A

differences between program and another group due to loss of people from either or both groups rather than the program

117
Q

Limitations in Comparing Evaluation Results

A
  1. Examine & analyze data to look for patterns, recurring themes, similarities/differences
  2. Address patterns or lack of patterns that justify/don’t justify answers to evaluation questions
  3. Possible reasons for deviations in established patterns
  4. Study how patterns are supported/negated by previous studies or evaluations
118
Q

Process Evaluation

A
  • Any combination of measures that occurs as program is implemented
  • Ensures or improves quality of performance or delivery
  • Assesses how much intervention was provided (dose), to whom, when, & by whom
119
Q

Cost savings vs Future costs of programs?

A

Cost Savings - savings that occur from prevention or alleviation of disease

Future costs - costs of disease unrelated to intervention

120
Q

Placebo Effect

A

Individual’s health improves after taking fake treatment

  • In control group and they think they are receiving intervention
  • Threat to Internal Validity
121
Q

What are different types of evaluation indicators?

A
  1. Baseline
  2. Target
122
Q

Ways to Measure Reliability

A
  1. Test-Retest
  2. Internal Reliability/Consistency
  3. Split-Half Method
123
Q

Decision-Making Evaluation Model

A
  • Uses instruments that focus on elements that yield context, input, processes, & products to use when making decisions
  • Evaluates criteria that are used for making administrative decisions in the program
124
Q

Split-Half Method

A
  • 2 parallel forms/halves administered at the same point in time
  • Correlation calculated b/w them
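
Illustrative addition (not from the deck): a minimal Python sketch of the split-half approach, correlating respondents' scores on the two halves and applying the Spearman-Brown correction; the scores are hypothetical and statistics.correlation requires Python 3.10+.

    # Hypothetical split-half reliability check with Spearman-Brown correction.
    import statistics

    half_a = [12, 15, 9, 14, 11, 13, 10, 15]   # each respondent's total on one half of the items
    half_b = [11, 14, 10, 15, 12, 12, 9, 14]   # each respondent's total on the other half

    r = statistics.correlation(half_a, half_b)     # Pearson r between the halves (Python 3.10+)
    spearman_brown = (2 * r) / (1 + r)             # estimated reliability of the full-length instrument
    print(f"half-to-half r = {r:.2f}, corrected reliability = {spearman_brown:.2f}")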
125
Q

Nonresponse Bias

A
  • Lack of responses
  • Failure of providing data
  • May be due to attrition
126
Q

Efficiency

A

How well program and/or intervention can produce positive results

fewer inputs + higher outputs = MORE EFFICIENT

127
Q

How is value threshold used?

A

Determining & allocating resources to intervention rather than another program

128
Q

Cost-Utility Analysis (CUA)

A

Type of CEA in which outcomes of program A & program B are weighted by their value/quality

129
Q

When are evaluation findings justified?

A

When they are linked to evidence gathered & judged against agreed upon values or standards set by stakeholders

130
Q

Threats to Internal Validity

A
  1. Ambiguous Temporal Precedence
  2. History
  3. Maturation
  4. Testing
  5. Instrumentation
  6. Regression Artifacts
  7. Selection
  8. Attrition
  9. Expectancy threat
  10. Hawthorne effect
  11. Social desirability
  12. Placebo effect
131
Q

What are performance measures?

A

Indicators of process, output, or outcomes that have been developed for use as standardized indicators by health programs, initiatives, practitioners or organizations

132
Q

What is inadequate explanation of constructs in regard to construct validity?

A

Failure to adequately explicate construct may lead to incorrect inferences

133
Q

What is the goal of longitudinal designs?

A

Track changes in factors over time

134
Q

What is goal of CMA?

A

Determine which program has lower cost

135
Q

Implementation Documentation

A

Collecting data specified in process objectives carried out to demonstrate extent of program implementation to FUNDERS

136
Q

What is a disadvantage to using existing data collection tools?

A

Potential for unreliable measures with different population demographics & situations

137
Q

What is a value threshold?

A

Benchmark for designating whether service is cost effective

138
Q

Ambiguous Temporal Precedence

A

Lack of clarity about whether the treatment occurred before the outcome (which came first)

139
Q

What does implementation assessment provide?

A
  • Managerial guidance & oversight
  • Informs decision making to which aspects of organizational or service utilization plan are ineffective in accomplishing process objectives
140
Q

Content Validity

A

Assesses whether test is representative of all aspects of construct

141
Q

Triangulation

A

Examines changes or lessons learned from different points of view or in different ways

142
Q

What is vote counting?

A

Defines findings as significantly positive/negative OR nonsignificant

143
Q

What is process use?

A

Engagement of designing & conducting evaluation that may lead to better understanding & new ways of thinking about the program

144
Q

Intrarater Reliability

A

Correlation between observations made by same observer at different points in time

145
Q

Cost-Benefit Analysis (CBA)

A

Method of economic evaluation in which all benefits & costs of the program are measured (typically in monetary terms)
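
Illustrative addition (not from the deck): a minimal Python sketch of a cost-benefit summary once all costs and benefits have been valued in dollars; the figures are hypothetical.

    # Hypothetical cost-benefit summary: net benefit and benefit-cost ratio.
    program_costs = 250_000.0        # total program costs ($)
    program_benefits = 400_000.0     # monetized benefits ($), e.g., averted treatment costs

    net_benefit = program_benefits - program_costs
    benefit_cost_ratio = program_benefits / program_costs
    print(f"net benefit = ${net_benefit:,.0f}, benefit-cost ratio = {benefit_cost_ratio:.2f}")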

146
Q

Cross Sectional Design

A

Data about program, events, activities, behaviors, & other factors collected at ONE POINT IN TIME

147
Q

Goal-Free Evaluation Model

A

Instruments provide all outcomes (including unintentional positive/negative outcomes)

148
Q

What needs to be considered when proposing possible explanations of findings?

A
  • Standards
  • Analysis & Synthesis
  • Interpretation
  • Judgements
  • Recommendations
149
Q

Nonrandom Error

A

Measure is systematically higher or lower than true score

150
Q

What should HES consider when using existing data collection instruments?

A
  • If item is appropriate for intended purpose
  • If language is appropriate for population
  • Whether test has been performed using sample from intended audience
151
Q

Outcome Evaluation

A
  • Short term, immediate, & observable effects of program leading to desired outcomes
  • What changed about public health problem?
152
Q

Implementation Assessment

A

Ongoing, nearly real-time activity of collecting data for purpose of making timely corrections or modifications to implementation through changes to elements of process theory

  • AKA Program or Process Monitoring

153
Q

What is instrumentation in regard to threat to internal validity?

A

Before-after changes due to changes in the instrument or those administering the instrument rather than the program

154
Q

What is descriptive data used for?

A

To reduce a large quantity of data into a few summary measures that describe the data distribution

155
Q

What are evaluation indicators of program?

A

Information or statistics that provide evidence of progress toward outcomes

156
Q

What is mono-operation bias?

A

Inferences are complicated when the operationalization of a construct both underrepresents the construct of interest & measures irrelevant constructs

157
Q

Attainment Evaluation Model

A

Uses evaluation standards & instruments based upon elements that yield the objectives & goals of the program

158
Q

What should HES/researcher consider when choosing method for data collection?

A
  • Specifically target most important elements of study
  • Clearly prove or disprove hypothesis
  • Appropriate to scale of study
  • Do not cost too much or require too much time
159
Q

Clinical Effectiveness

A

Improving health of individual patients through medical care services

160
Q

How is evaluation used in needs assessment?

A
  • Evaluating primary, secondary data, observations, & interviews
  • Evaluating literature
161
Q

What are 2 approaches to meta-analysis?

A

Vote-Counting & Classic (or Glassian) meta-analysis

162
Q

Quality Assurance

A

Using minimum acceptable requirements for processes & standards for outputs

163
Q

What are tangible benefits?

A

Benefits that are quantifiable & measurable

164
Q

Measurement Reliability

A

Extent to which the same measure gives the same results on repeated applications (i.e., is free of random error)

165
Q

what is selection in regard to internal validity?

A

Difference between program & another group due to differences in people in the groups rather than the program

166
Q

Multiple Method vs Mixed Method Designs

A

Multiple Method: combining qualitative & quantitative DATA to answer evaluation questions

Mixed Method: combining qualitative & quantitative METHODS to answer evaluation questions

167
Q

How is evaluation used in program implementation?

A

Evaluating progress of program based on health indicators

168
Q

What is regression artifacts in regard to internal validity?

A

If subjects are selected on basis of their extreme score, before-after changes may be affected partly by extreme scores naturally shifting toward mean

169
Q

Statistical Significance

A

Likelihood that one would get the result by chance

  • 0.05 (usually used)
170
Q

What does descriptive data describe?

A

Data that answer a question

171
Q

Random Assignment

A

Process of determining on random basis who does & does not receive health program/intervention
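
Illustrative addition (not from the deck): a minimal Python sketch of simple random assignment, shuffling a hypothetical participant list and splitting it into intervention and control groups.

    # Hypothetical random assignment: shuffle recruited participants and split them evenly.
    import random

    participants = [f"participant_{i}" for i in range(1, 21)]
    random.shuffle(participants)                      # randomize the order
    half = len(participants) // 2
    intervention_group, control_group = participants[:half], participants[half:]
    print(len(intervention_group), "assigned to intervention,", len(control_group), "to control")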

172
Q

What is testing in regard to threat to internal validity?

A

Before-after changes due to giving pretest rather than program

173
Q

CDC evaluation standards

A

Utility, Feasibility, Propriety, Accuracy

174
Q

Why is process evaluation important?

A
  1. Understanding internal & external forces that can impact activities of program
  2. Maintain and/or improve quality & standards of program performance and delivery
  3. May serve as documentation of provisions & success of those provisions of program
175
Q

What are things to monitor/evaluate to ensure efficiency & effectiveness?

A
  1. Simplicity
  2. Flexibility
  3. Acceptability
  4. Sensitivity (proportion of disease)
  5. Predictive value positive
  6. Representativeness
  7. Timeliness
  8. Stability
176
Q

What can correlates (relationships b/w variables where one is affected/dependent on another) be derived from?

A
  • Reach & effectiveness
  • Size of effect
177
Q

Concurrent Validity

A

Assesses degree measure correlates with an already validated measure

  • Type of Criterion Validity
  • constructs may be same or different
  • Related constructs
178
Q

what can Process Use provide?

A

Collaboration of different perspectives among interest groups

179
Q

What is Classic or Glassian Meta-Analysis?

A
  • Defines questions to be examined
  • Collects studies
  • Codes study features & outcomes
  • Analyzes relations b/w study features & outcomes
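
Illustrative addition (not from the deck): a minimal Python sketch of the analysis step, pooling hypothetical study effect sizes with fixed-effect, inverse-variance weights.

    # Hypothetical fixed-effect meta-analysis: pool effect sizes with inverse-variance weights.
    studies = [
        (0.40, 0.10),   # (effect size, variance) for study 1
        (0.25, 0.05),   # study 2
        (0.55, 0.20),   # study 3
    ]
    weights = [1 / var for _, var in studies]
    pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
    print(f"pooled effect = {pooled:.3f}, pooled variance = {1 / sum(weights):.3f}")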
180
Q

What is propriety (evaluation standard)?

A

Behave legally, ethically, & with regard for welfare of participants of program and those affected by program

181
Q

Production Efficiency

A

Combining inputs to produce services at lowest cost

182
Q

What is Persuasive Use?

A

Evaluation results used to support or criticize program

183
Q

What should HES do when only using part of data collection instrument to maintain validity?

A
  • Aspects of questions should be retained
  • Give credit for using item/collection tool
184
Q

What does SMOG stand for?

A

Simple Measure of Gobbledegook
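
Illustrative addition (not from the deck): a minimal Python sketch of the published SMOG grade formula; the word and sentence counts are hypothetical.

    # SMOG reading grade = 1.0430 * sqrt(polysyllabic words * (30 / sentences)) + 3.1291
    import math

    polysyllabic_words = 24   # words of 3+ syllables counted in the sampled sentences (hypothetical)
    sentences = 30            # number of sentences sampled (hypothetical)

    smog_grade = 1.0430 * math.sqrt(polysyllabic_words * (30 / sentences)) + 3.1291
    print(f"SMOG reading grade = {smog_grade:.1f}")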

185
Q

What are the field procedures for collecting data?

A
  • Protocols for scheduling initial contacts with respondents
  • Introducing instrument to respondent
  • Keeping track of individuals contacted
  • Follow up with non-respondents (when appropriate)
186
Q

What should performance measures be aligned with?

A

Objectives

187
Q

What are indirect costs of programs?

A

Lost or impaired ability to work or engage in leisure activities as a direct result of intervention

188
Q

Divergent Validity

A

A measure of a construct does not correlate with other measures that it should not be related to

189
Q

HIPAA

A

Federal regulations for protection of privacy of participant data

190
Q

Multivariable Analysis

A

Estimates size & direction of program’s effect in randomized & non-randomized study designs with treatment and control group

191
Q

What are beneficial outliers?

A

Outliers that are representative of population

192
Q

Effectiveness

A

Degree of how successful program is in producing desired result

193
Q

What are multivariate outliers?

A

Unusual combinations of scores on different variables

194
Q

Clinical Significance

A

Likelihood intervention is to have noticeable benefit to participants

195
Q

Validity

A

Accuracy of measurement (do results represent what should be measured)

196
Q

Data Analysis Plan

A
  • How data will be scored & coded
  • How missing data will be managed
  • How outliers will be handled
  • Data screening
197
Q

Convergent Validity

A

Measures of the same concept are correlated with each other

  • type of construct validity
198
Q

What is mono-method bias?

A

When a construct is measured using only one method, so that the method itself becomes part of the construct being measured

199
Q

How does effective data management help evaluator/researcher?

A
  • Organization of data
  • Ability to access data
  • Analysis of data
  • Ensures quality of research
  • Supports published results
200
Q

How can evaluators have less bias in their data collection?

A

use evaluation questions that allow for more than 1 answer

201
Q

Evaluation Plan Framework

A
  1. Organize evaluation process
  2. Procedures for managing & monitoring evaluation
  3. Identify what to evaluate
  4. Formulate questions to be answered
  5. Timeframe for evaluation
  6. Plan for evaluating implementation objectives (process)
  7. Plan for evaluating impact objectives
  8. Targeted outcomes (outcome objectives)
202
Q

What should be used to guide data analysis?

A
  • Research/evaluation questions
  • Level of measurement of data
  • is it for research? evaluation?
203
Q

Rigor

A

Confidence findings/results of evaluation are true representation of what occurred as result of program

204
Q

What types of systematic errors can occur with findings from evaluation/research?

A
  1. Sampling
  2. Design
  3. Implementation
  4. Analysis
205
Q

What specific readability tools are there to help with this?

A

SMOG & Flesch-Kincaid

206
Q

Why must evidence be interpreted in regard to health programs?

A

Allows for determination of significance & drawing relevant inferences to plan future programs/interventions

207
Q

What is included in evidence-based practice approach?

A
  • Combination of best available scientific evidence & data
  • Program planning frameworks
  • Community is engaged
  • Programmatic evaluation
  • Disseminated results
208
Q

Nominal/Dichotomous measurement & give an example

A

Cannot be ordered hierarchically but are mutually exclusive

  • Male/Female
  • Yes/No
209
Q

Predictive Validity

A

Assesses degree measure predicts criterion measure assessed at later time

210
Q

Threats to Construct Validity

A
  1. Inadequate explanation of constructs
  2. Construct confounding
  3. Mono-operation bias
  4. Mono-method bias
  5. Confounding constructs with levels of constructs
211
Q

Efficacy

A

Maximum potential effect under ideal circumstances

212
Q

Expectancy Effect

A

Occurs when researcher’s expectations influence results

213
Q

Why is conducting meta-analysis important when synthesizing data?

A

Combination of results to answer research hypotheses

214
Q

What is the goal of CUA?

A

Determine which program produces the most benefit at the lower cost

215
Q

What are confounding variables?

A

Extraneous variables or factors outside scope of intervention that can impact results

216
Q

Face Validity

A

Common acceptance or belief that measure actually measures what it is supposed to measure

  • Expert decides if scale “appears” to measure construct
217
Q

Criterion Validity

A

Degree to which a measure correlates with an outcome (criterion)