AOR 4 Evaluation & Research Flashcards

1
Q

Both program evaluations and research require designing plans that include ____________________

A

selecting, adapting, or creating valid & reliable data collection instruments; developing sampling plans; collecting & managing data; analyzing collected data; interpreting results; applying findings; & communicating results

2
Q

What is the primary focus of evaluations? Research?

A

Evaluations - determining if program objectives have been met/success of program

Research - identifying new knowledge or practices that answer questions/test hypotheses about health-related theory, behavior, or phenomenon

3
Q

Delimitations & give examples

A

decisions made by the evaluator or researcher to identify the parameters & boundaries set for a study (helps manage the study's scope)

  • why some literature is not reviewed
  • why populations are not studied
  • why certain methods are not used
4
Q

Evaluation

A

series of steps that evaluators use to assess a process or program to provide evidence and feedback about the program

5
Q

Limitations

A

phenomena evaluator/researcher cannot control that place restrictions on methods & conclusions

  • e.g. time, nature of data collection, instruments, sample, analysis
6
Q

Logic Model

A

depicts aspects of a program, including inputs, outputs, & outcomes

  • provides a scaled-down, somewhat linear, visual depiction of a program
7
Q

Research

A

organized process in which a researcher uses the scientific method to generate new knowledge

8
Q

Reliability

A

Consistency, dependability, & stability of measurement process
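
Reliability is often quantified as internal consistency, e.g. with Cronbach's alpha (the card itself names no statistic, so this is an illustrative pure-Python sketch with hypothetical helper names):

```python
def variance(xs):
    # population variance of a list of scores
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    # items: one list of scores per survey item, all over the same respondents;
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return (k / (k - 1)) * (1 - sum(variance(i) for i in items) / variance(totals))
```

Values near 1 mean the items move together (a consistent, stable measurement process); perfectly duplicated items give alpha = 1.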

9
Q

Validity

A

degree to which a test/assessment measures what it is intended to measure

10
Q

Unit of Analysis

A

what or who is being studied or evaluated

11
Q

Formative Evaluation

A

Process of assessing quality of a program during planning and implementation

  • used by evaluators & researchers
12
Q

Process Evaluation

A

measures quality of performance and delivery throughout program implementation

13
Q

Summative Evaluation

A

evaluation that occurs after a program has ended

  • Designed to produce data and information on the program’s efficacy or effectiveness during its implementation phase
  • Provides data on the extent of the achievement of goals (impact & outcome)
14
Q

Impact Evaluation

A

Immediate & observable effects of a program leading to desired outcomes

15
Q

Outcome Evaluation

A

assesses whether there is also a demonstrable change in the targeted health outcome

  • What changed about the public health problem
  • Focuses on ultimate goal, product, or policy
  • measured in health status, morbidity, mortality, etc.
16
Q

Choosing what instrument to use to collect data is based on ______________

A

goal of data collection, population under investigation, & resources of those trying to collect data

  • data collection instruments have both advantages & disadvantages
17
Q

HES must follow _________________ when planning & conducting evaluation

A

federal & state laws and regulations, organizational & institutional policies, & professional standards

18
Q

Data instruments (whether new, existing, adapted) should be tested for ________________

A

literacy/reading level

19
Q

Why should data collecting instruments be tested?

A

To ensure validity of responses

20
Q

Examples of instrument readability tools

A

SMOG and Flesch-Kincaid
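
Both tools reduce to published formulas over simple text counts; a minimal sketch (the raw counts are supplied by the caller, since syllable counting is the hard part in practice):

```python
import math

def flesch_kincaid_grade(words, sentences, syllables):
    # Flesch-Kincaid grade level = 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def smog_grade(polysyllable_words, sentences):
    # SMOG grade = 1.0430 * sqrt(polysyllables * 30/sentences) + 3.1291,
    # where polysyllables = count of words with 3+ syllables
    return 1.0430 * math.sqrt(polysyllable_words * (30 / sentences)) + 3.1291
```

Both return an approximate U.S. school grade level, which can be checked against the reading level of the intended population before fielding an instrument.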

21
Q

Advantages and Disadvantages of using existing data collection instruments

A

Advantages: previously tested for reliability, validity, direct comparison measures, reduced cost (compared to creating new instrument), & user familiarity

Disadvantage: potential for unreliable measures given different population demographics & situations

22
Q

What should HES consider before using an existing data collection instrument?

A
  • if item is appropriate for intended purpose
  • if language appropriate for population
  • whether test has been performed using sample from intended population
  • to whom you should give credit for using instrument
23
Q

Difference between research and evaluation

A

Research: conducted with intent to generalize findings

Evaluation: determine if a specific program was effective

24
Q

The most rigorous evaluation or research design available should be used. However, what are things that could require a less rigorous approach?

A

Ethics, cost, politics, & resources

25
Q

Considerations of implementation

A
  • find reliable, trustworthy, skilled people to COLLECT, ENTER, ANALYZE, & MANAGE data to ensure quality results
  • define roles, responsibilities, & skills needed to collect QUALITATIVE data (focus groups, interviews, etc.), which may differ from those needed to collect QUANTITATIVE data (surveys)
  • monitor data collection to ensure implementation of process will assist in maintaining established time frames & objectives
  • maintain integrity of data collected & ensure protocols address quality control measures (during both collection and entering of data)
26
Q

Field procedures for data collection

A
  • protocols for scheduling initial contacts with respondents
  • Introducing instrument to respondent
  • keeping track of individuals contacted
  • following up with non-respondents when appropriate
27
Q

All data must be carefully ____________ into a usable format

A

coded & organized

28
Q

Research and Evaluation can help record ______________ & ____________

A

what changes have occurred & identify what led to those changes

29
Q

Advantages & Disadvantages of online surveys

A

Advantages: cost-effective, versatile, larger potential reach, convenient, anonymity for respondents, & significantly reduced risk of data entry errors during transcription

Disadvantages: lower response rate, language barriers, not feasible for all populations (lack of internet access), & risk of multiple responses from the same respondent

30
Q

What should data management/analysis plan include?

A
  • procedures for transferring data from instruments to data analysis software
  • detail how data will be scored & coded
  • how missing data will be managed
  • how outliers will be handled
  • scoring guide
  • Data Screening - assessing accuracy of data entry
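
The coding, missing-data, and screening steps above can be sketched in a few lines (a hypothetical helper, not from the source; outliers are flagged here with a simple z-score cutoff):

```python
def screen(raw, code_map, z_cut=3.0):
    # Code raw answers numerically, then flag missing values and outliers.
    coded = [code_map.get(r) for r in raw]            # scoring/coding step
    missing = [i for i, v in enumerate(coded) if v is None]
    present = [v for v in coded if v is not None]
    mean = sum(present) / len(present)
    sd = (sum((v - mean) ** 2 for v in present) / len(present)) ** 0.5
    outliers = [i for i, v in enumerate(coded)
                if v is not None and sd > 0 and abs(v - mean) / sd > z_cut]
    return coded, missing, outliers
```

A real plan would also specify *how* the flagged cases are then handled (imputation or deletion for missing values, retention or removal for outliers), per the bullets above.
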
31
Q

What can data screening tell us?

A

If statistical assumptions are met

32
Q

Problematic Outliers

A

not representative of population

33
Q

Beneficial Outliers

A

those representative of population

34
Q

Multivariate Outliers

A

unusual combinations of scores on different variables

  • hard to detect without statistical tests
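
One such statistical test uses the Mahalanobis distance, which measures how unusual a *combination* of scores is relative to the joint distribution; a two-variable pure-Python sketch (illustrative, not from the source):

```python
def mahalanobis_sq_2d(points):
    # Squared Mahalanobis distance of each (x, y) point from the sample mean,
    # using the inverse of the 2x2 sample covariance matrix.
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    det = sxx * syy - sxy ** 2
    return [((x - mx) ** 2 * syy
             - 2 * (x - mx) * (y - my) * sxy
             + (y - my) ** 2 * sxx) / det
            for x, y in points]
```

Example: with points (1,1), (2,2), (3,3), (4,4) plus (3,1), the point (3,1) is within range on each variable separately yet gets the largest distance, because it breaks the x-y correlation — exactly the kind of outlier univariate checks miss.
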
35
Q

Missing Data

A

observations that were intended to be made but were not

36
Q

What should guide data analysis?

A

research/evaluation questions & level of measurement of data

37
Q

Correlates (relationship/connection where something affects or is dependent on another) can be derived through ______________________

A

interpretation of data, reach & effectiveness or size of effect

38
Q

HES need to incorporate ___________________ to their evaluation & research findings & __________________ into decision making, policy development, & implementation of programs

A

evidence based practice approach; scientific evidence

39
Q

What does evidence-based practice approach look like?

A
  • best available scientific evidence (from literature) & data are combined
  • program planning frameworks are employed
  • engage the community
  • programmatic evaluation is used
  • results are disseminated
40
Q

How do HES plan for future programs/interventions?

A

interpretation of evidence to determine significance & draw relevant inferences

  • data interpretation is stronger when including key stakeholders and getting community perspective during evaluation process
41
Q

Findings from a study must be analyzed based on specific ________________ of the study

A

Delimitations

e.g., narrowing a study by geographic location, time, & population traits

42
Q

Findings from evaluation & research are subject to systematic error known as ____________

A

BIAS - from sampling, design, implementation, or analysis

43
Q

Confounding Variables

A

extraneous variables outside scope of intervention that can impact the results

  • variables that are not accounted for in the study design
44
Q

What should be done before applying recommendations to program or policies?

A

stakeholders should have the opportunity to review & discuss the research or evaluation findings