AOR 4: Eval and Research Flashcards
dissemination
the process of communicating procedures, findings, or lessons learned from an evaluation to relevant audiences in a timely, unbiased, and consistent fashion
true experimental designs
include manipulation of at least one independent variable and the research participants are randomly assigned to either the experimental or control group arms of the trial
decision-making components
based on four components (context, input, process, and product) designed to provide the user with information with which to make decisions
outcome evaluation
focused on the ultimate goal, product, or policy and is often measured in terms of health status, morbidity, and mortality
convenience sampling
selection of individuals or groups who are available
descriptive statistics
show what the data reveal, as well as provide simple summaries of the sample's measures
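A minimal sketch of descriptive statistics using Python's standard library; the survey scores and the 75-point cutoff are hypothetical:

```python
from statistics import mean, median, stdev

scores = [72, 85, 90, 68, 77, 95, 88, 73]  # hypothetical survey scores

n = len(scores)                 # raw number
avg = mean(scores)              # central tendency
med = median(scores)
spread = stdev(scores)          # variability
pct_passing = 100 * sum(s >= 75 for s in scores) / n  # a simple percentage

print(f"n={n}, mean={avg:.1f}, median={med:.1f}, "
      f"sd={spread:.1f}, %>=75: {pct_passing:.1f}%")
```

These raw numbers, averages, and percentages summarize the sample without making inferences about a larger population.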
validity
the extent to which an instrument measures what it is intended to measure
descriptive analysis
designed to describe phenomena specific to a population using descriptive statistics such as raw numbers, percentages, and ratios (exploratory)
quasi-experimental designs
include manipulation of at least one independent variable and they may contain a comparison group; however, due to ethical or practical reasons, random assignment of participants does not occur
test-retest reliability
evidence of stability over time
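A minimal sketch of test-retest reliability, assuming hypothetical scores from the same respondents at two administrations; a higher Pearson r indicates a more stable measure over time:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

time1 = [10, 14, 12, 18, 16]  # hypothetical scores at administration 1
time2 = [11, 13, 12, 19, 15]  # same respondents, two weeks later

print(f"test-retest r = {pearson_r(time1, time2):.2f}")
```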
nonexperimental designs
cross-sectional in nature and do not include manipulation of any kind
process questions
help the evaluator understand phenomena, such as internal and external forces that affect program activities
purposive sampling
researcher makes judgments about who to include in the sample based on study needs
stratified multistage cluster sampling
in several steps, a variable of interest is used to split the population into groups, and clusters are then randomly selected from within those groups
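A minimal sketch of the multistage logic, with hypothetical strata (region) and clusters (schools): stage 1 splits on a variable of interest, stage 2 randomly selects a cluster within each stratum, and stage 3 randomly samples individuals from the chosen clusters:

```python
import random

random.seed(42)  # for reproducibility

# population organized as stratum -> cluster -> member IDs (all hypothetical)
population = {
    "urban": {"school_A": list(range(0, 30)), "school_B": list(range(30, 60))},
    "rural": {"school_C": list(range(60, 90)), "school_D": list(range(90, 120))},
}

sample = []
for stratum, clusters in population.items():
    chosen = random.choice(list(clusters))      # stage 2: pick a cluster
    sample.extend(random.sample(clusters[chosen], 5))  # stage 3: sample members

print(f"sample size = {len(sample)}")
```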
impact evaluations
immediate and observable effects of a program leading to the desired outcomes
propriety
behave legally, ethically, and with due regard for the welfare of those involved and those affected
research
organized process in which a researcher uses the scientific method to generate new knowledge
list of factors that affect program decisions
-political environment
-cultural barriers
-funding limitations
-shifting and variable leadership priorities
steps in evaluation practice
- engage stakeholders
- describe the program
- focus the evaluation design
- gather credible evidence
- justify conclusions
- ensure use and share lessons learned
unit of analysis
what or who is being studied or evaluated
utilization-focused
evaluation accomplished for and with the specific intended users of its findings
criterion validity
refers to a measure’s correlation to another measure of a variable
steps involved in qualitative data analysis
- data reduction (selecting, transforming, focusing, and condensing data)
- data display (creating an organized way of arranging data through a diagram or chart)
- conclusion drawing and verification (data is revisited multiple times to verify, test, or confirm patterns and themes)
stratified random sampling
the population is split into groups (strata) based on a variable of interest, and an equal number of potential participants is randomly selected from each group
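A minimal sketch of stratified random sampling, splitting on a hypothetical variable of interest (sex) and drawing an equal number at random from each stratum:

```python
import random

random.seed(7)  # for reproducibility

# strata keyed by the variable of interest; member IDs are hypothetical
strata = {
    "female": [f"F{i}" for i in range(50)],
    "male": [f"M{i}" for i in range(50)],
}

per_stratum = 5  # equal number drawn from each group
sample = {group: random.sample(members, per_stratum)
          for group, members in strata.items()}

for group, members in sample.items():
    print(group, members)
```

Equal draws per stratum guarantee each group is represented, even when the groups differ in size.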