midterm Flashcards
Criteria + evidence + judgment
Trilogy of evaluation
Research question + evidence + interpretation
Trilogy of research
Systematic collection & analysis of data to address criteria in order to make judgments about the worth/improvement of something
Evaluation
Systematic investigation w/in some discipline undertaken to establish facts & principles to contribute to a body of knowledge
Research
Focuses on mission achievement/product delivery; leads to specific decisions (utilization focused); determines worth/social utility
Evaluation differences
Focuses on developing new knowledge/evaluating experimental treatments; leads to generalizable conclusions; explanatory/predictive
Research differences
Follow heart, trust gut
Intuitive judgment
Using external experts or a set of standards
Professional judgment
To evaluate whether the goals (objectives) were achieved
Goal-attainment model
A picture of how program resources end up as results
Logic model
To discover & judge actual effects, outcomes, or impacts w/o preordained idea about what to find
Goal-free model
To establish a working understanding of an organization & whether it's capable of achieving end products; can examine every aspect of the org's components & the org as a whole
Systems approach
Inputs (program investments) [what we invest] ➡️ outputs (activities & participation) [what we do & who we reach] ➡️ outcomes (short, medium, long-term) [what results, what is the value]
Logic model components
Someone who evaluates some aspect of the 5 P's & is a member of the organization
Internal evaluator
Someone such as a consultant who evaluates from outside & isn’t employed full-time by the organization
External evaluator
Knows the organization; accessible to colleagues; realistic recommendations; can make changes
Pros of internal evaluations
Pressure to have positive results; difficult to criticize; may lack training
Cons of internal evaluations
More objectivity; competence; experience; more resources; less pressure to compromise; lower costs
Pros of external evaluations
Threat to employees; must get to know organization; may impose values; expensive
Cons of external evaluations
Personnel (performance appraisal/evaluation), policies/admin, place (area/facilities), program (qualities & improvement), participant outcomes
5 P’s of evaluation
Be realistic about the project's value & limitations; assure anonymity* or confidentiality*; never force people to participate; use written consent*; do no harm; provide the option for participants to know the results
Ethical issues of evaluation
Researcher/evaluator CANNOT identify a given response w/ any given respondent
Anonymity
Researcher/evaluator CAN identify a given person's responses but promises not to do so publicly
Confidentiality
Participants in study must base their voluntary participation on an understanding of the possible risks involved
Informed consent
Consistency of your measurement instrument; degree to which an instrument measures the same way each time it is used under the same conditions w/ the same subjects
Reliability
Whether or not an instrument measures what it's designed to measure (internal: content & concurrent; external: predictive)
Validity
Do questions look like they measure what they’re supposed to? (Internal)
Content (face) validity
Do you think this measure correlates strongly w/ something that it logically should? (Internal)
Concurrent validity
Do you imagine that this measure would predict something that it logically should? (External)
Predictive validity
Data in the form of #'s; follows standard procedures of rigor related to the instruments used & statistical data analysis; deductive reasoning*
Quantitative data
Data consists of words rather than #'s; concerned w/ relevance to context or situation; inductive reasoning*
Qualitative data
Random assignment of subjects to different groups
True-experimental designs
NO random assignment of subjects to different groups
Quasi-experimental designs
Probability of selecting members of the population is known; simple random, systematic, stratified random, cluster
Probability sampling
Every member of a population has an equal chance of being selected
Simple random sampling
Selecting every kth member from a list of the population; used when a list of the population is readily available
Systematic sampling
The population is divided into sub populations (strata)
Stratified random sampling
Researcher randomly selects clusters (groups), then samples members from the selected clusters
Cluster sampling
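The four probability sampling methods above can be sketched in code. A minimal Python illustration (the population, sample sizes, and strata are all made up for the example):

```python
import random

# Illustrative population of 100 labeled members (an assumption for the demo).
population = [f"person_{i}" for i in range(100)]

# Simple random: every member has an equal chance of selection.
simple = random.sample(population, 10)

# Systematic: random start, then every k-th member of the list.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified random: divide the population into strata, sample within each.
strata = {"even": population[0::2], "odd": population[1::2]}
stratified = [m for group in strata.values() for m in random.sample(group, 5)]

# Cluster: randomly select whole clusters, then include all their members.
clusters = [population[i:i + 10] for i in range(0, 100, 10)]
cluster_sample = [m for c in random.sample(clusters, 2) for m in c]
```

Note that in stratified sampling the researcher samples from every stratum, while in cluster sampling only the randomly chosen clusters contribute members.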
Probability of selecting each member of the population is unknown; samples are selected in a way that's not suggested by probability theory; purposive, convenience, quota, expert, snowball*
Non-probability sampling
Selecting a sample based on the knowledge of a population related to expertise & the purpose of the study
Purposive (judgmental) sampling
Selecting whoever is readily available ("what's there is there")
Convenience sampling
Units are selected into a sample on the basis of certain characteristics
Quota sampling
Also known as judgment sampling; like purposive sampling but assumes some individuals have special prior knowledge/expertise
Expert sampling
Researcher collects data on the members of a target population who can be located, then asks them to refer other members
Snowball sampling
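The referral chain behind snowball sampling can be sketched as a simple traversal. A hedged Python illustration, where the referral network is entirely hypothetical:

```python
# Hypothetical referral network: each located member names others they know.
referrals = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": [],
    "dave": ["alice"],
}

def snowball(seed, referrals):
    """Start from one located member; follow referrals until exhausted."""
    sample, frontier = [], [seed]
    while frontier:
        person = frontier.pop()
        if person not in sample:
            sample.append(person)
            frontier.extend(referrals.get(person, []))
    return sample

snowball("alice", referrals)  # every member reachable from alice
```

The sample "snowballs" outward from the initial contacts, which is why the method is useful for hard-to-locate populations but yields a non-probability sample.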
Not focused on the numbers of respondents but the contribution each person makes to address the evaluation research purposes
Theoretical sampling
Standards by which something is evaluated/studied
Criteria
Data; piece of info collected/analyzed to determine whether criteria are met
Evidence
Interpretation of the value of something based on evidence collected from predetermined criteria
Judgment
From a pattern that might be logically expected to observations that test whether the pattern occurs
Deductive reasoning
From observations to the discovery of a pattern among the given events
Inductive reasoning