midterm Flashcards

1
Q

Criteria + evidence + judgment

A

Trilogy of evaluation

2
Q

Research question + evidence + interpretation

A

Trilogy of research

3
Q

Systematic collection & analysis of data to address criteria to make judgments about the worth/improvement of something

A

Evaluation

4
Q

Systematic investigation w/in some discipline undertaken to establish facts & principles to contribute to a body of knowledge

A

Research

5
Q

Focuses on mission achievement/product delivery; leads to specific decisions (utilization focused); determining worth/social utility

A

Evaluation differences

6
Q

Focuses on developing new knowledge/evaluating experimental treatments; leads to generalizable conclusions; explanatory/predictive

A

Research differences

7
Q

Follow heart, trust gut

A

Intuitive judgment

8
Q

Using external experts or set of standards

A

Professional judgment

9
Q

To evaluate whether the goals (objectives) were achieved

A

Goal-attainment model

10
Q

A picture of how program resources end up as results

A

Logic model

11
Q

To discover & judge actual effects, outcomes, or impacts w/o preordained idea about what to find

A

Goal-free model

12
Q

To establish a working understanding of an organization & whether it's capable of achieving its end products; may examine every aspect of the org's components & the org as a whole

A

Systems approach

13
Q

Inputs (program investments) [what we invest] ➡️ outputs (activities & participation) [what we do & who we reach] ➡️ outcomes (short, medium, long-term) [what results, what is the value]

A

Logic model components

14
Q

Someone who evaluates some aspect of the 5 P’s who is a member of the organization

A

Internal evaluator

15
Q

Someone such as a consultant who evaluates from outside & isn’t employed full-time by the organization

A

External evaluator

16
Q

Knows the organization; accessible to colleagues; realistic recommendations; can make changes

A

Pros of internal evaluations

17
Q

Pressure to have positive results; difficult to criticize; may lack training

A

Cons of internal evaluations

18
Q

More objectivity; competence; experience; more resources; less pressure to compromise; lower costs

A

Pros of external evaluations

19
Q

Threat to employees; must get to know organization; may impose values; expensive

A

Cons of external evaluations

20
Q

Personnel (performance appraisal/evaluation), policies/admin, place (area/facilities), program (qualities & improvement), participant outcomes

A

5 P’s of evaluation

21
Q

Be realistic about the project's value & limitations; assure anonymity* or confidentiality*; never force people to participate; use written consent*; do no harm; provide the option for participants to know the results

A

Ethical issues of evaluation

22
Q

Researcher/evaluator CANNOT identify a given response w/ any given respondent

A

Anonymity

23
Q

Researcher/evaluator CAN identify a given person's responses but promises not to do so publicly

A

Confidentiality

24
Q

Participants in study must base their voluntary participation on an understanding of the possible risks involved

A

Informed consent

25
Q

Consistency of your measurement instrument; degree to which an instrument measures same way each time used under same condition w/ same subjects

A

Reliability
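
The consistency idea above is often checked w/ a test-retest approach: administer the same instrument twice to the same subjects under the same conditions, then correlate the two score sets. A minimal Python sketch (the helper name and scores below are hypothetical, for illustration only):

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation between two score lists -- a common way to
    quantify test-retest reliability (illustrative helper, not from the cards)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores from the same 5 subjects on two administrations.
time1 = [10, 12, 14, 16, 18]
time2 = [11, 12, 15, 15, 19]
print(round(pearson_r(time1, time2), 2))
```

A value near 1.0 suggests the instrument measures the same way each time; a low value signals inconsistency.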

26
Q

Whether or not an instrument measures what it’s designed to measure (internal: content & concurrent, external: predictive)

A

Validity

27
Q

Do questions look like they measure what they’re supposed to? (Internal)

A

Content (face) validity

28
Q

Do you think this measure correlates strongly w/ something that it logically should? (Internal)

A

Concurrent validity

29
Q

Do you imagine that this measure would predict something that it logically should? (External)

A

Predictive validity

30
Q

Data in the form of #'s; follows standard procedures of rigor related to the instruments used & statistical data analysis; deductive reasoning*

A

Quantitative data

31
Q

Data consists of words rather than #'s; concerned w/ relevance to the context or situation; inductive reasoning*

A

Qualitative data

32
Q

Random assignment of subjects to different groups

A

True-experimental designs
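
Random assignment, the defining feature of a true-experimental design, can be sketched in a few lines of Python (the helper name, seed, and group count are illustrative assumptions, not from the cards):

```python
import random

def randomly_assign(subjects, n_groups=2, seed=42):
    """Shuffle the subject pool, then deal subjects round-robin into
    n_groups equally sized groups (illustrative helper)."""
    rng = random.Random(seed)
    pool = list(subjects)
    rng.shuffle(pool)  # every subject has an equal chance of any slot
    return [pool[i::n_groups] for i in range(n_groups)]

treatment, control = randomly_assign(range(20), n_groups=2)
print(len(treatment), len(control))  # → 10 10
```

A quasi-experimental design would skip the shuffle and use pre-existing groups (e.g., intact classrooms) instead.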

33
Q

NO random assignment of subjects to different groups

A

Quasi-experimental designs

34
Q

Probability of selecting members of the population is known; simple random, systematic, stratified random, cluster

A

Probability sampling

35
Q

Every member of a population has an equal chance of being selected

A

Simple random sampling

36
Q

When a list of a population is readily available

A

Systematic sampling

37
Q

The population is divided into sub populations (strata)

A

Stratified random sampling

38
Q

Researcher randomly selects clusters

A

Cluster sampling
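
The four probability sampling methods (simple random, systematic, stratified random, cluster) can be sketched w/ Python's random module; the toy population, strata, and cluster sizes below are illustrative assumptions:

```python
import random

rng = random.Random(0)
population = list(range(100))  # toy population of 100 member IDs

# Simple random: every member has an equal chance of being selected.
simple = rng.sample(population, 10)

# Systematic: random start, then every k-th member of the list.
k = len(population) // 10
start = rng.randrange(k)
systematic = population[start::k]

# Stratified random: divide into sub-populations (strata), sample each.
strata = {"low": population[:50], "high": population[50:]}
stratified = [m for s in strata.values() for m in rng.sample(s, 5)]

# Cluster: randomly select whole clusters, keep every member in them.
clusters = [population[i:i + 10] for i in range(0, 100, 10)]
cluster_sample = [m for c in rng.sample(clusters, 2) for m in c]

print(len(simple), len(systematic), len(stratified), len(cluster_sample))  # → 10 10 10 20
```

In each case the selection probability of every population member is known in advance, which is what makes these probability samples.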

39
Q

Probability of selecting each member of the population is unknown; samples are selected in a way that's not suggested by probability theory; purposive, convenience, quota, expert, snowball*

A

Non-probability sampling

40
Q

Selecting a sample based on the knowledge of a population related to expertise & the purpose of the study

A

Purposive (judgmental) sampling

41
Q

What’s there is there

A

Convenience sampling

42
Q

Units are selected into a sample on the basis of certain characteristics

A

Quota sampling

43
Q

Also known as judgment sampling; like purposive sampling, but selects those assumed to have expert/prior knowledge of the topic

A

Expert sampling

44
Q

Researcher collects data on the few members of the target population who can be located, then asks them to refer others

A

Snowball sampling

45
Q

Not focused on the numbers of respondents but the contribution each person makes to address the evaluation research purposes

A

Theoretical sampling

46
Q

Standards by which something is evaluated/studied

A

Criteria

47
Q

Data; piece of info collected/analyzed to determine whether criteria are met

A

Evidence

48
Q

Interpretation of the value of something based on evidence collected from predetermined criteria

A

Judgment

49
Q

From a pattern that might be logically expected to observations that test whether the pattern occurs

A

Deductive reasoning

50
Q

From observations to the discovery of a pattern among all given events

A

Inductive reasoning