R - done Flashcards

0
Q

Post positivism paradigm -
Definition?
Associated w which type of research?

A

Truth can only be approximated because of error in measurement.
More prevalent in quantitative research.

1
Q

Positivism paradigm -
Definition?
Associated w which type of research?

A

Objective truth exists and can be understood only through what is directly measurable.
Tied to quantitative research.

2
Q

Constructivism paradigm -
AKA?
Definition?
Associated w which type of research?

A

Interpretivism.
There are multiple realities or perspectives for any given situation.
Qualitative research.

3
Q

Critical/ideological paradigm -
Definition?
Associated w which type of research?

A
  • Researchers take a proactive role and
    confront social structure affecting oppressed groups.
  • Qualitative research.
4
Q

Ethics in research include? (4)

A
  • Informed consent w right to decline.
  • Risks as well as benefits.
  • Human studies review board.
  • Debriefing, especially if deception was used.
5
Q

Nazi medical war crimes -

Ethics violated?

A

Deceived, exploited, and tortured prisoners in the name of research.

6
Q

Milgram study -
Studied what?
Ethics violated?

A
  • Milgram obedience study.
  • Study subjects shocked learners when they answered incorrectly; 65% of subjects administered the maximum shock.
  • Participants were deceived, emotionally harmed, and not debriefed.
7
Q

Tuskegee study -
Studied what?
Ethics violated?

A
  • Tuskegee syphilis study.
  • Deceived participants, not telling them their correct Dx or that effective Txt of penicillin was available when it came out.
8
Q

Jewish Chronic Disease Hospital study -
Studied what?
Ethics violated?

A

Subjects & controls were injected w live cancer cells, but not informed.

9
Q

Willowbrook study -
Studied what?
Ethics violated?

A
  • Kids at a school for the mentally disabled were injected w hepatitis.
  • Parents who wanted to enroll their kids signed consent, but were never told they could refuse or about effects.
10
Q

Legal standards for research? (2)

A
  • Use of human studies review board.
  • HIPAA protects private health information.

11
Q

Human studies review board?

A

An IRB must be used by all federally funded institutions that do research w humans,
and for any human research conducted by such institutions, even if all of it isn’t federally funded.

12
Q

45 CFR 46?

A

Code of federal regulations, title 45, part 46, contains policies to guide researchers using human subjects, including use of an institutional review board.

13
Q

Independent variable?

A

Construct that is manipulated or controlled in some way.

14
Q

Dependent variable?

A

The outcome variable that is checked for influence by the independent variable.

15
Q

Extraneous variables?

A

Other variables, besides the independent variable, that could affect the dependent variable.

16
Q

Confounding variable?

A

An extraneous variable that the experimenter has not controlled for and that affects the dependent variable.

17
Q

Descriptive vs relational vs causal research Qs?

A
  • Descriptive - examine what exists - counts, averages, descriptive stats
  • Relational - relationship between variables, correlations
  • Causal - cause-effect relationships
18
Q

Research hypothesis?

A

A testable concise statement involving the expected relationship between 2 or more variables.

19
Q

Directional vs nondirectional hypotheses?

A

Directional - indicates direction of relationship, eg positive correlation.
Nondirectional - doesn’t indicate direction of relationship

20
Q

Null hypothesis?

A

There is no relationship between IV & DV.

21
Q

Alternative hypothesis?

A
  • used to identify extraneous variables, developed to be eliminated.
  • the experimental hypothesis; there is a relationship between IV and DV.
22
Q

Significance level -
AKA?
Definition?

A

AKA the alpha value.
The threshold for rejecting the null hypothesis: results are significant when the p value is at or below alpha.

23
Q

P value?

A
  • A p value is the likelihood of obtaining a false positive result for the experimental hypothesis.
  • A p value at or below the chosen alpha (commonly .05 or .01) indicates significant results.
  • Default is a two-tailed test: when alpha = .05, there is a .025 cutoff region at each tail.
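The two-tailed cutoff logic can be sketched in Python — an illustrative example, not from the original cards, using only the standard library (the 1.96 cutoff for alpha = .05 is the standard normal value):

```python
from math import erf, sqrt

def normal_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

def two_tailed_p(z):
    # Probability of a |Z| at least this extreme under the null.
    return 2 * (1 - normal_cdf(abs(z)))

alpha = 0.05
# z = 1.96 sits at the .025 cutoff in each tail, so p is about alpha.
p = two_tailed_p(1.96)
significant = p <= alpha
```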
24
Q

Type I error?

A
  • Rejecting the null when it’s true, or
  • A false positive result for the experimental hypothesis.
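A Monte Carlo sketch of the Type I error rate (a hypothetical simulation, assuming only Python's standard library): when the null is actually true, a test at alpha = .05 should reject about 5% of the time.

```python
import random

random.seed(0)
Z_CRIT = 1.96  # two-tailed cutoff for alpha = .05
N, TRIALS = 30, 2000

false_positives = 0
for _ in range(TRIALS):
    # Sample from a null world where the true mean really is 0.
    sample = [random.gauss(0, 1) for _ in range(N)]
    mean = sum(sample) / N
    z = mean / (1 / N ** 0.5)  # known sigma = 1
    if abs(z) > Z_CRIT:
        false_positives += 1  # rejected a true null: a Type I error

type1_rate = false_positives / TRIALS
```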

25
Q

Type II error?

A
  • Accepting the null hypothesis when it’s false, or
  • Rejecting the experimental hypothesis when it’s true.

26
Q

Alpha?

A

The probability of a type I error.

27
Q

Beta?

A

The probability of a type II error.

28
Q

Power?

A
  • The likelihood of detecting a significant relationship between variables when there is one.
  • Power is avoiding a type II error.
29
Q

How can power be increased? (6)

A
  • increase alpha
  • increase sample size
  • increase effect size
  • minimize error
  • use a one-tailed test
  • use a parametric statistic
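Two of these levers — sample size and effect size — show up directly in the analytic power of a simple one-tailed z test. A sketch under assumed values (a hypothetical Cohen's d of .5, known sigma):

```python
from math import erf, sqrt

def normal_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

def power_one_tailed(effect_size, n, z_alpha=1.645):
    # Power of a one-sample, one-tailed z test with known sigma:
    # probability the test statistic clears the critical value.
    return normal_cdf(effect_size * sqrt(n) - z_alpha)

d = 0.5  # assumed medium effect size
power_small = power_one_tailed(d, 20)
power_large = power_one_tailed(d, 80)  # larger n, higher power
```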
30
Q

Probability vs nonprobability sampling?

A
  • Probability sampling - all persons in a known population have a chance of being selected, therefore more likely to reflect the whole population.
  • Non-probability sampling - accessing samples of convenience.
31
Q

Probability sampling methods? (5)

A
  • simple random
  • systematic
  • stratified random
  • cluster
  • multistage
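Three of these methods can be sketched with Python's random module (a hypothetical population of 100 member IDs; illustrative only):

```python
import random

random.seed(1)
population = list(range(1, 101))  # hypothetical population of 100 IDs

# Simple random: every member has an equal chance of selection.
simple = random.sample(population, 10)

# Systematic: every nth element after a random start.
n = 10
start = random.randrange(n)
systematic = population[start::n]

# Stratified random: random draws within each subgroup.
strata = {"A": population[:50], "B": population[50:]}
stratified = [m for group in strata.values() for m in random.sample(group, 5)]
```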
32
Q

Non-probability sampling methods (3)?

A
  • Convenience
  • Purposeful
  • Quota
33
Q

Simple random sampling -
Type?
Definition?

A
  • Probability sampling method.
  • Every member of the population has an equal chance of being selected.

34
Q

Systematic sampling -
Type?
Definition?

A
  • Probability sampling method.
  • Every nth element is chosen.

35
Q

Stratified random sampling -
Type?
Definition?

A
  • Probability sampling method.
  • A pop is divided into subgroups, eg by gender or race, and samples are drawn randomly from the subgroups.

36
Q

Cluster sampling -
Type?
Definition?
Caveat?

A
  • Probability sampling method.
  • Identify/list existing groups
  • Take a random sample of groups
  • No sampling of subjects within groups
  • Less representative than other methods.
37
Q

Multi-stage sampling -
Type?
Definition & examples?

A
  • Probability sampling method.
  • Common in cluster sampling.
  • Might include a 2-stage random sample (eg randomly select 60 schools & then 10 classes from each school).
  • Might include a 3-stage random sample (eg randomly select 200 school districts, then 20 schools from each district, & then 10 classes from each school). And so forth.
38
Q
Convenience sampling -
Type?
Definition? (2)
Caveat?
Example?
A
  • Non probability sampling.
  • An easily accessible population.
  • Most common method.
  • Most likely doesn’t fully represent the population of interest.
  • Eg survey clients willing to participate.
39
Q

Purposeful sampling?

A
  • Non probability sampling.
  • Select a sample from a population which will be most informative about a topic of interest.
  • Participants are selected because they represent needed characteristics
40
Q

Quota sampling?

A
  • Non probability sampling.
  • Similar to cluster & stratified, but no randomization.
  • Draw the needed # of participants with the needed characteristics (eg race, gender) from the convenience sample.
41
Q

Randomization -
Purpose?
2 Types?

A
  • Helps to maximize the credibility & generalizability of a study’s findings.
  • random assignment
  • random selection.
42
Q

Random selection -
Definition?
Related to what type of validity?

A
  • Every member of the population has an equal chance of being selected.
  • Closely related to external validity.
43
Q

Random assignment
Definition?
Related to what type of validity?

A
  • Randomly assigning participants to different groups. Ensure groups are equal & any systematic differences are due to chance.
  • Closely related to internal validity.
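A minimal sketch of random assignment (hypothetical participant IDs; shuffle then split, so any systematic pre-existing difference between groups is left to chance):

```python
import random

random.seed(2)
participants = [f"P{i}" for i in range(1, 21)]  # hypothetical IDs

# Shuffle, then split into equal halves: every participant is
# equally likely to land in either group.
random.shuffle(participants)
treatment = participants[:10]
control = participants[10:]
```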
44
Q

Experimental vs control groups?

A
  • An experimental or treatment group is receiving active treatment.
  • A control group is a group w similar characteristics that does not receive the experimental treatment.
45
Q

Types of control groups? (3)

A
  • Wait list
  • Placebo
  • Treatment as usual (TAU)
46
Q

Blind study?

A

Participants aren’t aware whether they are getting the experimental treatment.

47
Q

Double Blind study?

Reduces risk of?

A
  • Neither experimenter nor participants know who is getting the experimental treatment.
  • Reduces placebo effect & researcher bias.
48
Q

Placebo effect?

% showing placebo effect?

A
  • Positive effects without treatment.
  • 20-30% of participants may show a placebo effect.

49
Q

Internal validity -
Definition?
What strengthens it? (1)

A
  • Means change in the DV is due to the IV.
  • Control of extraneous variables strengthens it.

50
Q

Some threats to internal validity? (10)

A
History 
Selection 
Statistical regression 
Testing - learning the test
Instrumentation 
Attrition 
Maturation 
Diffusion of treatment - groups talk to each other about their txt
Experimenter effects 
Subject effects
51
Q

Threat to internal validity- History?

A

Extraneous events occurring during the expt, inside or outside the study.

52
Q

Threat to internal validity- Selection?

A

Group differences exist before the intervention due to lack of random assignment, eg a co-occurring variable that affects the DV.

53
Q

Threat to internal validity- Statistical regression? (3)

A
  • Statistical phenomenon of regression toward the mean.
  • Scores of participants who were selected because of their extreme score (eg very depressed) are affected.
  • May look like improvement or worsening.
54
Q

Threat to internal validity- Testing? (4)

A
  • An Issue when pretests are involved.
  • Practice effects
  • memory effects
  • developing familiarity w the test
55
Q

Threat to internal validity- instrumentation?

Examples?

A
  • Changes in the instrument affect results.
  • Eg paper, computer, evaluator.

56
Q

Threat to internal validity- attrition? (2)

A
  • Individuals systematically drop out.
  • Esp a problem for longitudinal studies.

57
Q

Threat to internal validity- maturation? (3)

A
  • Changes in a participant over time affect the DV.
  • Tend to be normal developmental changes.
  • Includes fatigue.
58
Q

Threat to internal validity- diffusion of treatment?

A

Problem when groups have contact & effects of an intervention are felt in another group.

59
Q

Threat to internal validity- experimenter effects?

2 types?

A
  • Bias of the investigator affects participants.
  • Halo effect.
  • Hawthorne effect.
60
Q

Halo effect?

A

The investigator’s subjective perception of one characteristic is generalized to perceptions of other traits.

61
Q

Hawthorne effect -
Definition?
AKA?

A
  • The presence of an investigator influences responses independent of any intervention.
  • AKA reactivity.
62
Q

Threat to internal validity- subject effects?

A

Participants pick up cues (ie demand characteristics).

63
Q

External validity -
Definition?
Investigators must?

A
  • The ability to generalize results to a larger group.
  • Investigators must describe participants, variables, procedures & settings so readers can ascertain generalizability.

64
Q

Threats to external validity? (5)

A
  • Novelty
  • Experimenter
  • Measurement of the DV
  • Measurement by treatment
  • History by treatment effects
65
Q

External validity threat - novelty effect?

A

A txt produces positive results just because it is novel to participants.

66
Q

External validity threat - experimenter effect -
Definition?
2 types?

A
  • Same as for internal threat: Bias of the investigator affects participants.
  • Halo effect.
  • Hawthorne effect.
67
Q

External validity threat - history by txt effect?

A

An experiment is conducted in a time period full of contextual factors that can’t be duplicated.

68
Q

External validity threat - measurement of the DV?

A

Similar to instrumentation threat, the effectiveness of a txt may depend on the type of measurement used.

69
Q

External validity threat - time of measurement by txt effect?

A

Timing of post test may influence post test results.

70
Q

Four main types of research?

A
  • Quantitative
  • Qualitative
  • Mixed method
  • Single subject
71
Q

Quantitative research - definition? (4)

A
  • Attempts to capture the relationship between 2 things that can be measured numerically.
  • Tests a hypothesis.
  • Descriptive or causal relationship.
  • Results are given in numbers and in terms of statistical significance.
72
Q

Qualitative research -

  • definition? (2)
  • data? (2)
  • sampling? (1)
  • a couple of types?
A
  • Answers Qs about how a phenomenon occurs.
  • Greater subjectivity.
  • Data in words, rather than numbers.
  • Data include interviews, field notes, photos, video, artifacts.
  • Sampling usually not randomized.
  • Includes case studies, policy evaluation.
73
Q

Mixed method research -
Definition?
2 pros?
1 con?

A
  • Mix of the qualities of quantitative & qualitative research.
  • Can strengthen what one method alone can provide.
  • Results may be more generalizable.
  • Can be time consuming.
74
Q

Two types of Mixed method research?

A
  • Concurrent design
  • Sequential design

75
Q

Mixed method research - Concurrent design -
Definition?
AKA?

A
  • Qualitative & quantitative data are collected at the same time.
  • AKA triangulation.
76
Q

Mixed method research - Sequential design -
Definition?
Two types?

A
  • Either qualitative or quantitative data are collected first.
  • Exploratory - qualitative first.
  • Explanatory - quantitative first.
77
Q

Single subject research design? (SSRD) (3)

A
  • Usually quantitative.
  • Measure how receiving or not receiving txt affects
    a single subject or a group who can be treated as a single subject.
  • Often behavioral.
78
Q

Specialized research designs that can be both quantitative or qualitative? (6)

A
  • Descriptive
  • Longitudinal
  • Cross-sectional
  • Survey
  • Action research
  • Pilot study
79
Q

Descriptive research? (3)

Example?

A
  • Describes a phenomenon
  • No intervention
  • No causal info
  • Eg buying habits
80
Q

Longitudinal research?
Definition? (2)
Limitations? (3)

A
  • Repeated assessments over time
  • Track pattern or development
  • Limitations: evaluation costs, cohort effects, attrition
81
Q

Cross-sectional research?

Limitations? (2)

A
  • Examines different groups (w similar characteristics) that differ on the variable of interest (eg age) at a particular point in time
  • Limitation: comparisons can only be inferred since the same individuals are not being studied, so the developmental changes observed may not be real changes
  • Limitation: different age groups may have cohort differences, eg historical experiences
82
Q

Survey research?
Definition?
Includes? (6)
Caveats? (2)

A
  • Select a sample, administer questions
  • Includes written, oral, questionnaires, surveys, interviews, or written statements from participants
  • Surveys are only as good as their design
  • Capabilities of subjects must be considered
83
Q

Action research -
Definition? (2)
Example?

A
  • To improve your own practice or organization.
  • To test new approaches, theories, ideas, teachings
  • Eg needs assessment
84
Q

Pilot study -
Definition?
Advantages? (3)

A

Smaller version of a study used to assess feasibility of larger study
Advantages: increase likelihood of success, identify problems, opportunity to revise

85
Q

3 categories of quantitative research design?

A
  • Nonexperimental
  • Experimental
  • SSRDs- single subject research designs (may contain qualitative components)
86
Q

Nonexperimental research designs?
Category?
Definition? (2)

A
  • Quantitative
  • Exploratory & descriptive, no interventions
  • Observe & outline the properties of a variable
87
Q

Experimental research designs?
Category?
All have in common? (3)

A
  • Quantitative
  • Involve an intervention in which conditions & variables are manipulated
  • Goal - assess cause & effect relationships
  • Random assignment is necessary for most experimental designs
88
Q

Single subject research designs? (SSRDs)
Category?
Definition?

A
  • Primarily quantitative, may contain qualitative components
  • Measure behavioral &/or attitudinal changes across time for an individual or a few individuals
89
Q

4 types of Nonexperimental research designs?

A
  • Descriptive
  • Comparative
  • Correlational
  • Ex post facto design
90
Q

Descriptive research design?
Category & Type of research design?
Definition?
2 kinds?

A
  • Quantitative Non experimental research design
  • Thoroughly describing a variable at one time or over time
  • Simple descriptive designs & Longitudinal designs
91
Q

Simple descriptive designs -
Definition?
What type of research design? (2)
A special type of simple descriptive design?

A
  • One-shot surveys of a variable
  • Quantitative Nonexperimental, descriptive design
  • A special type of simple descriptive design: cross sectional
92
Q
Cross sectional designs -
Category & type of research design?
A special type of?
Definition?
Example
A
  • Quantitative non experimental, descriptive research design
  • A special type of simple descriptive design
  • Involve different groups of participants studied at the same time
  • eg the degree of financial support given by alumni who graduated 1, 5, or 10 yrs ago
93
Q

Longitudinal designs?
What category & type of research design?
3 kinds?

A
  • Quantitative nonexperimental, descriptive design
  • 3 kinds: trend, cohort, panel.

94
Q

Trend study -
Category & Type of research design?
Definition?

A
  • Quantitative nonexperimental, descriptive, longitudinal design
  • Involves assessing the general population over time w new individuals sampled each time data are collected
95
Q

Cohort study -
Category & Type of research design?
Definition?
Example?

A
  • Quantitative, nonexperimental, descriptive, longitudinal design
  • Assessing the same population over time -
    A cohort sample is a group that experiences some type of event (typically birth) in a selected time. This group may be compared over time to another cohort group or other differing group. This alternate comparison group would not have the same associated event or exposure. These studies compare the lives of the differing groups to draw conclusions.
  • Example: A group of graduates that are the same age from different colleges with the same degree are studied every 5 years on how they have progressed.
96
Q

Panel study?
Category & Type of research design?
Definition?

A
  • Quantitative, nonexperimental, descriptive, longitudinal design
  • Panel studies measure the same sample of respondents at different points in time. Similar to cohort studies. Panel studies are particular to such things as age bands or common experience such as first births. These studies may include a much smaller group and still maintain national representation.
97
Q

Comparative research design -
Category & Type of research design?
Definition?
Example?

A
  • Quantitative nonexperimental design
  • Allows researcher to say there is a difference between groups, but cannot say it’s causative
  • Eg racial differences in use of MH services
98
Q

Correlational research design -
Category & Type of research design?
Definition?
Possible values?

A

Quantitative nonexperimental design
Describes strength & direction of a relationship between 2 variables
r=.5 is a moderate positive relationship
r=0 is no relationship
r = ±1 is a perfect correlation

99
Q

Coefficient of determination?

A

Square the correlation coefficient

r=.5 squared is .25 or 25% shared variance between the 2 variables
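Both ideas — the correlation coefficient and its square — can be computed from scratch. A sketch with hypothetical paired scores, using the standard product-moment definition:

```python
from math import sqrt

def pearson_r(x, y):
    # Pearson product-moment correlation from raw deviation scores.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical paired scores.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
r = pearson_r(x, y)
r_squared = r ** 2  # coefficient of determination: shared variance
```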

100
Q
Ex post facto research design?
Category & Type of research design?
AKA?
Definition?
Example?
A

Quantitative nonexperimental design
AKA causal-comparative design
Data already collected; IV cannot be manipulated; no randomization; Looks at how an IV potentially affected a DV
Eg using archival data

101
Q

Experimental research designs - 3 general categories of design?

A

Within subject
Between groups
Split plot

102
Q

Within subject design?

A

Experimental research design
Assess changes within participants in a group as they experience an intervention
Could be before & after intervention (repeated measures)
Could be serial interventions

103
Q

Between groups design?

A

Experimental research design
Looking at effects of intervention between 2+ groups
One group is a control

104
Q

Split plot design?

A

Experimental research design
Assess an intervention on a whole group & assess sub-interventions on subgroups
Eg a mentoring club’s effect on careers, with focus on resumes, interviewing, or shadowing

105
Q

3 degrees of experimental research design?

A

Pre-experimental
True-experimental
Quasi-experimental
See table 8.3

106
Q

Pre-experimental research designs?

3 types?

A

Often don’t use control groups; no random assignment
One-group posttest-only design
One-group pretest-posttest design
Nonequivalent groups posttest-only design

107
Q

One-group posttest-only design?

A

Experimental research designs, Pre-experimental

A group receives an intervention and change is measured

108
Q

One-group pretest-posttest design?

A

Experimental research designs, Pre-experimental

A group is evaluated before & after an intervention

109
Q

Nonequivalent groups posttest-only design?

A

Experimental research designs, Pre-experimental
No attempt made to use equivalent groups
An experimental group & a control group are evaluated after intervention

110
Q

Experimental research designs - True-experiment?

A

AKA randomized experimental designs

At least 2 groups for comparison & random assignment

111
Q

What differentiates true & quasi experiments?

A

Random assignment, usually

112
Q

Five types of True-experiment research designs?

A

Randomized pretest post test control group design
Randomized pretest post test comparison group design
Randomized post test only control group design
Randomized post test only comparison group design
Solomon four-group design

113
Q

Randomized pretest post test control group design?

A

True-experiment research design

2 groups, 1 is a control

114
Q

Randomized pretest post test comparison group design?

A

True-experiment research design
2+ groups
Each receives a distinct intervention

115
Q

Randomized post test only control group design?

A

True-experiment research design

treatment & control group

116
Q

Randomized post test only comparison group design?

A

True-experiment research design

At least 2 groups for comparison & no control group

117
Q

Solomon four-group design?

A
True-experiment research design
Rigorously assesses the effects of the pretest & the intervention 
4 groups:
- Pretest, intervention, post test
- Pretest, post test, no intervention
- Intervention, post test
- Post test
118
Q

Quasi experimental designs?

A

No random assignment.

Existing, non equivalent groups - nested data (classrooms, counseling groups) or naturally occurring groups

119
Q

2 types of Quasi experimental designs?

A

Nonequivalent groups pretest posttest control or comparison group designs
Time series design

120
Q

Nonequivalent groups pretest posttest control or comparison group designs?

A

Quasi experimental design
Keep intact groups
Administer pretest, intervention to 1 group or to at least 2 comparison groups, then give post test

121
Q

Time series design?

A

Quasi experimental design
Repeatedly measure before & after an intervention with 1 group only or using a control group. Eg:
O O O X O O O
O O O O O O
Measures are at equal intervals w same tests

122
Q
SSRDs- single subject research designs -
Category of research?
Definition?
How represented?
Example?
A
  • Quantitative research, may contain qualitative components
  • Repeated measures over time for an individual or group
  • A = the baseline data, B = treatment data, C = a second, different treatment
  • Eg Assess effectiveness of programs
123
Q

3 designs of SSRDs- single subject research designs?

A
  • Within series designs - effectiveness of 1 intervention or program
    (A-B or A-B-C)
  • Between series designs - effectiveness of 2+ interventions for a single variable
  • Multiple baseline designs - assess data for a target behavior across multiple individuals, multiple environments/places, or multiple behaviors
124
Q

SSRDs - Within series designs?

4 types

A
  • Effectiveness of 1 intervention or program
  • A-B designs
  • A-B-C designs - measures interaction among treatment components
  • Changing criterion designs - criterion for success becomes more restrictive to see how much incentive is needed for maximum performance
  • Parametric designs - treatments are compared across phases
125
Q

Frequency distribution?

A
  • The number of observations per possible response (or equal sized intervals of responses) for a variable
  • Rows for each response, columns for freq counts, %, cumulative %
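A sketch of building such a table with the standard library (hypothetical Likert responses; each row holds value, count, %, and cumulative %):

```python
from collections import Counter

# Hypothetical Likert-type responses (values 1-5).
responses = [3, 4, 4, 5, 2, 3, 4, 1, 3, 4]
counts = Counter(responses)
n = len(responses)

table = []
cumulative = 0.0
for value in sorted(counts):
    pct = 100 * counts[value] / n
    cumulative += pct
    # One row per possible response, as in a frequency distribution.
    table.append((value, counts[value], pct, cumulative))
```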
126
Q

Frequency polygon?

A

A line graph of the frequency distribution
X = possible values of the variable
Y = frequency count for each value
Data are ordinal, interval or ratio

127
Q

Histogram?

A

Connected-bar graph showing frequencies of values for a variable
Data are ordinal, interval or ratio

128
Q

Bar graphs?

A

Data are nominal

Separated bars for distinct responses

129
Q

Measures of central tendency?

A

Mean
Median
Mode

130
Q

Mean?

A

Average

An outlier can inflate/deflate the mean

131
Q

Median -
How compute?
When use?

A

The middlemost score, 50% of the scores are above, 50% are below. If the number of scores is even, take the average of the 2 middlemost scores
Use when outliers are present or data is skewed
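The computation described above, as a short sketch with hypothetical scores (note how the outlier inflates the mean but barely moves the median):

```python
def median(scores):
    # Middlemost score; average the two middle scores when n is even.
    s = sorted(scores)
    n = len(s)
    mid = n // 2
    if n % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2

odd = [1, 3, 5, 7, 99]   # 99 is an outlier
even = [1, 3, 5, 7]
mean_odd = sum(odd) / len(odd)  # pulled upward by the outlier
```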

132
Q

Mode?

Types?

A

Most frequently occurring score, not influenced by extreme scores
If there are 2 most frequent scores, it’s bimodal
More than 2, multimodal

133
Q

Variability?

3 types?

A

How dispersed scores are from a measure of central tendency
Range
Standard deviation
Variance

134
Q

Range?

A

Largest value - smallest value + 1 place value

Can be affected by outliers

135
Q

Standard deviation -
Definition?
Percentages under the curve?
Person scores at +2 SD is at what percentile?

A
  • How dispersed scores are around the mean. Most frequently used indicator of variability
  • Mean to 1 SD = 34% of the normal curve
  • Mean to 2 SD = 34 + 14 = 48%
  • Mean to 3 SD = 34 + 14 + 2 = 50%
  • ±1 SD = 68%
  • ±2 SD = 95%
  • ±3 SD = 99%
  • +2 SD = 50% + 48% ≈ 98th percentile
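The percentages on this card can be checked with `statistics.NormalDist` (Python 3.8+); the sample scores below are invented:

```python
from statistics import NormalDist, stdev, variance

z = NormalDist()                       # standard normal curve: mean 0, SD 1
within_1sd = z.cdf(1) - z.cdf(-1)      # ±1 SD -> ~68% of the curve
within_2sd = z.cdf(2) - z.cdf(-2)      # ±2 SD -> ~95%
at_plus_2sd = z.cdf(2)                 # +2 SD -> ~.977, roughly the 98th percentile

sample = [2, 4, 4, 4, 5, 5, 7, 9]      # made-up scores
s = stdev(sample)                      # sample standard deviation
v = variance(sample)                   # variance = SD squared (next card)
```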
136
Q

Variance?

A

Standard deviation squared

137
Q

Skewness?

A
  • an asymmetrical distribution
138
Q
For positively (vs negatively) skewed -
Where are outliers? 
Where are most scores?
Where is the tail?
Where is mode, mean, median?
Skewness index?
A
- Pos - 
outliers at high end
more scores are lower
tail to the right
order from left to right: mode, median, mean
skewness index +0.01 to +1.00
139
Q

Skewness index looks like?

A

+ for positive skew, - for negative skew

Not skewed = -1.00 to +1.00
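One common skewness index is the moment coefficient g1; a minimal pure-Python sketch (data invented) shows the sign convention from this card:

```python
def skewness(xs):
    """Fisher-Pearson moment coefficient of skewness (g1): + means right tail."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n   # second central moment
    m3 = sum((x - mean) ** 3 for x in xs) / n   # third central moment
    return m3 / m2 ** 1.5

right_tailed = [1, 2, 2, 3, 3, 3, 4, 20]        # outlier at the high end -> + skew
left_tailed = [1, 17, 18, 18, 19, 19, 19, 20]   # outlier at the low end -> - skew
```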

140
Q

Kurtosis?

A
  • how peaked or flat a curve is
141
Q

3 types of kurtosis?

A
  • Mesokurtic - normal curve, kurtosis = -1 to +1
  • Leptokurtic - tall & thin, kurtosis > 1
  • Platykurtic - low & wide, kurtosis < -1
142
Q

Inferential statistics -
Definition?
2 types?

A
  • Infers conclusions about a population from sample data. Based on the probability of particular differences occurring
  • Parametric & Nonparametric
143
Q

Statistical assumptions for parametric statistics? (3)

A

Normal distribution, approximately
Randomly selected samples
Interval or ratio scales for each variable

144
Q

When use Nonparametric statistics?

A

When Statistical assumptions aren’t met

Can be used even if they are met, as Nonparametric stats are robust

145
Q

Correlation coefficient indicates what 3 things?

A

Presence of a relationship
Direction (+ or -)
Strength - the higher the absolute value
NOT causation
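A hand-rolled Pearson r (with invented data) shows all three pieces of information at once, the sign giving direction and the absolute value giving strength:

```python
def pearson_r(xs, ys):
    """Pearson correlation: sign = direction, |r| = strength, never causation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

study_hours = [1, 2, 3, 4, 5]              # hypothetical data
errors_made = [10, 8, 6, 4, 2]
r = pearson_r(study_hours, errors_made)    # -1.0: perfect negative relationship
r_squared = r ** 2                         # coefficient of determination
```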

146
Q

Types of correlation?

A

Pearson r - for 2 continuous (interval/ratio) variables
Spearman rho - for rank-ordered variables
Point biserial - comparing 1 continuous & 1 true dichotomous variable
Biserial - comparing 1 continuous & 1 artificially dichotomized variable

147
Q

Perfect correlation?

A

+1.00 or -1.00

148
Q

Spurious correlation?

A

Another variable is really responsible for the relationship

149
Q

Attenuated correlation?

A

Measures are unreliable and show a low relationship

150
Q

Restriction of range problem w correlation?

A

sample isn’t representative

151
Q

Coefficient of determination?

A

The proportion of variance shared by 2 correlated variables; equal to the square of the correlation (r²)

152
Q

Regression -
Definition?
3 types?

A
  • Regression studies are prediction studies and are extensions of correlational studies
  • Bivariate, Multiple, Logistic regression
153
Q

Bivariate regression?

A

How well scores from IV (predictor variable) predict scores on DV (criterion variable)
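A least-squares sketch of bivariate prediction; the test-score/GPA numbers are made up for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for 1 predictor (bivariate regression)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical: admission-test score (IV/predictor) predicting GPA (DV/criterion)
test_scores = [300, 310, 320, 330, 340]
gpas = [2.8, 3.0, 3.2, 3.4, 3.6]
slope, intercept = fit_line(test_scores, gpas)
predicted_gpa = slope * 325 + intercept    # predicted DV for a new score of 325
```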

154
Q

Multiple regression?

A
  • More than 1 IV/predictor variable is used to predict DV/criterion variable
  • Each predictor is weighted to determine its contribution
  • Generally, the more predictor variables, the stronger the prediction
155
Q

Logistic regression?

A
  • How well scores from 1 or more IVs (predictor variable) predict scores on DV (criterion variable)
  • DV is dichotomous
156
Q

Parametric statistics -
Used when?
List of 6?

A
- Used when statistical assumptions are met. 
T-test
ANOVA
Factorial ANOVA
ANCOVA
MANOVA
MANCOVA
157
Q

Nonparametric statistics -
Used when? (3)
List of 6?

A
  • Used when:
    • only a few assumptions can be made about the distribution of scores
    • data is nominal or ordinal
    • interval or ratio data is skewed
  • Chi-square
  • Mann-Whitney U
  • Kolmogorov-Smirnov Z
  • Kruskal-Wallis
  • Wilcoxon’s signed ranks
  • Friedman’s ranks
158
Q
T-test - 
Para/Nonparametric?
Definition?
2 kinds?
Score looks like?
A
  • Parametric
  • Compares 2 means for 1 DV
  • Independent t-test - 2 separate groups compared on 1 DV
  • Dependent t-test - repeated measures w same group or paired-subject groups
  • Score is a t ratio
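A pooled-variance independent t ratio in plain Python; the group scores are invented:

```python
from statistics import mean, variance

def independent_t(group_a, group_b):
    """t ratio comparing 2 separate groups' means on 1 DV (pooled variance)."""
    na, nb = len(group_a), len(group_b)
    pooled = ((na - 1) * variance(group_a) +
              (nb - 1) * variance(group_b)) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / (pooled * (1 / na + 1 / nb)) ** 0.5

treatment = [14, 15, 13, 16, 15]          # hypothetical scores
control = [10, 12, 11, 13, 10]
t = independent_t(treatment, control)     # positive t: treatment mean is higher
```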
159
Q
ANOVA -
Para/Nonparametric?
Definition?
Example?
Score looks like?
Post hoc analysis?
A
  • Parametric
  • At least 1 IV with 3 or more groups/levels
  • Eg income with 3 ranges defining 3 levels
  • Score is an F ratio
  • post hoc analysis allows examination of every possible pairing of groups after finding main effects
160
Q

Factorial ANOVA?
Para/Nonparametric?
Definition?

A
  • Parametric
  • More than 1 IV; not trying to control statistically for a covariate
  • Yields both main effects & interaction effects, using post hoc analysis
161
Q

Analysis of covariance, ANCOVA -
Para/Nonparametric?
Definition?
Example?

A
  • Parametric
  • An IV that is a covariate must be statistically controlled for in order to look at the relationship of other IVs and the DV.
  • Eg removing the effects of gender while looking at the relationship between income & work satisfaction
162
Q

Multivariate analysis of variance, MANOVA -
Para/Nonparametric?
Definition?

A
  • Parametric

- Similar to ANOVA, but w multiple dependent variables

163
Q

Multivariate analysis of covariance, MANCOVA -
Para/Nonparametric?
Definition?

A
  • Parametric

- Similar to ANCOVA, but w multiple dependent variables

164
Q

Chi-square -
Para/Nonparametric?
Definition?
Example?

A
  • Nonparametric
  • Used w 2+ categorical or nominal variables, where each variable contains 2+ categories. All scores must be independent - the same person cannot be in multiple categories of the same variable.
    Observed frequencies are compared to expected frequencies
  • eg decision to quit counseling (Y/N) by type of counseling (CBT, Rogerian)
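The observed-vs-expected comparison can be sketched directly from the card's example; the counts are invented:

```python
# Observed counts: quit counseling (rows: Yes/No) by type (cols: CBT/Rogerian)
observed = [[10, 20],
            [30, 40]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand = sum(row_totals)

chi_square = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand   # expected frequency
        chi_square += (obs - expected) ** 2 / expected
```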
165
Q

Mann-Whitney U -
Para/Nonparametric?
Definition?
Example?

A
  • Nonparametric
  • analogous to t-test; uses ordinal data; compares ranks from 2 groups
  • eg 9th vs 12th graders (IV, 2 groups) compared on rank-ordered educational aspirations (HS, BA, MA) as DV.
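The U statistic itself is just a pairwise rank comparison between the 2 groups; a minimal sketch with invented ordinal aspiration scores:

```python
def mann_whitney_u(group_a, group_b):
    """U for group_a: count of pairs where a outranks b (ties count one half)."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1
            elif a == b:
                u += 0.5
    return u

# Hypothetical ordinal aspirations (1=HS, 2=BA, 3=MA) for 2 groups of students
ninth_graders = [1, 1, 2, 2, 3]
twelfth_graders = [2, 3, 3, 3, 2]
u = mann_whitney_u(twelfth_graders, ninth_graders)
```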
166
Q

Kolmogorov-Smirnov Z -
Para/Nonparametric?
Definition?

A
  • Nonparametric

- analogous to t-test and U test, but for N less than 25.

167
Q

Kruskal-Wallis -
Para/Nonparametric?
Definition?

A
  • Nonparametric

- analogous to ANOVA; IV has 3+ groups/levels

168
Q

Wilcoxon’s signed ranks -
Para/Nonparametric?
Definition?
Example?

A
  • Nonparametric
  • analogous to dependent t-test
  • eg to assess changes in perceived level of competence before & after a training
169
Q

Friedman’s rank test -
Para/Nonparametric?
Definition?

A
  • Nonparametric
  • analogous to the dependent t-test but for 2+ repeated measures/comparison groups (a nonparametric counterpart of repeated-measures ANOVA)
170
Q

Factor analysis?

Factors?

A

Reduces a larger number of variables to a smaller number of groups or factors.
Factors explain covariation among variables; each factor explains a percentage of variance.
Example: checking construct validity in a test which has several items that are supposed to be about the same construct.

171
Q

2 forms of factor analysis?

A

Exploratory factor analysis, EFA

Confirmatory factor analysis, CFA

172
Q

Exploratory factor analysis, EFA
The 2 steps?
The 3 types within the 1st step?
The 2 types within the 2nd step?

A
  • Extraction of factors - clumping factors of interest
    • principal axis factoring
    • principal components analysis
    • maximum likelihood method
  • Factor Rotation & interpretation of those factors - cleaning them up
    • orthogonal - factors are uncorrelated
    • oblique - factors are correlated
173
Q

Confirmatory factor analysis, CFA?

A

Confirming the EFA results. Most common method is the maximum likelihood method

174
Q

Confirmatory factor analysis - fit index?

A

After attaining a factor solution, one tests how the overall model fits the data.

175
Q

Meta-analysis?

A

Used to combine & synthesize results of numerous similar studies for particular outcome or DVs.

176
Q

Steps in Meta-analysis? (5)

A
  • Establish criteria based on operational definitions
  • Locate studies based on criteria
  • Consider possible IVs
  • Calculate an effect size on any outcome variable in the study. The DV in a meta-analysis is the effect size of the outcome.
  • Effect sizes are grouped according to IV of interest & compared & combined across studies
177
Q

Effect size?

A
  • A measure of the strength of the relationship between 2 variables in a population.
  • An effect size expresses the increase or decrease in achievement of an experimental group (of students) in standard deviation units. If an effect size for a study is 1, the average score of students in the experimental group is 1 standard deviation higher than the average score of the control group.
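That definition maps directly onto Cohen's d; a sketch with made-up scores constructed so the experimental mean sits exactly 1 pooled SD above the control mean:

```python
from statistics import mean, stdev

def cohens_d(experimental, control):
    """Mean difference expressed in pooled standard deviation units."""
    na, nb = len(experimental), len(control)
    pooled_sd = (((na - 1) * stdev(experimental) ** 2 +
                  (nb - 1) * stdev(control) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(experimental) - mean(control)) / pooled_sd

experimental = [50, 52, 54]   # hypothetical; mean 52, SD 2
control = [48, 50, 52]        # hypothetical; mean 50, SD 2
d = cohens_d(experimental, control)   # 1.0 -> 1 SD advantage for experimental
```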
178
Q

Qualitative research design - definition?

A

Involves the study of processes, participants’ meaning of phenomena, or both, usually in a natural setting

179
Q

Qualitative research design - possible characteristics? (15)

A
  • Process or sequence of a phenomenon
  • Evolving theory
  • Thick description
  • Effect of researchers’ participation
  • Maximize validity
  • Participants’ narratives
  • Purposive sampling
  • In depth and detail
  • Contextual
  • Discovery-oriented
  • Creation of meaning
  • Fieldwork
  • Inductive
  • Document analysis
  • Reflexivity
180
Q

Qualitative research - 7 major research traditions?

A
Case study
Phenomenology 
Grounded theory 
Consensual qualitative research, CQR
Ethnography 
Biography
Participatory action research
181
Q
Case study -
Type of research?
Participants roles?
What is a case?
Example?
A
  • more often qualitative, can be quantitative
    Participants are active in data collection
  • A case is a distinct system of an event, process, setting, or individual or a small group of individuals.
  • Eg: implementation of no child left behind laws, or exploration of a counseling process.
182
Q

Phenomenology -
Type of research?
Definition?
Try to assess?

A
  • Qualitative
  • Used to discover the meaning or essence of participants’ lived experiences.
  • Assess participants’ intentionality (ie internal experience of being conscious of something).
183
Q

Grounded theory?
Type of research?
Definition? (3)

A
  • Qualitative
  • generate theory grounded in data from participants’ perspectives
  • Inductive
  • Theories often explain a process or action
184
Q

Consensual qualitative research, CQR?
Type of research?
Definition? (4)

A
  • Qualitative
  • Combines phenomenology & grounded theory
  • selects participants who are very knowledgeable about a topic, remaining close to data, without major interpretation, with hope of generalizing to larger population
  • researchers often reflect on their own experiences when developing interview Qs
  • consensus is key
  • shared power
185
Q

Ethnography -
Type of research?
Definition? (2)
Example?

A
  • Qualitative
  • researcher describes & interprets a culture - process & experience of culture, socialization process
  • participant observation is used
  • eg studying a local community re its methods for addressing MH concerns
186
Q

Biography -
Type of research?
Definition? (3)
Methods?

A
  • Qualitative
  • identify personal meanings individuals give to their social experience
  • gathers stories, explore meanings, examine fit w broader social/historical context
  • methods: life history, oral history
187
Q

Participatory action research -
Type of research?
Definition? (3)
Example?

A
  • Qualitative
  • Focuses on change of the participants & researcher as a result of qualitative inquiry
  • goals are emancipation & transformation
  • researchers critically reflect on power of research as a change agent
  • Eg working w a community agency & its clients to improve the agency
188
Q

Purposive/purposeful sampling?

A
  • Obtain info rich cases for maximum depth & detail

- seek sample sizes that reach saturation (no new data)

189
Q

Purposive/purposeful sampling - 15 types?

A
Convenience
Maximum variation 
Homogeneous 
Stratified purposeful 
purposeful random
Comprehensive 
Typical case 
Intense case 
Critical case 
Extreme/deviant case
Snowball/chain/network 
Criterion
Opportunistic/emergent
Theoretical 
Confirming/disconfirming
190
Q

Convenience?

Type of sampling?

A
  • Based on availability
  • Least desirable/trustworthy
  • Purposive/purposeful sampling
191
Q

Maximum variation

Type of sampling?

A
  • Purposive/purposeful sampling

- eg teachers of diverse backgrounds from diverse types of HSs, w training in various grade levels & forms of math

192
Q

Homogeneous?

Type of sampling?

A
  • Purposive/purposeful sampling

- selecting participants w theoretically similar experiences

193
Q

Stratified purposeful?
Type of sampling?
AKA?

A
  • Eg 6 samples of teachers for each type of math course offered in HS
  • Purposive/purposeful sampling
  • AKA samples within samples
194
Q

Purposeful random?

Type of sampling?

A
  • Identifying a sample & randomly selecting participants from it.
  • Purposive/purposeful sampling
195
Q

Comprehensive?
Example?
Useful when?
Type of sampling?

A
  • All teachers at a particular school.
  • Useful when a case has few participants.
  • Purposive/purposeful sampling
196
Q

Typical case?

Type of sampling?

A
  • Selecting the average participant w typical experience

- Purposive/purposeful sampling

197
Q

Intense case?
Example?
Type of sampling?

A
  • Identifying those w intense but not extreme experience
  • Teachers of advanced math courses
  • Purposive/purposeful sampling
198
Q

Critical case?

Type of sampling?

A
  • Sampling those w intense or irregular experience

- Purposive/purposeful sampling

199
Q

Extreme or deviant?

Type of sampling?

A
  • Looking for the boundaries of difference. May look at poles of experience, or just 1 pole
  • Purposive/purposeful sampling
200
Q

Snowball/chain/network ?
Used when?
Type of sampling?

A
  • Participants are found by obtaining recommendations of earlier participants
  • Used when sample is difficult to obtain
  • Purposive/purposeful sampling
201
Q

Criterion?

Type of sampling?

A
  • Selecting cases that meet criteria

- Purposive/purposeful sampling

202
Q

Opportunistic/emergent?

Type of sampling?

A
  • Changing one’s research design to include a particular individual
  • Purposive/purposeful sampling
203
Q

Theoretical?

Type of sampling?

A
  • As theory evolves, sampling those who best contribute info

- Purposive/purposeful sampling

204
Q

Confirming/disconfirming case?

Type of sampling?

A
  • Including cases that add depth or provide exceptions

- Purposive/purposeful sampling

205
Q

Qualitative data collection methods -
Best practice?
The 3 methods?

A
  • Use multiple methods
  • Interviews
  • Observations
  • Unobtrusive methods
206
Q

Qualitative data collection - interviews -
3 types of structure?
2 types of subjects?

A
  • Unstructured - no preset Qs
  • Semistructured - flexibility to add/delete Qs
  • Structured - standardized
  • Individuals - sensitive topics
  • Focus groups of 6-12 - get social interaction
207
Q

Qualitative data collection - observational methods -
Gather? (1)
Reflect on? (1)
Use? (4)

A
  • Gather description of setting/context
  • Reflect on content & process
  • Use:
    • fieldwork
    • memoing
    • rubrics
    • participant observer is most common, but zero to full interaction is possible
208
Q

Qualitative data collection - Unobtrusive methods -
Interactions?
Includes?

A
  • usually don’t interact w participants

- photos, videos, docs, archival data, artifacts

209
Q

Qualitative data management- contact summary sheet?

A

A single page snapshot of a specific contact

210
Q

Qualitative data management - document summary form?

A

Similar to contact summary sheet, but for unobtrusive data sources eg letters, photos

211
Q

Qualitative data management - data display?

A

Presents organized data in a table or figure

212
Q

Qualitative data analysis - inductive analysis?

A

Involves searching for keywords & themes without preconceived notions about theories. The data allow notions of a phenomenon to emerge.

213
Q

Qualitative data analysis - steps? (6)

A
  • write memos throughout the research
  • write initial summary
  • organize & segment the text
  • code the data
  • search for patterns to address research Qs
  • decide on main themes, describe, discuss
214
Q

Qualitative research- trustworthiness -
Definition?
4 components?

A
  • The validity or truthfulness of findings
  • credibility - accurate?
  • transferability - to other contexts
  • Dependability - consistency over time, across researchers
  • confirmability - biases & assumptions controlled?
215
Q

Strategies that maximize trustworthiness of qualitative data? (10)

A

- Prolonged engagement
- Persistent observation - depth
- Triangulation - multiple sources of data
- Peer debriefing - check w peers outside the study
- Member checking - consult participants to verify findings
- Negative case analysis - inconsistencies that might refute findings
- Referential adequacy - check against data collected at various times during the study
- Thick description - describe collection & analysis in detail
- Auditing - unbiased outsider to review study
- Reflexive journal - memos done during study to reduce counselor bias

216
Q

Program evaluation -
Definition?
Purpose?
Due to?

A
  • Assessment of a program, at any stage
  • To improve quality
  • Often due to recent emphasis on evidence-based treatment, accountability, and external funding
217
Q

Questions that program evaluation addresses? (8)

A
  • Is a program needed
  • For whom and how long
  • Was the program implemented as planned
  • Resources properly used
  • What are the program outcomes
  • Which programs have the best outcomes
  • Are benefits maintained over time
  • Do benefits outweigh costs
218
Q

How does program evaluation differ from research?

A
  • Program evaluation usually leads to narrow applicability/generalizability
  • Program evaluation is usually diffuse, done w individuals w different roles
219
Q

4 types of program evaluation?

A

Needs assessment
Process evaluation - ensure activities match plans
Outcome evaluation - how participants are performing
Efficiency analysis - do gains outweigh costs

220
Q

Program evaluation - accountability?

A

Providing feedback to stakeholders

221
Q

Program evaluation - stakeholders?

A

Any individuals involved or affected by the program

Eg family, administrators, clients, community leaders, funding agencies, schools

222
Q

Program evaluation - formative evaluations?

A

Evaluation throughout implementation to ensure program is done as planned
With changes as needed from stakeholder feedback

223
Q

Program evaluation - summative evaluation?

A

Assessment of the whole program and the degree to which it meets goals & objectives

224
Q

General steps in program evaluation? (9)

A
  • Identify the program to be evaluated
  • Plan the evaluation - decide research design
  • Conduct needs assessment & make recommendations
  • Define what “success” is - short & long term
  • Select data sources - use multiple
  • Monitor & evaluate program progress
  • Determine the degree to which the program is successful
  • Analyze the program’s efficiency
  • Continue, revise, or stop the program
225
Q

Program evaluation - Needs assessment? (7)

A
  • Look at similar programs, lit review
  • Understand needs of a client population
  • Develop/revise program goals & objectives
  • Establish advisory committee of stakeholders
  • Work out details - target pop, stakeholders
  • Develop program objectives
  • Write executive summary - including data sources, data analyses, findings and recommendations
226
Q

ABCD model of developing program objectives?

A
Audience
Behavior 
Conditions
Degree (of expected change)
Eg, the client will decrease substance use by 2 beers/week as reported by family
227
Q

Process evaluation -
AKA?
Consists of (3)
Used by?

A
Program monitoring
Evaluate progress at various points to ensure:
-implemented as planned
-met outcomes expected
-methods were the best available 
Often used by government social programs
228
Q

Outcome evaluation -
Definition?
3 aspects that can be evaluated?

A

Measure the effectiveness of the program at the end
Usually by determining 1 of 3 aspects:
-more effective than no intervention
-more effective than another program
-the degree to which it was more effective than another program

229
Q

Efficiency analysis -
AKA?
Example?
Caveat?

A

Cost benefit analysis
Eg, Does a more expensive treatment lead to a better outcome?
Costs aren’t necessarily monetary, and cost benefit may not be the best evaluation method

230
Q

Program evaluation models/strategies? (11)

A
Treatment package strategy or social science research model
Comparative outcome strategy
Dismantling strategy 
Constructive strategy
Parametric strategy
Common factors control group strategy
Moderation design strategy
Objectives-based evaluation model
Expert opinion model
Success case method
Improvement focused approach
231
Q

Program evaluation - Treatment package strategy or social science research model?
Example?

A

Control and treatment group are compared

Eg, Prevention program is done in one community and number of incidences of bullying are compared for 2 communities

232
Q

Program evaluation - Comparative outcome strategy?

Example?

A

2+ programs or interventions are compared

Eg outcomes for a bullying prevention program is compared to those for a program in another community

233
Q

Program evaluation - Dismantling strategy?

Example?

A

Components of a program are evaluated to determine which parts are effective
Eg kids, parents, teachers, counsellors are interviewed about several aspects of a bullying program eg PSAs, workshops, support groups

234
Q

Program evaluation - Constructive strategy?

Example?

A

A new component is added to an already effective program and evaluated for added value
Eg adding a required group for perpetrators of bullying to see if number of victims will decrease even more

235
Q

Program evaluation - Parametric strategy?

Example?

A

A program is evaluated at different stages to determine the best time to evaluate it
Eg evaluate weekly, monthly, every 6 months to see which is best for future program evaluations

236
Q

Program evaluation - Common factors control group strategy?

Example?

A

Determine whether a specific component or common factors of a program results in its effectiveness
Eg kids are surveyed about services to see which were most beneficial to them

237
Q

Program evaluation - moderation design strategy?

Example?

A

Participants and other stakeholders are assessed to see who might benefit most from a program
Eg interview various kids and determine the degree to which they perceived the program to be effective for them

238
Q

Program evaluation - objectives based evaluation model?

Example?

A

Determine if goals and objectives were met

Eg compare the number of times a kid is victimized to the reduction objective

239
Q

Program evaluation - Expert opinion model?

Example?

A

An outside neutral expert examines the program process and outcome
Eg experts in bullying review a program and determine if it should get future funding

240
Q

Program evaluation - success case method?

Example?

A

Info is sought from individuals who benefited most from the program
Eg observe counsellors who seem to be intervening in bullying reports most effectively

241
Q

Program evaluation - improvement focused approach?

Example?

A

Ineffective program components are reviewed to figure out what went wrong
Eg interview a perpetrator after bullying incident to understand why the program is not preventing the behavior