R - done Flashcards

0
Q

Post positivism paradigm -
Definition?
Associated w which type of research?

A

Truth can only be approximated because of error in measurement.
More prevalent in quantitative research.

1
Q

Positivism paradigm -
Definition?
Associated w which type of research?

A

Objective truth exists and can only be understood if directly measurable.
Tied to quantitative research.

2
Q

Constructivism paradigm -
AKA?
Definition?
Associated w which type of research?

A

Interpretivism.
There are multiple realities or perspectives on any given phenomenon.
Qualitative research.

3
Q

Critical/ideological paradigm -
Definition?
Associated w which type of research?

A
  • Researchers take a proactive role and
    confront social structures affecting oppressed groups.
  • Qualitative research.
4
Q

Ethics in research include? (4)

A
  • Informed consent w right to decline.
  • Risks as well as benefits.
  • Human studies review board.
  • Debriefing, especially if deception was used.
5
Q

Nazi medical war crimes -

Ethics violated?

A

Deceived, exploited, and tortured prisoners in the name of research.

6
Q

Milgram study -
Studied what?
Ethics violated?

A
  • Milgram obedience study.
  • Subjects were instructed to shock "learners" when they answered incorrectly; 65% of subjects administered the maximum shock.
  • Participants were deceived, emotionally harmed, and not debriefed.
7
Q

Tuskegee study -
Studied what?
Ethics violated?

A
  • Tuskegee syphilis study.
  • Deceived participants: did not tell them their correct Dx or that effective Txt (penicillin) was available once it came out.
8
Q

Jewish Chronic Disease Hospital study -
Studied what?
Ethics violated?

A

Subjects & controls were injected w live cancer cells, but not informed.

9
Q

Willowbrook study -
Studied what?
Ethics violated?

A
  • Kids at a school for the mentally disabled were injected w hepatitis.
  • Parents who wanted to enroll their kids signed consent, but were never told they could refuse or about effects.
10
Q

Legal standards for research? (2)

A
  • Use of human studies review board.

  • HIPAA protects private health information.

11
Q

Human studies review board?

A

An IRB must be used by all federally funded institutions that do research w humans,
and it covers any research conducted by such institutions, even studies that aren't themselves federally funded.

12
Q

45 CFR 46?

A

Code of federal regulations, title 45, part 46, contains policies to guide researchers using human subjects, including use of an institutional review board.

13
Q

Independent variable?

A

Construct that is manipulated or controlled in some way.

14
Q

Dependent variable?

A

The outcome variable that is checked for influence by the independent variable.

15
Q

Extraneous variables?

A

Other variables, besides the independent variable, that could affect the dependent variable.

16
Q

Confounding variable?

A

An extraneous variable that the experimenter has not controlled for and that affects the dependent variable.

17
Q

Descriptive vs relational vs causal research Qs?

A
  • Descriptive - examines what exists: counts, averages, descriptive stats
  • Relational - relationships between variables: correlations
  • Causal - cause-effect relationships
18
Q

Research hypothesis?

A

A concise, testable statement of the expected relationship between 2 or more variables.

19
Q

Directional vs nondirectional hypotheses?

A

Directional - indicates direction of relationship, eg positive correlation.
Nondirectional - doesn’t indicate direction of relationship

20
Q

Null hypothesis?

A

There is no relationship between IV & DV.

21
Q

Alternative hypothesis?

A
  • The experimental hypothesis: there is a relationship between IV & DV.
  • Also used for rival explanations involving extraneous variables, developed so they can be ruled out.
22
Q

Significance level -
AKA?
Definition?

A

AKA the alpha value.
The threshold for rejecting the null hypothesis: reject the null when the obtained p value is less than or equal to alpha.

23
Q

P value?

A
  • A p value is the probability of obtaining results at least as extreme as those observed if the null hypothesis is true (ie, of a false positive for the experimental hypothesis).
  • A p value at or below the chosen alpha (commonly .05 or .01) indicates significant results.
  • Default is a two-tailed test: when alpha = .05, there is a .025 cutoff region in each tail.
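The decision rule on this card can be sketched in Python (a hedged illustration; the function name and p values are made up, not from the source):

```python
# Minimal sketch of the significance decision described above.
def decide(p_value, alpha=0.05):
    """Reject the null hypothesis when p <= alpha."""
    return "reject null" if p_value <= alpha else "fail to reject null"

# With a two-tailed test at alpha = .05, each tail holds .025.
alpha = 0.05
per_tail_cutoff = alpha / 2  # 0.025

print(decide(0.03))   # reject null
print(decide(0.20))   # fail to reject null
```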
24
Type I error?
- Rejecting the null when it's true, or | - A false positive result for the experimental hypothesis.
25
Type II error?
- Failing to reject the null hypothesis when it's false, or | - Rejecting the experimental hypothesis when it's true.
26
Alpha?
The probability of a type I error.
27
Beta?
The probability of a type II error.
28
Power?
- The likelihood of detecting a significant relationship between variables when there is one. - Power = 1 - beta, ie the probability of avoiding a type II error.
29
How can power be increased? (6)
- increase alpha - increase sample size - increase effect size - minimize error - use a one-tailed test - use a parametric statistic
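Several of these levers can be verified with a rough one-sample z-test power approximation (a sketch under stated assumptions: the function name, effect sizes, and sample sizes here are illustrative, not from the source):

```python
import math
from statistics import NormalDist

def z_test_power(effect_size, n, alpha=0.05, one_tailed=True):
    """Approximate power of a one-sample z-test (illustrative sketch)."""
    nd = NormalDist()
    tail = alpha if one_tailed else alpha / 2
    z_crit = nd.inv_cdf(1 - tail)
    # Probability the test statistic exceeds the critical value when the
    # true standardized effect is effect_size:
    return 1 - nd.cdf(z_crit - effect_size * math.sqrt(n))

base = z_test_power(0.5, 25)                            # roughly .80
print(z_test_power(0.5, 50) > base)                     # larger n -> more power
print(z_test_power(0.8, 25) > base)                     # larger effect -> more power
print(z_test_power(0.5, 25, alpha=0.10) > base)         # larger alpha -> more power
print(z_test_power(0.5, 25, one_tailed=False) < base)   # two-tailed -> less power
```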
30
Probability vs nonprobability sampling?
- Probability sampling - all persons in a known population have a chance of being selected, therefore more likely to reflect the whole population. - Non-probability sampling - accessing samples of convenience.
31
Probability sampling methods? (5)
- simple random - systematic - stratified random - cluster - multistage
32
Non-probability sampling methods (3)?
- Convenience - Purposeful - Quota
33
Simple random sampling - Type? Definition?
- Probability sampling method. | - Every member of the population has an equal chance of being selected.
34
Systematic sampling - Type? Definition?
- Probability sampling method. | - Every nth element is chosen.
35
Stratified random sampling - Type? Definition?
- Probability sampling method. | - A pop is divided into subgroups, eg by gender, race, and samples are drawn randomly from the subgroups.
36
Cluster sampling - Type? Definition? Caveat?
- Probability sampling method. - Identify/list existing groups - Take a random sample of groups - All subjects within the selected groups are included (no sampling within groups) - Less representative than other methods.
37
Multi-stage sampling - Type? Definition & examples?
- Probability sampling method. - Common in cluster sampling. - Might be a 2-stage random sample (eg randomly select 60 schools, then 10 classes from each school). - Might be a 3-stage random sample (eg randomly select 200 school districts, then 20 schools from each district, then 10 classes from each school), and so forth.
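The first three probability sampling methods above can be sketched in Python (a hedged illustration; the population, strata, and sample sizes are invented):

```python
import random

# Hypothetical population of 100 member IDs; seeded RNG for reproducibility.
population = list(range(100))
rng = random.Random(0)

# Simple random: every member has an equal chance of selection.
simple = rng.sample(population, 10)

# Systematic: every nth element after a random start.
n = len(population) // 10            # n = 10
start = rng.randrange(n)
systematic = population[start::n]

# Stratified random: divide into subgroups, sample randomly within each.
strata = {"group_a": population[:60], "group_b": population[60:]}
stratified = [m for members in strata.values() for m in rng.sample(members, 5)]

print(len(simple), len(systematic), len(stratified))  # 10 10 10
```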
38
Convenience sampling - Type? Definition? (2) Caveat? Example?
- Non probability sampling. - An easily accessible population. - Most common method. - Most likely doesn't fully represent the population of interest. - Eg survey clients willing to participate.
39
Purposeful sampling?
- Non probability sampling. - Select a sample from a population which will be most informative about a topic of interest. - Participants are selected because they represent needed characteristics
40
Quota sampling?
- Non probability sampling. - Similar to cluster & stratified, but no randomization. - Draw the needed # of participants with the needed characteristics (eg race, gender) from the convenience sample.
41
Randomization - Purpose? 2 Types?
- Helps to maximize the credibility & generalizability of a study's findings. - random assignment - random selection.
42
Random selection - Definition? Related to what type of validity?
- Every member of the population has an equal chance of being selected. - Closely related to external validity.
43
Random assignment Definition? Related to what type of validity?
- Randomly assigning participants to different groups. Helps ensure groups are equivalent & that any systematic differences are due to chance. - Closely related to internal validity.
44
Experimental vs control groups?
- An experimental or treatment group is receiving active treatment. - A control group is a group w similar characteristics that does not receive the experimental treatment.
45
Types of control groups? (3)
- Wait list - Placebo - Treatment as usual (TAU)
46
Blind study?
Participants aren't aware whether they are getting the experimental treatment.
47
Double Blind study? | Reduces risk of?
- Neither experimenter nor Participants know who is getting the experimental treatment. - Reduces placebo effect & researcher bias.
48
Placebo effect? | % showing placebo effect?
- Positive effects without treatment. | - 20-30% of participants may show placebo effect.
49
Internal validity - Definition? What strengthens it? (1)
- Means change in the DV is due to the IV. | - Control of extraneous variables strengthens it.
50
Some threats to internal validity? (10)
- History - Selection - Statistical regression - Testing (learning the test) - Instrumentation - Attrition - Maturation - Diffusion of treatment (groups talk to each other about their txt) - Experimenter effects - Subject effects
51
Threat to internal validity- History?
Extraneous events occurring during the expt, inside or outside the study.
52
Threat to internal validity- Selection?
Group differences exist before the intervention due to lack of random assignment, eg a co-occurring variable that affects the DV.
53
Threat to internal validity- Statistical regression? (3)
- Statistical phenomenon of regression toward the mean. - Scores of participants who were selected because of their extreme score (eg very depressed) are affected. - May look like improvement or worsening.
54
Threat to internal validity- Testing? (4)
- An issue when pretests are involved. - Practice effects - Memory effects - Developing familiarity w the test
55
Threat to internal validity- instrumentation? | Examples?
- Changes in the instrument affect results. | - Eg paper, computer, evaluator.
56
Threat to internal validity- attrition? (2)
- Individuals systematically drop out. | - Esp problem for longitudinal studies.
57
Threat to internal validity- maturation? (3)
- Changes in a participant over time affect the DV. - Tend to be normal developmental changes. - Includes fatigue.
58
Threat to internal validity- diffusion of treatment?
Problem when groups have contact & effects of an intervention are felt in another group.
59
Threat to internal validity- experimenter effects? | 2 types?
- Bias of the investigator affects participants. - Halo effect. - Hawthorne effect.
60
Halo effect?
The investigator's subjective perception of one characteristic is generalized to perceptions of other traits.
61
Hawthorne effect - Definition? AKA?
- The presence of an investigator influences responses independent of any intervention. - AKA reactivity.
62
Threat to internal validity- subject effects?
Participants pick up cues (ie demand characteristics).
63
External validity - Definition? Investigators must?
- The ability to generalize results to a larger group. | - Investigators must describe participants, variables, procedures & settings so readers can ascertain generalizability.
64
Threats to external validity? (5)
- Novelty - Experimenter - Measurement of the DV - Time of measurement by treatment - History by treatment effects
65
External validity threat - novelty effect?
A txt produces positive results just because it is novel to participants.
66
External validity threat - experimenter effect - Definition? 2 types?
- Same as for internal threat: Bias of the investigator affects participants. - Halo effect. - Hawthorne effect.
67
External validity threat - history by txt effect?
An experiment is conducted in a time period full of contextual factors that can't be duplicated.
68
External validity threat - measurement of the DV?
Similar to instrumentation threat, the effectiveness of a txt may depend on the type of measurement used.
69
External validity threat - time of measurement by txt effect?
Timing of post test may influence post test results.
70
Four main types of research?
- Quantitative - Qualitative - Mixed method - Single subject
71
Quantitative research - definition? (4)
- Attempts to capture the relationship between 2 things that can be measured numerically. - Tests a hypothesis. - Descriptive or causal relationship. - Results are given in numbers and in terms of statistical significance.
72
Qualitative research - - definition? (2) - data? (2) - sampling? (1) - a couple of types?
- Answers Qs about how a phenomenon occurs. - Greater subjectivity. - Data in words, rather than numbers. - Data include interviews, field notes, photos, video, artifacts. - Sampling usually not randomized. - Includes case studies, policy evaluation.
73
Mixed method research - Definition? 2 pros? 1 con?
- Mix of the qualities of quantitative & qualitative research. - Can strengthen what one method alone can provide. - Results may be more generalizable. - Can be time consuming.
74
Two types of Mixed method research?
- Concurrent design | - Sequential design
75
Mixed method research - Concurrent design - Definition? AKA?
- Qualitative & quantitative data are collected at the same time. - AKA triangulation.
76
Mixed method research - Sequential design - Definition? Two types?
- Either qualitative or quantitative data are collected first. - Exploratory - qualitative first. - Explanatory - quantitative first.
77
Single subject research design? (SSRD) (3)
- Usually quantitative. - Measure how receiving or not receiving txt affects a single subject or a group who can be treated as a single subject. - Often behavioral.
78
Specialized research designs that can be both quantitative or qualitative? (6)
- Descriptive - Longitudinal - Cross-sectional - Survey - Action research - Pilot study
79
Descriptive research? (3) | Example?
- Describes a phenomenon - No intervention - No causal info - Eg buying habits
80
Longitudinal research? Definition? (2) Limitations? (3)
- Repeated assessments over time - Track pattern or development - Limitations: evaluation costs, cohort effects, attrition
81
Cross-sectional research? | Limitations? (2)
- Examines different groups (w similar characteristics) that differ on the variable of interest (eg age) at a particular point in time - Limitation: comparisons can only be inferred since the same individuals are not being studied, so the developmental changes observed may not be real changes - Limitation: different age groups may have cohort differences, eg historical experiences
82
Survey research? Definition? Includes? (6) Caveats? (2)
- Select a sample, administer questions - Includes written, oral, questionnaires, surveys, interviews, or written statements from participants - Surveys are only as good as their design - Capabilities of subjects must be considered
83
Action research - Definition? (2) Example?
- To improve your own practice or organization. - To test new approaches, theories, ideas, teachings - Eg needs assessment
84
Pilot study - Definition? Advantages? (3)
Smaller version of a study used to assess feasibility of larger study Advantages: increase likelihood of success, identify problems, opportunity to revise
85
3 categories of quantitative research design?
- Nonexperimental - Experimental - SSRDs- single subject research designs (may contain qualitative components)
86
Nonexperimental research designs? Category? Definition? (2)
- Quantitative - Exploratory & descriptive, no interventions - Observe & outline the properties of a variable
87
Experimental research designs? Category? All have in common? (3)
- Quantitative - Involve an intervention in which conditions & variables are manipulated - Goal - assess cause & effect relationships - Random assignment is necessary for most experimental designs
88
Single subject research designs? (SSRDs) Category? Definition?
- Primarily quantitative, may contain qualitative components - Measure behavioral &/or attitudinal changes across time for an individual or a few individuals
89
4 types of Nonexperimental research designs?
- Descriptive - Comparative - Correlational - Ex post facto design
90
Descriptive research design? Category & Type of research design? Definition? 2 kinds?
- Quantitative Non experimental research design - Thoroughly describing a variable at one time or over time - Simple descriptive designs & Longitudinal designs
91
Simple descriptive designs - Definition? What type of research design? (2) A special type of simple descriptive design?
- 1 shot surveys of a variable - Quantitative Nonexperimental, descriptive design - A special type of simple descriptive design: cross sectional
92
Cross sectional designs - Category & type of research design? A special type of? Definition? Example?
- Quantitative non experimental, descriptive research design - A special type of simple descriptive design - Involve different groups of participants studied at the same time - eg the degree of financial support given by alumni who graduated 1yr, 5 yrs, 10yrs ago
93
Longitudinal designs? What category & type of research design? 3 kinds?
- Quantitative nonexperimental, descriptive design | - 3 kinds: trend, cohort, panel.
94
Trend study - Category & Type of research design? Definition?
- Quantitative nonexperimental, descriptive, longitudinal design - Involves assessing the general population over time w new individuals sampled each time data are collected
95
Cohort study - Category & Type of research design? Definition? Example?
- Quantitative, nonexperimental, descriptive, longitudinal design - Assessing the same population over time - A cohort sample is a group that experiences some type of event (typically birth) in a selected time. This group may be compared over time to another cohort group or other differing group. This alternate comparison group would not have the same associated event or exposure. These studies compare the lives of the differing groups to draw conclusions. - Example: A group of graduates that are the same age from different colleges with the same degree are studied every 5 years on how they have progressed.
96
Panel study? Category & Type of research design? Definition?
- Quantitative, nonexperimental, descriptive, longitudinal design - Panel studies measure the same sample of respondents at different points in time. Similar to cohort studies. Panel studies are particular to such things as age bands or common experience such as first births. These studies may include a much smaller group and still maintain national representation.
97
Comparative research design - Category & Type of research design? Definition? Example?
- Quantitative nonexperimental design - Allows researcher to say there is a difference between groups, but cannot say it's causative - Eg racial differences in use of MH services
98
Correlational research design - Category & Type of research design? Definition? Possible values?
- Quantitative nonexperimental design - Describes strength & direction of a relationship between 2 variables - r = .5 is a moderate positive relationship - r = 0 is no relationship - r = ±1 is a perfect correlation
99
Coefficient of determination?
Square the correlation coefficient | r=.5 squared is .25 or 25% shared variance between the 2 variables
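The two cards above can be illustrated with a hand-rolled Pearson r (a hedged sketch; the data values are made up):

```python
# Pearson correlation coefficient computed from deviations around the means.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

xs = [1, 2, 3, 4, 5]
r = pearson_r(xs, [2, 4, 6, 8, 10])      # 1.0: perfect positive correlation
r_neg = pearson_r(xs, [10, 8, 6, 4, 2])  # -1.0: perfect negative correlation

# Coefficient of determination: square the correlation.
r2 = 0.5 ** 2   # r = .5 -> .25, ie 25% shared variance
print(r, r_neg, r2)
```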
100
Ex post facto research design? Category & Type of research design? AKA? Definition? Example?
Quantitative nonexperimental design AKA causal-comparative design Data already collected; IV cannot be manipulated; no randomization; Looks at how an IV potentially affected a DV Eg using archival data
101
Experimental research designs - 3 general categories of design?
Within subject Between groups Split plot
102
Within subject design?
Experimental research design Assess changes within participants in a group as they experience an intervention Could be before & after intervention (repeated measures) Could be serial interventions
103
Between groups design?
Experimental research design Looking at effects of intervention between 2+ groups One group is a control
104
Split plot design?
Experimental research design Assess an intervention on a whole group & assess sub-interventions on subgroups Eg a mentoring club's effect on careers, with focus on resumes, interviewing, or shadowing
105
3 degrees of experimental research design?
Pre-experimental True-experimental Quasi-experimental See table 8.3
106
Pre-experimental research designs? | 3 types?
Often don't use control groups; no random assignment One-group posttest-only design One-group pretest-posttest design Nonequivalent groups posttest-only design
107
One-group posttest-only design?
Experimental research designs, Pre-experimental | A group receives an intervention and change is measured
108
One-group pretest-posttest design?
Experimental research designs, Pre-experimental | A group is evaluated before & after an intervention
109
Nonequivalent groups posttest-only design?
Experimental research designs, Pre-experimental No attempt made to use equivalent groups An experimental group & a control group are evaluated after intervention
110
Experimental research designs - True-experiment?
AKA randomized experimental designs | At least 2 groups for comparison & random assignment
111
What differentiates true & quasi experiments?
Random assignment, usually
112
Five types of True-experiment research designs?
Randomized pretest post test control group design Randomized pretest post test comparison group design Randomized post test only control group design Randomized post test only comparison group design Solomon four-group design
113
Randomized pretest post test control group design?
True-experiment research design | 2 groups, 1 is a control
114
Randomized pretest post test comparison group design?
True-experiment research design 2+ groups Each receives a distinct intervention
115
Randomized post test only control group design?
True-experiment research design | treatment & control group
116
Randomized post test only comparison group design?
True-experiment research design | At least 2 groups for comparison & no control group
117
Solomon four-group design?
True-experiment research design - Rigorously assesses the effects of pretest & intervention - 4 groups: - Pretest, intervention, post test - Pretest, post test (no intervention) - Intervention, post test - Post test only
118
Quasi experimental designs?
No random assignment. | Existing, non equivalent groups - nested data (classrooms, counseling groups) or naturally occurring groups
119
2 types of Quasi experimental designs?
Nonequivalent groups pretest posttest control or comparison group designs Time series design
120
Nonequivalent groups pretest posttest control or comparison group designs?
Quasi experimental design Keep intact groups Administer pretest, intervention to 1 group or to at least 2 comparison groups, then give post test
121
Time series design?
Quasi experimental design Repeatedly measure before & after an intervention with 1 group only or using a control group. Eg: O O O X O O O O O O O O O Measures are at equal intervals w same tests
122
SSRDs - single subject research designs - Category of research? Definition? How represented? Example?
- Quantitative research, may contain qualitative components - Repeated measures over time for an individual or group - A = baseline data, B = treatment data, C = a second treatment's data - Eg assess effectiveness of programs
123
3 designs of SSRDs- single subject research designs?
- Within series designs - effectiveness of 1 intervention or program (A-B or A-B-C) - Between series designs - effectiveness of 2+ interventions for a single variable - Multiple baseline designs - assess data for a target behavior across multiple individuals, multiple environments/places, or multiple behaviors
124
SSRDs - Within series designs? | 4 types
- Effectiveness of 1 intervention or program - A-B designs - A-B-C designs - measures interaction among treatment components - Changing criterion designs - criterion for success becomes more restrictive to see how much incentive is needed for maximum performance - Parametric designs - treatments are compared across phases
125
Frequency distribution?
- The number of observations per possible response (or equal sized intervals of responses) for a variable - Rows for each response, columns for freq counts, %, cumulative %
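The frequency-distribution layout described above can be sketched with the standard library (a hedged illustration; the survey responses are invented):

```python
from collections import Counter

# Hypothetical survey responses on a 1-3 scale.
responses = [3, 1, 2, 3, 3, 2, 1, 3]
freq = Counter(responses)
total = len(responses)

# One row per response value: frequency count, %, cumulative %.
cumulative = 0
for value in sorted(freq):
    count = freq[value]
    pct = 100 * count / total
    cumulative += pct
    print(value, count, pct, cumulative)
```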
126
Frequency polygon?
A line graph of the frequency distribution X = possible values of the variable Y = frequency count for each value Data are ordinal, interval or ratio
127
Histogram?
Connected-bar graph showing frequencies of values for a variable Data are ordinal, interval or ratio
128
Bar graphs?
Data are nominal | Separated bars for distinct responses
129
Measures of central tendency?
Mean Median Mode
130
Mean?
Average | An outlier can inflate/deflate the mean
131
Median - How compute? When use?
The middlemost score, 50% of the scores are above, 50% are below. If the number of scores is even, take the average of the 2 middlemost scores Use when outliers are present or data is skewed
132
Mode? | Types?
Most frequently occurring score, not influenced by extreme scores If there are 2 most frequent scores, it's bimodal More than 2, multimodal
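The three measures of central tendency above, and the outlier behavior the cards describe, can be checked with the `statistics` module (a hedged sketch; the scores are made up):

```python
from statistics import mean, median, mode, multimode

# Made-up scores; 100 is an outlier.
scores = [1, 2, 2, 3, 4, 100]
m = mean(scores)      # inflated by the outlier
md = median(scores)   # robust: average of the 2 middlemost scores
mo = mode(scores)     # most frequent score, unaffected by the outlier
bimodal = multimode([1, 1, 2, 2, 3])  # two most frequent scores -> bimodal
print(round(m, 2), md, mo, bimodal)
```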
133
Variability? | 3 types?
How dispersed scores are from a measure of central tendency Range Standard deviation Variance
134
Range?
Largest value - smallest value + 1 place value | Can be affected by outliers
135
Standard deviation - Definition? Percentages under the curve? Person scores at +2 SD is at what percentile?
- How dispersed scores are around the mean; the most frequently used indicator of variability - Mean to 1 SD = 34% of the normal curve; to 2 SD = 34 + 14 = 48%; to 3 SD = 34 + 14 + 2 = 50% - ±1 SD = 68%; ±2 SD = 95%; ±3 SD = 99% - A person scoring at +2 SD is at about the 98th percentile (50 + 34 + 14)
136
Variance?
Standard deviation squared
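The SD/variance relationship and the normal-curve percentages on these cards can be verified with the `statistics` module (a hedged sketch; the score set is invented and population formulas are used):

```python
from statistics import NormalDist, pstdev, pvariance

scores = [2, 4, 4, 4, 5, 5, 7, 9]   # made-up data
sd = pstdev(scores)                 # population standard deviation
var = pvariance(scores)             # variance = sd squared

nd = NormalDist()
within_1 = nd.cdf(1) - nd.cdf(-1)   # ~.68 of the normal curve
within_2 = nd.cdf(2) - nd.cdf(-2)   # ~.95
percentile_at_plus2 = nd.cdf(2)     # ~.977 -> roughly the 98th percentile
print(sd, var, round(within_1, 2), round(within_2, 2))
```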
137
Skewness?
- an asymmetrical distribution
138
For positively (vs negatively) skewed - Where are outliers? Where are most scores? Where is the tail? What is the order of mode, median, mean? Skewness index?
- Positive skew: outliers at the high end - most scores are at the lower end - tail to the right - order from left: mode, median, mean - skewness index +0.01 to +1.00
139
Skewness index looks like?
+ for positive skew, - for negative skew | Not skewed = -1.00 to +1.00
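A rough skewness index matching the sign convention above can be computed directly (a hedged sketch; the formula is one common moment-based variant and the data are made up):

```python
# Moment-based skewness: mean of cubed standardized deviations.
def skewness(xs):
    n = len(xs)
    m = sum(xs) / n
    sd = (sum((x - m) ** 2 for x in xs) / n) ** 0.5
    return sum(((x - m) / sd) ** 3 for x in xs) / n

pos = skewness([1, 2, 2, 3, 10])        # > 0: outlier high, tail to the right
neg = skewness([-10, -3, -2, -2, -1])   # < 0: outlier low, tail to the left
sym = skewness([1, 2, 3])               # 0: symmetric distribution
print(pos > 0, neg < 0, sym == 0)
```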
140
Kurtosis?
- how peaked or flat a curve is
141
3 types of kurtosis?
- Mesokurtic - normal curve, kurtosis = -1 to +1 - Leptokurtic - tall & thin, kurtosis > 1 - Platykurtic - low & wide, kurtosis < -1
142
Inferential statistics - Definition? 2 types?
- Infers conclusions about a population from sample data. Based on the probability of particular differences occurring - Parametric & Nonparametric
143
Statistical assumptions for parametric statistics? (3)
Normal distribution, approximately Randomly selected samples Interval or ratio scales for each variable
144
When use Nonparametric statistics?
When Statistical assumptions aren't met | Can be used even if they are met, as Nonparametric stats are robust
145
Correlation coefficient indicates what 3 things?
Presence of a relationship Direction (+ or -) Strength - the higher the absolute value NOT causation
146
Types of correlation?
- Pearson - for 2 continuous (interval/ratio) variables - Spearman rho - for rank-ordered variables - Point biserial - comparing 1 continuous & 1 true dichotomous variable - Biserial - comparing 1 continuous & 1 artificially dichotomized variable
147
Perfect correlation?
+1.00 or -1.00
148
Spurious correlation?
Another variable is really responsible for the relationship
149
Attenuated correlation?
Measures are unreliable and show a low relationship
150
Restriction of range problem w correlation?
sample isn't representative
151
Coefficient of determination?
The percent of variance shared in correlated variables is the square of the correlation
152
Regression - Definition? 3 types?
- Regression studies are prediction studies and are extensions of correlational studies - Bivariate, Multiple, Logistic regression
153
Bivariate regression?
How well scores from IV (predictor variable) predict scores on DV (criterion variable)
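The predictor-to-criterion idea above can be sketched as a least-squares line (a hedged illustration; the function name and data are invented, and the data lie exactly on y = 2x + 1):

```python
# Least-squares sketch: predict DV/criterion ys from IV/predictor xs.
def bivariate_regression(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

slope, intercept = bivariate_regression([1, 2, 3, 4], [3, 5, 7, 9])
predicted = slope * 5 + intercept   # predicted criterion score for x = 5
print(slope, intercept, predicted)  # 2.0 1.0 11.0
```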
154
Multiple regression?
- More than 1 IV/predictor variable is used to predict DV/criterion variable - Each predictor is weighted to determine its contribution - Generally, the more predictor variables, the stronger the prediction
155
Logistic regression?
- How well scores from 1 or more IVs (predictor variable) predict scores on DV (criterion variable) - DV Is dichotomous
156
Parametric statistics - Used when? List of 6?
- Used when statistical assumptions are met - T-test - ANOVA - Factorial ANOVA - ANCOVA - MANOVA - MANCOVA
157
Nonparametric statistics - Used when? (3) List of 6?
- Used when: only a few assumptions can be made about the distribution of scores; data is nominal or ordinal; interval or ratio data is skewed - Chi-square - Mann-Whitney U - Kolmogorov-Smirnov Z - Kruskal-Wallis - Wilcoxon's signed ranks - Friedman's ranks
158
T-test - Para/Nonparametric? Definition? 2 kinds? Score looks like?
- Parametric - Compares 2 means for 1 DV - Independent t-test - 2 separate groups compared on 1 DV - Dependent t-test - repeated measures w same group or paired-subject groups - Score is a t ratio
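The independent t ratio described above can be computed by hand (a hedged sketch using the pooled-variance formula; the two groups are made up):

```python
from statistics import mean, variance

# Pooled-variance independent t ratio comparing 2 group means on 1 DV.
def independent_t(a, b):
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5
    return (mean(a) - mean(b)) / se

t = independent_t([1, 2, 3], [4, 5, 6])   # two separate groups, one DV
print(round(t, 2))  # -3.67
```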
159
ANOVA - Para/Nonparametric? Definition? Example? Score looks like? Post hoc analysis?
- Parametric - At least 1 IV with 3 or more groups/levels - Eg income with 3 ranges defining 3 levels - Score is an F ratio - post hoc analysis allows examination of every possible pairing of groups after finding main effects
160
Factorial ANOVA? Para/Nonparametric? Definition?
- Parametric - More than 1 IV; not trying to control statistically for a covariate - Yields both main effects & interaction effects, using post hoc analysis
161
Analysis of covariance, ANCOVA - Para/Nonparametric? Definition? Example?
- Parametric - An IV that is a covariate must be statistically controlled for in order to look at the relationship of other IVs and the DV. - Eg removing the effects of gender while looking at the relationship between income & work satisfaction
162
Multiple Analysis of variance, MANOVA - Para/Nonparametric? Definition?
- Parametric | - Similar to ANOVA, but w multiple dependent variables
163
Multiple Analysis of covariance, MANCOVA - Para/Nonparametric? Definition?
- Parametric | - Similar to ANCOVA, but w multiple dependent variables
164
Chi-square - Para/Nonparametric? Definition? Example?
- Nonparametric - Used w 2+ categorical or nominal variables, each containing 2+ categories - All scores must be independent: the same person cannot be in multiple categories of the same variable - Observed frequencies are compared to expected frequencies - Eg decision to quit counseling (Y/N) by type of counseling (CBT, Rogerian)
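The observed-vs-expected comparison above reduces to a simple sum (a hedged sketch; the flattened 2x2 counts are invented):

```python
# Chi-square statistic: sum of (O - E)^2 / E across all cells.
def chi_square_stat(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical 2x2 table flattened: quit (Y/N) x counseling type (CBT, Rogerian).
observed = [30, 20, 20, 30]
expected = [25, 25, 25, 25]   # frequencies expected if no relationship
stat = chi_square_stat(observed, expected)
print(stat)  # 4.0
```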
165
Mann-Whitney U - Para/Nonparametric? Definition? Example?
- Nonparametric - analogous to T-test; uses ordinal data; compares ranks from 2 groups - eg students in grades 9-12 (IV) with educational aspirations (HS, BA, MA) as DV.
166
Kolmogorov-Smirnov Z - Para/Nonparametric? Definition?
- Nonparametric | - analogous to t-test and U test, but for N less than 25.
167
Kruskal-Wallis - Para/Nonparametric? Definition?
- Nonparametric | - analogous to ANOVA; IV has 3+ groups/levels
168
Wilcoxon's signed ranks - Para/Nonparametric? Definition? Example?
- Nonparametric - analogous to dependent t-test - eg to assess changes in perceived level of competence before & after a training
169
Friedman's rank test - Para/Nonparametric? Definition?
- Nonparametric - analogous to the dependent t-test, but may be used w 2+ related comparison conditions (w 3+, analogous to repeated-measures ANOVA)
170
Factor analysis? | Factors?
Reduces a larger number of variables to a smaller number of groups or factors. Factors explain covariation among variables; each factor explains a percentage of variance. Example: checking construct validity in a test which has several items that are supposed to be about the same construct.
171
2 forms of factor analysis?
Exploratory factor analysis, EFA | Confirmatory factor analysis, CFA
172
Exploratory factor analysis, EFA The 2 steps? The 3 types within the 1st step? The 2 types within the 2nd step?
- Extraction of factors - grouping related variables into factors - principal axis factoring - principal components analysis - maximum likelihood method - Factor rotation & interpretation of those factors - cleaning them up - orthogonal - factors are uncorrelated - oblique - factors are correlated
173
Confirmatory factor analysis, CFA?
Confirming the EFA results. Most common method is the maximum likelihood method
174
Confirmatory factor analysis - fit index?
After attaining a factor solution, one tests how the overall model fits the data.
175
Meta-analysis?
Used to combine & synthesize the results of numerous similar studies for a particular outcome or DV.
176
Steps in Meta-analysis? (5)
- Establish criteria based on operational definitions - Locate studies based on criteria - Consider possible IVs - Calculate an effect size on any outcome variable in the study. The DV in a meta-analysis is the effect size of the outcome. - Effect sizes are grouped according to IV of interest & compared & combined across studies
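The combining step above can be sketched with a sample-size-weighted average of effect sizes. This is a simplification - real meta-analyses usually weight by inverse variance - and all numbers are invented:

```python
# Hypothetical data: simplified meta-analytic combination of effect sizes,
# weighting each study's effect size by its sample size.
def combined_effect(effect_sizes, sample_sizes):
    """Sample-size-weighted average effect size across studies."""
    total_n = sum(sample_sizes)
    return sum(d * n for d, n in zip(effect_sizes, sample_sizes)) / total_n

studies_d = [0.4, 0.6, 0.5]   # invented effect sizes from 3 similar studies
studies_n = [50, 100, 50]     # invented sample sizes
overall = combined_effect(studies_d, studies_n)  # 0.525 for these data
```

Larger studies pull the combined estimate toward their own effect size, which is the point of weighting.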
177
Effect size?
- A measure of the strength of the relationship between 2 variables in a population. - An effect size expresses the increase or decrease in achievement of an experimental group (of students) in standard deviation units. If an effect size for a study is 1, the average score of students in the experimental group is 1 standard deviation higher than the average score of the control group.
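The standardized mean difference described above (often called Cohen's d) divides the group-mean difference by a pooled standard deviation. A minimal sketch with invented achievement scores:

```python
# Hypothetical data: effect size as a standardized mean difference.
from statistics import mean, stdev

def cohens_d(experimental, control):
    """Mean difference in pooled-standard-deviation units."""
    n_e, n_c = len(experimental), len(control)
    pooled_sd = (((n_e - 1) * stdev(experimental) ** 2 +
                  (n_c - 1) * stdev(control) ** 2) / (n_e + n_c - 2)) ** 0.5
    return (mean(experimental) - mean(control)) / pooled_sd

experimental = [12, 14, 16, 18]  # invented achievement scores
control = [8, 10, 12, 14]        # invented achievement scores
d = cohens_d(experimental, control)  # roughly 1.55 for these data
```

With d = 1, the experimental group's average sits exactly 1 pooled standard deviation above the control group's average.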
178
Qualitative research design - definition?
Involves the study of processes, participants' meaning of phenomena, or both, usually in a natural setting
179
Qualitative research design - possible characteristics? (15)
- Process or sequence of a phenomenon - Evolving theory - Thick description - Effect of researchers' participation - Maximize validity - Participants' narratives - Purposive sampling - In depth and detail - Contextual - Discovery-oriented - Creation of meaning - Fieldwork - Inductive - Document analysis - Reflexivity
180
Qualitative research - 7 major research traditions?
``` Case study Phenomenology Grounded theory Consensual qualitative research, CQR Ethnography Biography Participatory action research ```
181
``` Case study - Type of research? Participants roles? What is a case? Example? ```
- More often qualitative, but can be quantitative - Participants are active in data collection - A case is a distinct system of an event, process, setting, or individual or small group of individuals - Eg implementation of No Child Left Behind laws, or exploration of a counseling process.
182
Phenomenology - Type of research? Definition? Try to assess?
- Qualitative - Used to discover the meaning or essence of participants' lived experiences - Assesses participants' intentionality (ie the internal experience of being conscious of something).
183
Grounded theory? Type of research? Definition? (3)
- Qualitative - generate theory grounded in data from participants' perspectives - Inductive - Theories often explain a process or action
184
Consensual qualitative research, CQR? Type of research? Definition? (4)
- Qualitative - Combines phenomenology & grounded theory - selects participants who are very knowledgeable about a topic, remaining close to data, without major interpretation, with hope of generalizing to larger population - researchers often reflect on their own experiences when developing interview Qs - consensus is key - shared power
185
Ethnography - Type of research? Definition? (2) Example?
- Qualitative - researcher describes & interprets a culture - process & experience of culture, socialization process - participant observation is used - eg studying a local community re its methods for addressing MH concerns
186
Biography - Type of research? Definition? (3) Methods?
- Qualitative - identify personal meanings individuals give to their social experience - gathers stories, explore meanings, examine fit w broader social/historical context - methods: life history, oral history
187
Participatory action research - Type of research? Definition? (3) Example?
- Qualitative - Focuses on change of the participants & researcher as a result of qualitative inquiry - goals are emancipation & transformation - researchers critically reflect on power of research as a change agent - Eg working w a community agency & its clients to improve the agency
188
Purposive/purposeful sampling?
- Obtain info-rich cases for maximum depth & detail | - seek sample sizes that reach saturation - no new data emerge
189
Purposive/purposeful sampling - 15 types?
``` Convenience Maximum variation Homogeneous Stratified purposeful Purposeful random Comprehensive Typical case Intense case Critical case Extreme/deviant case Snowball/chain/network Criterion Opportunistic/emergent Theoretical Confirming/disconfirming ```
190
Convenience? | Type of sampling?
- Based on availability - Least desirable/trustworthy - Purposive/purposeful sampling
191
Maximum variation | Type of sampling?
- Purposive/purposeful sampling | - eg teachers of diverse backgrounds from diverse types of HSs, w training in various grade levels & forms of math
192
Homogeneous? | Type of sampling?
- Purposive/purposeful sampling | - selecting participants w theoretically similar experiences
193
Stratified purposeful? Type of sampling? AKA?
- Eg 6 samples of teachers for each type of math course offered in HS - Purposive/purposeful sampling - AKA samples within samples
194
Purposeful random? | Type of sampling?
- Identifying a sample & randomly selecting participants from it. - Purposive/purposeful sampling
195
Comprehensive? Example? Useful when? Type of sampling?
- All teachers at a particular school. - Useful when a case has few participants. - Purposive/purposeful sampling
196
Typical case? | Type of sampling?
- Selecting the average participant w typical experience | - Purposive/purposeful sampling
197
Intense case? Example? Type of sampling?
- Identifying those w intense but not extreme experience - Teachers of advanced math courses - Purposive/purposeful sampling
198
Critical case? | Type of sampling?
- Sampling those w intense or irregular experience | - Purposive/purposeful sampling
199
Extreme or deviant? | Type of sampling?
- Looking for the boundaries of difference. May look at poles of experience, or just 1 pole - Purposive/purposeful sampling
200
Snowball/chain/network ? Used when? Type of sampling?
- Participants are found by obtaining recommendations of earlier participants - Used when sample is difficult to obtain - Purposive/purposeful sampling
201
Criterion? | Type of sampling?
- Selecting cases that meet criteria | - Purposive/purposeful sampling
202
Opportunistic/emergent? | Type of sampling?
- Changing one's research design to include a particular individual - Purposive/purposeful sampling
203
Theoretical? | Type of sampling?
- As theory evolves, sampling those who best contribute info | - Purposive/purposeful sampling
204
Confirming/disconfirming case? | Type of sampling?
- Including cases that add depth or provide exceptions | - Purposive/purposeful sampling
205
Qualitative data collection methods - Best practice? The 3 methods?
- Use multiple methods - Interviews - Observations - Unobtrusive methods
206
Qualitative data collection - interviews - 3 types of structure? 2 types of subjects?
- Unstructured - no preset Qs - Semistructured - flexibility to add/delete Qs - Structured - standardized - Individuals - sensitive topics - Focus groups of 6-12 - get social interaction
207
Qualitative data collection - observational methods - Gather? (1) Reflect on? (1) Use? (4)
- Gather description of setting/context - Reflect on content & process - Use: - fieldwork - memoing - rubrics - participant observer is most common, but zero to full interaction is possible
208
Qualitative data collection - Unobtrusive methods - Interactions? Includes?
- usually don't interact w participants | - photos, videos, docs, archival data, artifacts
209
Qualitative data management- contact summary sheet?
A single-page snapshot of a specific contact
210
Qualitative data management - document summary form?
Similar to contact summary sheet, but for unobtrusive data sources eg letters, photos
211
Qualitative data management - data display?
Presents organized data in a table or figure
212
Qualitative data analysis - inductive analysis?
Involves searching for keywords & themes without preconceived notions about theories. The data allow notions of a phenomenon to emerge.
213
Qualitative data analysis - steps? (6)
- write memos throughout research - write initial summary - organize & segment the text - code the data - search for patterns to address research Qs - decide on main themes, describe, discuss
214
Qualitative research- trustworthiness - Definition? 4 components?
- The validity or truthfulness of findings - credibility - accurate? - transferability - to other contexts - Dependability - consistency over time, across researchers - confirmability - biases & assumptions controlled?
215
Strategies that maximize trustworthiness of qualitative data? (10)
- Prolonged engagement - Persistent observation - depth - Triangulation - multiple sources of data - Peer debriefing - check w peers outside the study - Member checking - consult participants to verify findings - Negative case analysis - inconsistencies that might refute findings - Referential adequacy - check against data collected at various times during the study - Thick description - describe collection & analysis in detail - Auditing - unbiased outsider to review study - Reflexive journal - memos done during study to reduce researcher bias
216
Program evaluation - Definition? Purpose? Due to?
- Assessment of a program, at any stage - To improve quality - Often due to the recent emphasis on evidence-based treatment, accountability, & external funding
217
Questions that program evaluation addresses? (8)
- Is a program needed - For whom and for how long - Was the program implemented as planned - Were resources properly used - What are the program outcomes - Which programs have the best outcomes - Are benefits maintained over time - Do benefits outweigh costs
218
How does program evaluation differ from research?
- Program evaluation usually leads to narrower applicability/generalizability - Program evaluation is usually diffuse, done w individuals in different roles
219
4 types of program evaluation?
Needs assessment Process evaluation - ensure activities match plans Outcome evaluation - how participants are performing Efficiency analysis - do gains outweigh costs
220
Program evaluation - accountability?
Providing feedback to stakeholders
221
Program evaluation - stakeholders?
Any individuals involved in or affected by the program | Eg family, administrators, clients, community leaders, funding agencies, schools
222
Program evaluation - formative evaluations?
Evaluation throughout implementation to ensure the program is done as planned, with changes as needed based on stakeholder feedback
223
Program evaluation - summative evaluation?
Assessment of the whole program and the degree to which it meets goals & objectives
224
General steps in program evaluation? (9)
- Identify the program to be evaluated - Plan the evaluation - decide research design - Conduct needs assessment & make recommendations - Define what "success" is - short & long term - Select data sources - use multiple - Monitor & evaluate program progress - Determine the degree to which the program is successful - Analyze the program's efficiency - Continue, revise, or stop the program
225
Program evaluation - Needs assessment? (7)
- Look at similar programs; lit review - Understand needs of the client population - Develop/revise program goals & objectives - Establish advisory committee of stakeholders - Work out details - target pop, stakeholders - Develop program objectives - Write executive summary - including data sources, data analyses, findings, & recommendations
226
ABCD model of developing program objectives?
``` Audience Behavior Conditions Degree Eg, the client will decrease substance use by 2 beers/week as reported by family ```
227
Process evaluation - AKA? Consists of (3) Used by?
``` Program monitoring Evaluate progress at various points to ensure: - implemented as planned - met expected outcomes - methods were the best available Often used by government social programs ```
228
Outcome evaluation - Definition? 3 aspects that can be evaluated?
Measure the effectiveness of the program at the end, usually by determining 1 of 3 aspects: - more effective than no intervention - more effective than another program - the degree to which it was more effective than another program
229
Efficiency analysis - AKA? Example? Caveat?
Cost-benefit analysis Eg, does a more expensive treatment lead to a better outcome? Costs aren't necessarily monetary, & cost-benefit may not be the best evaluation method
230
Program evaluation models/strategies? (11)
``` Treatment package strategy or social science research model Comparative outcome strategy Dismantling strategy Constructive strategy Parametric strategy Common factors control group strategy Moderation design strategy Objectives-based evaluation model Expert opinion model Success case method Improvement focused approach ```
231
Program evaluation - Treatment package strategy or social science research model? Example?
Control and treatment groups are compared | Eg, a prevention program is done in 1 community, and the incidence of bullying is compared across the 2 communities
232
Program evaluation - Comparative outcome strategy? | Example?
2+ programs or interventions are compared | Eg outcomes for a bullying prevention program are compared to those for a program in another community
233
Program evaluation - Dismantling strategy? | Example?
Components of a program are evaluated to determine which parts are effective Eg kids, parents, teachers, counsellors are interviewed about several aspects of a bullying program eg PSAs, workshops, support groups
234
Program evaluation - Constructive strategy? | Example?
A new component is added to an already effective program and evaluated for added value Eg adding a required group for perpetrators of bullying to see if number of victims will decrease even more
235
Program evaluation - Parametric strategy? | Example?
A program is evaluated at different stages to determine the best time to evaluate it Eg evaluate weekly, monthly, every 6 months to see which is best for future program evaluations
236
Program evaluation - Common factors control group strategy? | Example?
Determine whether a specific component or common factors of a program results in its effectiveness Eg kids are surveyed about services to see which were most beneficial to them
237
Program evaluation - moderation design strategy? | Example?
Participants and other stakeholders are assessed to see who might benefit most from a program Eg interview various kids and determine the degree to which they perceived the program to be effective for them
238
Program evaluation - objectives based evaluation model? | Example?
Determine if goals and objectives were met | Eg compare the number of times a kid is victimized to the reduction objective
239
Program evaluation - Expert opinion model? | Example?
An outside neutral expert examines the program process and outcome Eg experts in bullying review a program and determine if it should get future funding
240
Program evaluation - success case method? | Example?
Info is sought from individuals who benefited most from the program Eg observe counsellors who seem to be intervening in bullying reports most effectively
241
Program evaluation - improvement focused approach? | Example?
Ineffective program components are reviewed to figure out what went wrong Eg interview a perpetrator after bullying incident to understand why the program is not preventing the behavior