SCD data collection, data analysis, and designs Flashcards

1
Q

Importance of detailed behavioral definitions and technological descriptions of behavior-change programs

A

Baer, Wolf, & Risley (1968)

2
Q

Alternating-treatments design – Different interventions are alternated during the intervention phases

A

Ulman & Sulzer-Azaroff (1975)

3
Q

Simultaneous-treatment design – Different conditions are administered in the same phase, usually on the same day

A

Hersen & Barlow (1976)

4
Q

SCR has been used to document interventions that are functionally related to change in socially important outcomes

A

Wolf (1978)

5
Q

Six features of SCR visual analysis: (1) level, (2) trend, (3) variability, (4) immediacy of effect, (5) overlap, and (6) consistency of data patterns across similar phases

A

Parsonson & Baer (1978)

6
Q

Four types of experimental validity – Internal, external, construct, and statistical conclusion

A

Cook & Campbell (1979)

7
Q

Adapted ATD – Compare instructional practices with non-reversible behaviors (functional, developmental, academic)

A

Sindelar, Rosenberg, & Wilson (1985)

8
Q
  • Documentation of a functional relationship requires a compelling demonstration of an effect
  • Compromised when: there is a long latency between manipulation of the independent variable and change in the dependent variable, mean changes across conditions are small, or trends do not conform to those predicted
A

Parsonson & Baer (1992)

9
Q
  • SCR employs within- and between-subjects comparisons to control for major threats to internal validity
  • Requires systematic replication to enhance external validity
A

Martella, Nelson, & Marchand-Martella (1999)

10
Q
  • SCR is experimental and its purpose is to document functional relationships between independent and dependent variables
  • Involves replication of the intervention in the experiment – Introduction and withdrawal, iterative manipulation, staggered introduction
  • SPED is a field that emphasizes the individual student as the unit of concern; active intervention (responders and nonresponders); practical procedures for behavioral intervention and experimental effects in educational conditions; testing of conceptual theory; and cost-effectiveness (a problem-solving discipline)
  • Individual participant is the unit of analysis – Each participant serves as their own control; performance prior to intervention is compared with performance during and/or after intervention
  • Visual analysis examples (a minimal computational sketch follows this card): level – mean performance during a condition; trend – rate of increase or decrease of the best-fit straight line for the dependent variable within a condition; variability – degree to which performance fluctuates around a mean or slope during a phase; immediacy of effects; overlap; magnitude of changes; consistency
A

Horner et al. (2005)
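
A minimal Python sketch of the three within-phase quantities defined above, using a least-squares slope for trend and the sample standard deviation for variability as illustrative stand-ins; the function name and phase data are hypothetical, not drawn from Horner et al.:

```python
import numpy as np

def phase_summary(scores):
    """Summarize one phase of single-case data.

    level       - mean performance during the phase
    trend       - slope of the ordinary least-squares (best-fit) line
    variability - sample standard deviation around the phase mean
    """
    y = np.asarray(scores, dtype=float)
    x = np.arange(len(y))                 # session index within the phase
    slope, _intercept = np.polyfit(x, y, 1)
    return {"level": y.mean(), "trend": slope, "variability": y.std(ddof=1)}

# Hypothetical baseline and intervention phases for one participant
baseline = [2, 3, 2, 4, 3]
intervention = [6, 7, 9, 8, 10]
print(phase_summary(baseline))
print(phase_summary(intervention))
```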

11
Q

Knowing precisely why and how change occurs can be important for maximizing the impact of the intervention and extending the intervention to other settings

A

Kazdin (2007)

12
Q

Regression-based estimators are probably best justified for estimating effect size with sensitivity analyses

A

Shadish, Rindskopf, & Hedges (2008)
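
For orientation only, a bare-bones sketch of what a regression-based estimate of level change can look like for a single hypothetical AB series: the outcome is regressed on an intercept, a time trend, and a phase indicator, and the phase coefficient is scaled by the residual standard deviation. This toy model is an assumption for illustration and is far simpler than the estimators discussed in this literature (which must contend with, e.g., autocorrelation and aggregation across cases).

```python
import numpy as np

# One participant's AB data: baseline (phase = 0) then intervention (phase = 1).
# The series below is hypothetical.
y = np.array([3, 2, 4, 3, 3, 7, 8, 9, 8, 10], dtype=float)
phase = np.array([0] * 5 + [1] * 5, dtype=float)
time = np.arange(len(y), dtype=float)

# Design matrix columns: intercept, within-series time trend, phase indicator
X = np.column_stack([np.ones_like(y), time, phase])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

residuals = y - X @ coef
sd_resid = residuals.std(ddof=X.shape[1])

print("estimated level change:", coef[2])
print("standardized (change / residual SD):", coef[2] / sd_resid)
```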

13
Q

Proposed that statistical analysis may add to confidence in the visual analysis of data, quantify the strength of outcomes, and increase the objectivity of analysis

A

Campbell & Herzinger (2010)

14
Q

Responsibility of researchers to report fidelity of implementation of each step of a behavior-change program by condition

A

Gast (2010)

15
Q

• Visual analysis – Four steps:
• 1. Documenting a predictable baseline pattern
• 2. Examining data within each phase of the study to assess within-phase patterns
• 3. Comparing data from each phase with data in the adjacent phase to determine whether there is an effect
• 4. Integrating all information from the phases of the study to determine whether there are at least three demonstrations of an effect at different points in time
• Immediacy of effect – Change in level between the last three data points in one phase and the first three data points of the next (see the sketch after this card)
• Overlap – Proportion of data from one phase that overlaps with data from the previous phase
• Consistency of data in similar phases – Examining data from all phases within the same condition to determine the extent to which data patterns are consistent across phases with the same conditions
• No agreed-upon methods or standards for effect-size estimation

A

Kratochwill et al. (2010)
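
A minimal Python sketch of the two quantitative features defined above. The three-point comparison for immediacy follows the card's definition; treating overlap as the proportion of the later phase's data points that fall within the range of the previous phase is one common reading, adopted here only for illustration, and the phase data are hypothetical.

```python
import numpy as np

def immediacy_of_effect(phase_a, phase_b):
    """Change in level between the last three data points of one phase
    and the first three data points of the next."""
    return float(np.mean(phase_b[:3]) - np.mean(phase_a[-3:]))

def overlap(phase_a, phase_b):
    """Proportion of phase_b data points that fall within the range
    (min to max) of the phase_a data points."""
    lo, hi = min(phase_a), max(phase_a)
    inside = sum(lo <= y <= hi for y in phase_b)
    return inside / len(phase_b)

baseline = [2, 3, 2, 4, 3]          # hypothetical A-phase data
intervention = [6, 7, 9, 8, 10]     # hypothetical B-phase data
print("immediacy:", immediacy_of_effect(baseline, intervention))
print("overlap:", overlap(baseline, intervention))
```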

16
Q
  • Threats to validity – Methodological issues likely to rival the explanation that the intervention produced the effect
  • Internal validity – Extent to which an experiment rules out alternative explanations of the results
  • Threats to internal validity – Factors or influences other than the independent variable that could explain the results: history (any event occurring at the time of the experiment that could influence results), maturation, instrumentation, testing (effects of repeated assessment), statistical regression, diffusion of treatment
  • External validity – Extent to which the results of an experiment can be generalized or extended beyond the conditions of the experiment
  • Threats to external validity – Characteristics of the experiment that may limit the generality of the results: Generality across subjects, responses or measures, settings, time, behavior-change agents; reactive experimental arrangements, multiple-treatment interference
  • Construct validity – Explanation of the causal relation between the intervention and outcome (Is the reason for the relation between the intervention and behavior change due to the construct given by the investigator?)
  • Threats to construct validity – Attention and contact accorded the client; special stimulus conditions, settings, and contexts
  • Data-evaluation validity – Aspects of the data that can interfere with drawing valid inferences
  • Threats to data-evaluation validity – Excessive variability, unreliability of measures, trends, insufficient data, mixed data patterns
  • Reliability – Consistency of measure or measurement procedure
  • Interrater reliability – Extent to which different assessors, raters, or observers agree on the scores they provide when assessing, coding, or classifying subjects’ performance (percent agreement, Pearson product-moment correlations, kappa; a sketch of two of these follows this card)
  • General requirements – Continuous assessment, baseline assessment, stability in performance (trend in the data, variability in the data)
A

Kazdin (1982, 2011)
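
A minimal Python sketch of two of the interrater-agreement indices named above, percent agreement and Cohen's kappa, applied to two observers' hypothetical interval-by-interval records; the function names are illustrative.

```python
from collections import Counter

def percent_agreement(obs1, obs2):
    """Percentage of intervals scored identically by two observers."""
    agree = sum(a == b for a, b in zip(obs1, obs2))
    return 100.0 * agree / len(obs1)

def cohens_kappa(obs1, obs2):
    """Agreement corrected for chance: (po - pe) / (1 - pe)."""
    n = len(obs1)
    po = sum(a == b for a, b in zip(obs1, obs2)) / n
    c1, c2 = Counter(obs1), Counter(obs2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical interval-by-interval records (1 = behavior occurred)
observer_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
observer_2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print("percent agreement:", percent_agreement(observer_1, observer_2))
print("kappa:", cohens_kappa(observer_1, observer_2))
```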

17
Q
  • Graphically represented to analyze trend, level, and stability
  • All conditions remain constant with the exception of the introduction of one variable in the intervention condition
  • Researchers strive for IOA between 80% and 100%
  • Implementation fidelity gives credence to interpretations of data using visual analysis
  • Primary goal of visual analysis is to identify whether a functional relation exists between the introduction of an intervention and change in a socially desirable behavior, as well as to replicate effects across multiple participants
  • Allows researchers to analyze each participant’s behavior through repeated measurement and evaluation and to observe abrupt and subtle changes over time
  • Generality – Provide detailed descriptions of participants’ preintervention behaviors to increase the likelihood of understanding for whom and under what conditions interventions may be effective
  • Agreement on the results of visual analysis
A

Lane & Gast (2013)

18
Q

Visual analysis currently provides the best option to analyze effects of single-subject studies

A

Cook et al. (2014)

19
Q

Three demonstrations of an effect are the minimum required for a design to be considered experimental in nature (causal attributions can be made and functional relations can be demonstrated)

A

Gast, Ledford, & Severini (2018)

20
Q

Multiple-baseline (MB) and multiple-probe (MP) designs differ in one way – the frequency with which pre-intervention data are collected; MB designs require a plan for continuous measurement of all targets prior to intervention, whereas MP designs collect data intermittently prior to the introduction of the intervention

A

Gast, Lloyd, & Ledford (2018)