Super Hard Stuff (Continued) Flashcards
A rule that describes a contingency of reinforcement.
The environmental arrangements that determine conditions by which behaviors will be reinforced.
Continuous Reinforcement and Extinction are the bookends of all:
Schedules Of Reinforcement
This is between CRF and EXT
Some, but not all, occurrences of the behavior are reinforced.
Used for maintaining behaviors that have already been established.
Helps to fade from artificial to natural reinforcement.
Intermittent Schedules of Reinforcement (INT)
Provides reinforcement for every occurrence of the target behavior.
Used to strengthen novel behaviors when teaching of a new skill is first initiated.
Continuous Reinforcement (CRF)
One of the basic schedules of intermittent reinforcement.
Constant, set criterion (does not change at all); a set number of responses must occur before one response produces reinforcement.
The only graph with STEPS
Fixed Ratio Schedule
4 Basic Schedules of Intermittent (INT) Reinforcement:
Hint: FVFV
- Fixed Ratio
- Variable Ratio
- Fixed Interval
- Variable Interval
The strongest basic schedule of INT reinforcement.
Has a changing, variable criterion (an AVERAGE, or mean, number of responses); a number of responses must occur before one response produces reinforcement.
This graph has a super steep line
Variable Ratio Schedule
Pattern of Responding: The individual completes the required responses with little hesitation. A POST-REINFORCEMENT PAUSE follows reinforcement. Large ratios = long pauses; small ratios = short pauses.
Rate of Responding: Produces high rates of responses; the higher the ratio requirement, the higher the rate of response.
Fixed Ratio (FR)
Ratio means:
Amount
Interval means:
Time
Fixed means:
Consistent; constant
Variable means:
Variation; change
Pattern of responding: consistent, steady responding. Does NOT produce a post-reinforcement pause, because of the variability.
Rate of responding: fast rate of responding; the larger the ratio requirement, the faster the rate of response.
Variable Ratio (VR)
A car salesman has to sell 6 cars to receive a bonus at work. Every time he sells 6 cars, a bonus is added to his check.
This is an example of:
Fixed Ratio
Slot machines and the lottery are examples of:
Variable Ratio (VR)
One of the basic schedules of INT reinforcement with a constant, set criterion (does not change at all); a specific amount of time must elapse before a single correct response produces reinforcement.
The graph has a scallop shape (like fish scales).
Fixed Interval (FI)
Pattern of Responding Produced:
Constant, stable rate of response.
Few hesitations between responses.
Rate of Response Produced:
Low-to-moderate rate of response.
The larger the average interval, the lower the overall rate of response.
Variable Interval (VI)
One of the basic schedules of INT reinforcement.
Has changing criteria; an average amount of time elapses before a single response produces reinforcement.
Ex. Pop Quizzes
Variable Interval
Pattern of Responding:
Post-reinforcement pause, but only during the early part of the interval; responding accelerates toward the end of the interval (producing the scallop pattern).
Rate of Responding:
Slow-to-moderate
The larger the fixed interval requirement, the longer the post-reinforcement pause.
Fixed Interval
A car salesman receives a paycheck every two weeks as long as he sells at least one car.
Receiving his paycheck is on what type of schedule:
Fixed Interval (FI)
Gradually increasing the response ratio or the duration of the time interval.
AKA: Schedule Thinning
Ex. CRF to an FR 2 or VR 3
Thinning Intermittent Reinforcement
A result of abrupt increases in the ratio requirement when moving from denser (i.e., a lot of reinforcement available) to thinner (i.e., less reinforcement available) reinforcement schedules.
Common behavioral characteristics are avoidance, aggression, etc.
Ratio Strain
A variation on basic INT schedules of reinforcement.
Systematically thins each successive reinforcement opportunity independent of the participant’s behavior.
Can be used as a procedure for identifying reinforcers that will maintain treatment effects across increasing schedule requirements.
Progressive Schedules Of Reinforcement
May be used to measure what is commonly referred to as the strength, potency, or effectiveness of scheduled reinforcers.
This schedule shows a direct relation between how hard an organism will work for access to a reinforcer, as indexed by the largest completed ratio (the break point), and the potency of the reinforcer.
Progressive Schedules Of Reinforcement
A schedule of reinforcement that provides reinforcement only if the behavior occurs after a specified period of time has elapsed during which it did not occur (i.e., since the last response).
Helps to decrease behavior that the individual displays too frequently, but NOT TO ELIMINATE IT ENTIRELY.
Works by increasing the interresponse time (IRT) to lower the overall rate of responding.
Differential Reinforcement Of Lower Rates Of Responding (DRL)