Chapter 7: Schedules and Theories of Reinforcement Flashcards
Schedule of Reinforcement
The response requirement that must be met to obtain reinforcement.
Continuous Reinforcement Schedule (or CRF)
Each specified response is reinforced.
Which reinforcement schedule is helpful when a behavior is first being shaped?
Continuous or CRF
Intermittent (or Partial) Reinforcement Schedule
Only some responses are reinforced.
How many types of Intermittent (or Partial) reinforcement schedules are there? What are the different types?
- Fixed Ratio
- Variable Ratio
- Fixed Interval
- Variable Interval
Steady-State Behaviors
Stable response patterns that emerge once the organism has had considerable exposure to the schedule.
Schedule Effects
The different effects on behavior produced by different response requirements.
Fixed Ratio (FR) Schedule
Reinforcement is contingent upon a fixed, predictable number of responses.
FR1 is the same as…
Continuous Reinforcement
Fixed ratio schedules typically produce a (high/low) rate of response along with a (long/short) pause following the attainment of each reinforcer.
high
short
Short pause is also known as…
Post-reinforcement Pause
Why are FR Schedules sometimes referred to as “Break-and-Run”?
Because after each reinforcer is obtained, the organism takes a break (the post-reinforcement pause) and then runs off the next ratio of responses at a high, steady rate.
In Fixed Ratio schedules, _____ ratio requirements result in ____ breaks.
high
long
An easy FR schedule can be defined as…
dense or rich
A difficult FR schedule can be defined as…
lean
Stretching the Ratio
Moving from a low ratio requirement (dense) to a high ratio requirement (lean).
Should “stretching the ratio” be done gradually or quickly? Why?
Gradually; if the change is made too quickly, the behavior may become erratic or cease altogether.
Ratio Strain
A disruption in responding due to an overly demanding response requirement.
Ratio strain is more commonly known as…
burnout
Variable Ratio (VR) Schedule
Reinforcement is contingent upon a varying unpredictable number of responses.
FR# means
The number (#) of responses needed to obtain reinforcement.
VR# means
The average number (#) of responses needed to obtain reinforcement.
Variable ratio schedules typically produce a (low/high) and (steady/disrupted) rate of response, often with (little or no/long) post-reinforcement pauses.
high
steady
little or no
What makes gamblers a perfect example for variable ratio schedules?
Although gamblers lose significant amounts of money overall, their behavior is reinforced by intermittent, unpredictable winnings.
What makes VR have a high rate of behavior?
Its unpredictability.
As with an FR schedule, an extremely lean VR schedule can result in _______ ________.
ratio strain
Fixed Interval (FI) Schedule
Reinforcement is contingent upon the first response after a fixed, predictable period of time.
FI# means
The specific amount of time (#) needed to pass before behavior is reinforced.
Fixed interval schedules often produce a _________ pattern of responding, consisting of a post-reinforcement pause followed by a gradually (increasing/decreasing) rate of response as the interval draws to a close.
“scalloped”
increasing
Variable Interval (VI) Schedules
Reinforcement is contingent upon the first response after a varying, unpredictable period of time.
VI# means
The average amount of time (#) needed to pass before behavior is reinforced.
Variable interval schedules usually produce a (low/moderate/high), steady rate of response, often with little or no post-reinforcement pause.
moderate
(Ratio/Interval) schedules produce higher rates of response because the schedule is entirely ______ contingent.
Ratio
response
(Fixed/Variable) schedules produce little to no post-reinforcement pause because such schedules often provide the possibility of relatively _______ reinforcement.
Variable
immediate
Duration Schedule
Reinforcement is contingent on performing a behavior continuously throughout a period of time.
Fixed Duration (FD) Schedule
The behavior must be performed continuously for a fixed, predictable period of time.
FD# means
The amount of time (#) a behavior must be performed before getting a reinforcement.
Variable Duration (VD) Schedule
The behavior must be performed continuously for a varying, unpredictable period of time.
VD# means
The average amount of time (#) a behavior must be performed before getting a reinforcement.
How are duration schedules different from interval schedules?
Duration schedules require the behavior to be performed continuously throughout a period of time, whereas interval schedules require only a single response after a period of time has elapsed.
On a pure FI schedule, any response that occurs (during/following) the interval is irrelevant.
during
On _______ schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has (little/considerable) effect on how soon the reinforcer is obtained.
interval
little
In general, ______ schedules produce post-reinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite (distant/near).
fixed
distant
What are the three types of response-rate schedules?
1) Differential reinforcement of high rates (DRH)
2) Differential reinforcement of low rates (DRL)
3) Differential reinforcement of paced responding (DRP)
What is a response-rate schedule?
A schedule in which reinforcement is directly contingent upon the organism’s rate of response.
What is a “con” of using a duration schedule?
Duration schedules can reward mere time spent rather than quality of performance, similar to being rewarded for participating rather than for performing well.
Differential Reinforcement of High Rates (DRH)
Reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time.
OR
Reinforcement is provided for responding at a fast rate.
Differential Reinforcement of Low Rates (DRL)
A minimum amount of time must pass between each response before the reinforcer will be delivered.
OR
Reinforcement is provided for responding at a slow rate.
Differential Reinforcement
One type of response is reinforced while another is not.
How do FI schedules differ from DRL schedules?
On an FI schedule, responses made during the interval have no effect on reinforcement; on a DRL schedule, responding before enough time has passed resets the interval and delays reinforcement.
What is an example of a DRL?
Teaching a kid to brush their teeth slowly so they learn the correct technique and to prevent sloppiness.
What is an example of a DRH?
Racing to the finish line.
Differential Reinforcement of Paced Responding (DRP)
Reinforcement is contingent upon emitting a series of responses at a set rate.
OR
Reinforcement is provided for responding neither too fast nor too slow.
What is an example of DRP?
Non-competitive running at a set pace, neither too fast nor too slow.
On a (VD/VI) schedule, reinforcement is contingent upon responding continuously for a varying period of time; on an (FI/FD) schedule, reinforcement is contingent upon the first response after a fixed period of time.
VD
FI
As Tessa sits quietly in the doctor’s office, her mother occasionally gives her a hug as a reward. As a result, Tessa is more likely to sit quietly on future visits to the doctor. This is an example of a(n) ________ ________ schedule of reinforcement.
variable duration
In practicing slow-motion exercise known as tai chi, Tung noticed that the more slowly he moved, the more thoroughly his muscles relaxed. This is an example of (DRL/DRH/DRP).
DRL or Differential Reinforcement of Low Rates
Non-Contingent Schedule of Reinforcement
A response is not required for the reinforcer to be obtained.
Non-Contingent Schedule of Reinforcement is also known as
Response-Independent Schedules
What are the two types of non-contingent reinforcement schedules?
1) Fixed Time Schedule
2) Variable Time Schedule
Fixed Time (FT) Schedule
The reinforcer is delivered following a fixed, predictable period of time, regardless of the organism’s behavior.
FT schedules deliver ______ reinforcers.
“free”
FT# means…
After # time passes, the reinforcer is delivered.
Variable Time (VT) Schedule
The reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism’s behavior.
VT# means…
After the average of # time passes, the reinforcer is delivered.
How do FT/VT schedules differ from FI/VI schedules?
FT and VT schedules do not require the organism to emit a specific response as FI and VI schedules do.
Non-contingent schedules can accidentally reinforce a behavior performed (before/after) the reinforcer is provided.
before
A superstition can arise when a behavior is reinforced on a _____________ schedule.
non-contingent
Which two groups are especially prone to developing superstitions? Why?
Gamblers and professional athletes, because they come to associate a specific behavior with a big win or a strong performance and do not want to lose that success.