Chapter 7: Schedules and Theories of Reinforcement Flashcards
Schedule of Reinforcement
The response requirement that must be met to obtain reinforcement.
Continuous Reinforcement Schedule (or CRF)
Each specified response is reinforced.
Which reinforcement schedule is helpful when a behavior is first being shaped?
Continuous or CRF
Intermittent (or Partial) Reinforcement Schedule
Only some responses are reinforced.
How many types of Intermittent (or Partial) reinforcement schedules are there? What are the different types?
Four (a rough code sketch of each rule follows this list):
- Fixed Ratio (FR)
- Variable Ratio (VR)
- Fixed Interval (FI)
- Variable Interval (VI)
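Each of these schedules is simply a different rule for deciding when a response produces reinforcement. The Python sketch below is my own illustration rather than part of the chapter; the function names and parameters (n, t, requirement, interval) are hypothetical.

```python
# Illustrative sketch only: each intermittent schedule is a rule for deciding
# whether the organism's latest response earns a reinforcer.

def fr_met(responses_since_reinforcer: int, n: int) -> bool:
    """Fixed Ratio (FR n): reinforce after a fixed, predictable count of n responses."""
    return responses_since_reinforcer >= n

def vr_met(responses_since_reinforcer: int, requirement: int) -> bool:
    """Variable Ratio (VR n): the requirement is unpredictable and would be redrawn
    after each reinforcer (e.g., random.randint(1, 2 * n - 1), which averages n)."""
    return responses_since_reinforcer >= requirement

def fi_met(seconds_since_reinforcer: float, t: float) -> bool:
    """Fixed Interval (FI t): reinforce the first response made after a fixed,
    predictable t seconds have elapsed."""
    return seconds_since_reinforcer >= t

def vi_met(seconds_since_reinforcer: float, interval: float) -> bool:
    """Variable Interval (VI t): the required wait is unpredictable and would be
    redrawn after each reinforcer (e.g., random.uniform(0, 2 * t), averaging t)."""
    return seconds_since_reinforcer >= interval
```

In the ratio schedules the requirement is a count of responses; in the interval schedules it is time since the last reinforcer, and only the first response after that time earns reinforcement.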
Steady-State Behaviors
Stable response patterns that emerge once the organism has had considerable exposure to the schedule.
Schedule Effects
The different effects on behavior produced by different response requirements.
Fixed Ratio (FR) Schedule
Reinforcement is contingent upon a fixed, predictable number of responses.
FR1 is the same as…
Continuous Reinforcement
Fixed ratio schedules typically produce a (high/low) rate of response along with a (long/short) pause following the attainment of each reinforcer.
high
short
This short pause is also known as…
Post-reinforcement Pause
Why are FR Schedules sometimes referred to as “Break-and-Run”?
Because after each reinforcer is obtained, the organism takes a break (pauses) before running off the responses needed for the next reinforcer at a high, steady rate.
In Fixed Ratio schedules, _____ ratio requirements result in ____ breaks.
high
long
An easy (low-requirement) FR schedule is referred to as…
dense or rich
A difficult (high-requirement) FR schedule is referred to as…
lean
Stretching the Ratio
Moving from a low ratio requirement (dense) to a high ratio requirement (lean).
Should “stretching the ratio” be done gradually or quickly? Why?
Gradually; if the ratio requirement is increased too quickly, behavior may become erratic or die out.
Ratio Strain
A disruption in responding due to an overly demanding response requirement.
Ratio strain is more commonly known as…
burnout
Variable Ratio (VR) Schedule
Reinforcement is contingent upon a varying, unpredictable number of responses.
FR# means
The number (#) of responses needed to obtain reinforcement.
VR# means
The average number (#) of responses needed to obtain reinforcement.
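To make the notation concrete, here is a hypothetical sketch (not course material) contrasting FR 5 and VR 5: on FR 5 every reinforcer costs exactly five responses, while on VR 5 the cost varies unpredictably around an average of five.

```python
import random

# Hypothetical illustration of FR# vs. VR#. Drawing VR requirements uniformly from
# 1-9 is just one way to average 5; actual VR schedules may use other distributions.
random.seed(0)  # make the example reproducible

fr5_requirements = [5 for _ in range(10)]                     # FR 5: always exactly 5
vr5_requirements = [random.randint(1, 9) for _ in range(10)]  # VR 5: varies, averages 5

print("FR 5 responses per reinforcer:", fr5_requirements)
print("VR 5 responses per reinforcer:", vr5_requirements)
print("VR 5 mean requirement:", sum(vr5_requirements) / len(vr5_requirements))
```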
Variable ratio schedules typically produce a (low/high) and (steady/disrupted) rate of response, often with (little or no/long) post-reinforcement pauses.
high
steady
little or no
What makes gamblers a perfect example of variable ratio schedules?
Although gamblers lose significant amounts of money overall, their behavior is reinforced by intermittent, unpredictable winnings.
What makes a VR schedule produce such a high rate of behavior?
Its unpredictability.
As with an FR schedule, an extremely lean VR schedule can result in _______ ________.
ratio strain
Fixed Interval (FI) Schedule
Reinforcement is contingent upon the first response after a fixed, predictable period of time.