Chapter 13 - Ratio Schedules Flashcards
The delivery of a reinforcer is scheduled for every occurrence of the behavior.
Continuous Schedule of Reinforcement
A continuous schedule is best used during the beginning stages of learning to create a strong association between the behavior and its reinforcer.
The delivery of a reinforcer is scheduled to be withheld after every occurrence of the behavior.
Extinction Schedule
A reinforcer is scheduled to follow only some occurrences of the behavior.
Intermittent Schedules
When you deliver reinforcement for a fixed number of responses.
This schedule produces a pause after each reinforcer, followed by a high, steady rate of responding until the next reinforcement.
Fixed-Ratio Schedule
Example:
Scott Bucks
Pokemon Go - On a fixed-ratio schedule, users know that if they catch enough Pokemon, they will level up or possess enough candy to evolve one.
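The fixed-ratio rule above can be sketched in a few lines of code. This is a minimal, hypothetical simulation (the function name and ratio value are illustrative, not from the chapter): a reinforcer is delivered only when the response count reaches the fixed requirement, then the count resets.

```python
# Hypothetical sketch of a fixed-ratio (FR) schedule; names and the
# ratio value are illustrative, not from the chapter.
def fixed_ratio(ratio):
    """Return a respond() function that reinforces every `ratio`-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == ratio:
            count = 0      # requirement met: reset and reinforce
            return True    # reinforcer delivered
        return False       # response goes unreinforced
    return respond

fr5 = fixed_ratio(5)
results = [fr5() for _ in range(10)]
# Only the 5th and 10th responses are reinforced.
```

Note that reinforcement depends only on the response count, not on time; this is what produces the "pause, then run to the next reinforcer" pattern.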
When you deliver reinforcement for a varying number of responses.
This schedule produces a pattern of high and uniform responding.
- Use ratio schedules to increase resistance to extinction and to reduce problems from satiation.
- Do not use ratio schedules for shaping or when you are concerned about ratio strain.
- Ratio schedules are intermittent schedules.
Variable-Ratio Schedule
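The variable-ratio card can be sketched the same way. In this hypothetical simulation (seed and mean are illustrative), the response requirement is redrawn around a mean after each reinforcer, so no single response can be predicted to pay off; this is what supports the high, uniform rate described above.

```python
import random

# Hypothetical variable-ratio (VR) sketch (illustrative names/values):
# the requirement is redrawn around `mean` after each reinforcer.
def variable_ratio(mean, seed=0):
    rng = random.Random(seed)
    target = rng.randint(1, 2 * mean - 1)  # averages out to `mean`
    count = 0
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:                        # requirement met
            count = 0
            target = rng.randint(1, 2 * mean - 1)  # new, unpredictable requirement
            return True
        return False
    return respond

vr3 = variable_ratio(mean=3)
reinforced = sum(vr3() for _ in range(300))
# Roughly one reinforcer per 3 responses, at unpredictable points.
```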
The first response is rewarded only after a specified amount of time has elapsed. This schedule produces a high rate of responding near the end of the interval but much slower responding immediately after the delivery of the reinforcer.
The fixed-interval pattern shows a GRADUAL increase in rate; it does not show the abrupt increase of the fixed-ratio schedule.
A fixed-interval schedule means that reinforcement becomes available after a specific period.
A common misunderstanding is that reinforcement is automatically delivered at the end of this interval, but this is not the case. Reinforcement only becomes available; it is delivered only if the target behavior is emitted at some point after the time interval has ended.
Fixed-Interval Schedule
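The availability point from the card above is the part people misread, so here is a minimal, hypothetical sketch (class and parameter names are illustrative) using a simulated clock: reinforcement becomes available once the interval elapses, but it is delivered only for the first response emitted after that point.

```python
# Hypothetical fixed-interval (FI) sketch; reinforcement becomes
# *available* after the interval, and is delivered only for the first
# response emitted after that point -- never automatically.
class FixedInterval:
    def __init__(self, interval):
        self.interval = interval
        self.last_reinforcement = 0

    def respond(self, now):
        if now - self.last_reinforcement >= self.interval:
            self.last_reinforcement = now
            return True   # first response after the interval is reinforced
        return False      # responses during the interval go unreinforced

fi = FixedInterval(interval=10)
fi.respond(now=5)    # False: interval has not yet elapsed
fi.respond(now=12)   # True: first response after 10 time units
fi.respond(now=14)   # False: a new interval has restarted
```

If no response is ever emitted, no reinforcer is ever delivered, which is exactly the distinction the card draws.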
Reinforcing the first response after a varying period of time from the prior reinforcement.
When a response is rewarded after an unpredictable amount of time has passed. This schedule produces a slow, steady rate of response.
The timing of the next reinforcer is uncertain.
Variable-Interval Schedule
Example: A "pop quiz" constitutes a variable-interval schedule: the quiz itself is certain, but its timing is not. Checking email works the same way.
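The pop-quiz and email examples can be sketched as code. In this hypothetical simulation (names, seed, and mean are illustrative), the next availability time is redrawn around a mean after each reinforcer, so checking never reliably pays off at any particular moment, which sustains the slow, steady rate.

```python
import random

# Hypothetical variable-interval (VI) sketch (illustrative names/values):
# after each reinforcer, the next availability time is redrawn around a
# mean -- like a pop quiz or a newly arrived email.
class VariableInterval:
    def __init__(self, mean, seed=0):
        self.rng = random.Random(seed)
        self.mean = mean
        self.available_at = self.rng.uniform(0, 2 * mean)

    def respond(self, now):
        if now >= self.available_at:  # reinforcement has become available
            # Schedule the next unpredictable availability time.
            self.available_at = now + self.rng.uniform(0, 2 * self.mean)
            return True
        return False

vi = VariableInterval(mean=5)
checks = [vi.respond(now=t) for t in range(50)]
# Reinforcers arrive at unpredictable times, roughly every 5 time units.
```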
Ratio schedules may be used to reduce the problem of the person becoming SATIATED on the reinforcer and then no longer responding.
True
Both fixed-ratio and variable-ratio schedules of the same size are equally good at opposing satiation.
True
When shaping a new response, you must consistently reinforce each approximation. The fastest way to increase its rate compared to nonapproximations is to reinforce it every time that it occurs.
Continuous Schedule for Shaping (Disadvantage of Ratio Schedules)
Fixed-interval schedules produce a SCALLOP pattern of responding.
Fixed interval scalloping
A pause after reinforcement followed by a gradual increase in rate prior to the next reinforcement.
Scallop
Both the fixed-interval and fixed-ratio patterns alternate between responding and resting.
True
4 characteristics of interval schedules (intermittent)
- they produce lower response rates than ratio schedules
- they increase resistance to extinction and reduce problems from satiation.
- they are not useful for shaping
- they may lead to too little reinforcement and thus something like ratio strain.
If a person is reinforced after a fixed period of time no matter what he or she is doing at the time, it is called…
Superstitious Reinforcement