Ch. 7 Flashcards
schedule of reinforcement
is the response requirement that must be met to obtain reinforcement.
In other words, a schedule indicates what exactly has to be done for the reinforcer to be delivered.
continuous reinforcement schedule
is one in which each specified response is reinforced.
An FR 1 schedule of reinforcement can also be called a continuous reinforcement schedule.
intermittent (or partial) reinforcement schedule
is one in which only some responses are reinforced.
Schedule effects
Schedule effects are the different effects on behavior produced by different response requirements.
These are the stable patterns of behavior that emerge once the organism has had sufficient exposure to the schedule.
Such stable patterns are known as steady state behaviors.
Fixed Ratio Schedules
On a fixed ratio (FR) schedule, reinforcement is contingent upon a fixed, predictable number of responses.
For example, on a fixed ratio 5 schedule, a rat has to press the lever five times to obtain a food pellet.
FR schedules generally produce a high rate of response along with a short pause following the attainment of each reinforcer.
This short pause is known as a post-reinforcement pause.
For example, a rat on an FR 25 schedule will rapidly emit 25 lever presses, munch down the food pellet it receives, and snoop around the chamber for a few seconds before rapidly emitting another 25 lever presses.
Note, too, that each pause is usually followed by a relatively quick return to a high rate of response.
Thus, the typical FR pattern is described as a “break-and-run” pattern, a short break followed by a steady run of responses.
In general, higher ratio requirements produce longer post-reinforcement pauses.
Schedules in which the reinforcer is easily obtained are said to be very dense or rich, while schedules in which the reinforcer is difficult to obtain are said to be very lean.
Ex: An FR 12 schedule of reinforcement is denser than an FR 75 schedule.
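The FR contingency described above can be sketched in a few lines of Python; the `FixedRatio` class and the FR 5 / FR 1 examples are illustrative inventions, not from the text.

```python
class FixedRatio:
    """Deliver a reinforcer after every `n` responses (an FR n schedule)."""
    def __init__(self, n):
        self.n = n
        self.count = 0  # responses emitted since the last reinforcer

    def respond(self):
        """Record one response; return True if it earns the reinforcer."""
        self.count += 1
        if self.count == self.n:
            self.count = 0  # requirement met: reset for the next ratio
            return True
        return False

# FR 5: every fifth lever press produces a food pellet
fr5 = FixedRatio(5)
outcomes = [fr5.respond() for _ in range(10)]
print(outcomes.count(True))  # 2 -- reinforcers on presses 5 and 10

# FR 1 reinforces every response: a continuous reinforcement schedule
fr1 = FixedRatio(1)
print(all(fr1.respond() for _ in range(3)))  # True
```

Note that the schedule itself is silent about the post-reinforcement pause; the pause is a property of the behavior the schedule produces, not of the reinforcement rule.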
In general, “stretching the ratio” (moving from a low ratio requirement, a dense schedule, to a high ratio requirement, a lean schedule) should be done gradually.
ratio strain
which is a disruption in responding due to an overly demanding response requirement.
Occurs when one jumps too quickly from a low ratio requirement to a high ratio requirement.
Ratio strain is what most people would refer to as “burnout”:
behavior may become increasingly erratic with respect to the task at hand, such as studying.
variable ratio (VR) schedule,
On a variable ratio (VR) schedule, reinforcement is contingent upon a varying, unpredictable number of responses.
VR schedules generally produce a high and steady rate of response, often with little or no post-reinforcement pause.
A post-reinforcement pause is especially unlikely when the minimum response requirement in the schedule is very low.
As with an FR schedule, an extremely lean VR schedule can result in ratio strain.
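A hedged sketch of the VR contingency: drawing each requirement uniformly from 1 to 2n−1 is just one way to make it average n, and the `VariableRatio` class is my own illustration, not from the text.

```python
import random

class VariableRatio:
    """Deliver a reinforcer after an unpredictable number of responses
    that averages `n` (a VR n schedule)."""
    def __init__(self, n, seed=None):
        self.rng = random.Random(seed)
        self.n = n
        self._new_requirement()

    def _new_requirement(self):
        # draw the next requirement from 1..2n-1, which averages n
        self.required = self.rng.randint(1, 2 * self.n - 1)
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self._new_requirement()  # next payoff needs an unpredictable count
            return True
        return False

# VR 4: reinforcement after anywhere from 1 to 7 responses, averaging 4.
# Because the very next response might pay off, there is little reason
# to pause after a reinforcer.
vr4 = VariableRatio(4, seed=1)
wins = [vr4.respond() for _ in range(100)]
print(wins.count(True))  # roughly 100 / 4 reinforcers
```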
Variable ratio schedules help account for the persistence with which some people display certain maladaptive behaviors (e.g., gambling).
ex: Some predatory behaviors, such as that shown by cheetahs, have a strong VR component in that only some attempts at chasing down prey are successful. In humans, only some acts of politeness receive an acknowledgment.
Variable ratio schedules of reinforcement may also facilitate the development of an abusive or exploitative relationship.
At the start of a relationship, the individuals involved typically provide each other with an enormous amount of positive reinforcement (a very dense schedule).
This strengthens the relationship and increases each partner’s attraction to the other.
As the relationship progresses, such reinforcement naturally becomes somewhat more intermittent.
In some situations, however, this process becomes malignant, with one person (the Taker) providing reinforcement on an extremely intermittent basis, and the other person (the Giver) working incredibly hard to obtain that reinforcement.
Because the process evolves gradually (a process of slowly “stretching the ratio”), the Giver may have little awareness of what is happening until the abusive pattern is well established.
What would motivate such an unbalanced process?
— One source of motivation is that the less often the Taker reinforces the Giver, the more attention (reinforcement) they receive from the Giver.
In other words, the Giver works so hard to get the Taker’s attention that they actually reinforce the very process of being largely ignored by that partner.
There may also be relationships in which the partners alternate the roles of Giver and Taker.
The result may be a volatile relationship that both partners find exciting but that is constantly on the verge of collapse due to frequent periods in which each partner experiences “ratio strain.”
fixed interval (FI) schedule,
On a fixed interval (FI) schedule, reinforcement is contingent upon the first response after a fixed, predictable period of time.
FI schedules often produce a “scalloped” (upwardly curved) pattern of responding, consisting of a post-reinforcement pause followed by a gradually increasing rate of response as the interval draws to a close.
On a pure FI schedule, any response that occurs during the interval is irrelevant.
Ex: If I have just missed the bus when I get to the bus stop, I know that I have to wait 15 minutes for the next one to come along. Given that it is absolutely freezing out, I snuggle into my parka as best I can and grimly wait out the interval. Every once in a while, though, I emerge from my cocoon to take a quick glance down the street to see if the bus is coming. My behavior of looking for the bus is on an FI 15-min schedule of reinforcement.
I will probably engage in a few glances at the start of the interval, followed by a gradually increasing rate of glancing as time passes.
variable interval (VI) schedule,
On a variable interval (VI) schedule, reinforcement is contingent upon the first response after a varying, unpredictable period of time.
VI schedules usually produce a moderate, steady rate of response, often with little or no post-reinforcement pause.
Because VI schedules produce steady, predictable response rates, they are often used to investigate other aspects of operant conditioning, such as those involving choice between alternative sources of reinforcement.
ex: The behavior of checking your cell phone for notifications is maintained on a VI schedule. Our phones provide all sorts of reinforcers, from simple entertainment to meaningful communication with other people, and those little notification alerts pop up every so often. Even if you’ve just checked your phone to see if any texts have arrived or if someone has liked the photo you posted, you may feel compelled to check it again.
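The interval contingency can be sketched as follows (FI shown here; a VI would simply redraw the wait after each reinforcer). The `Interval` class and the time values are illustrative, not from the text.

```python
class Interval:
    """Reinforce the first response after `wait` time units have elapsed
    since the last reinforcer (an FI schedule)."""
    def __init__(self, wait):
        self.wait = wait
        self.start = 0.0  # time the current interval began

    def respond(self, t):
        """A response at time `t` pays off only once the interval is up."""
        if t - self.start >= self.wait:
            self.start = t  # reinforcer delivered; next interval begins
            return True
        return False  # responses during the interval are irrelevant

# FI 15-min bus example: glances before the 15-minute mark earn nothing
fi15 = Interval(15)
print(fi15.respond(5))   # False -- too early
print(fi15.respond(12))  # False -- still within the interval
print(fi15.respond(16))  # True  -- first response after the interval
```

The contrast with the ratio classes is the point: here the reinforcer is time contingent, so extra responses during the interval change nothing.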
Comparing the Four Basic Schedules
In general, ratio schedules tend to produce a high rate of response. This is because the reinforcer in such schedules is entirely response contingent, meaning that the rapidity with which responses
are emitted does affect how soon the reinforcer is obtained.
On interval schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has little effect on how soon the reinforcer is obtained.
In general, variable schedules produce little or no post-reinforcement pausing because such schedules often provide the possibility of relatively immediate reinforcement, even if one has just obtained a reinforcer.
In general, fixed schedules produce post-reinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite distant.
FR
Response rate: high
Post-reinforcement pause: yes
VR
Response rate: high
Post-reinforcement pause: no
FI
Response rate: increasing
Post-reinforcement pause: yes
VI
Response rate: moderate
Post-reinforcement pause: no
fixed duration (FD) schedule
On a fixed duration (FD) schedule, the behavior must be performed continuously for a fixed, predictable period of time.
For example, the rat must run in the wheel for 60 seconds to earn one pellet of food (an FD 60-sec schedule).
variable duration (VD) schedule,
On a variable duration (VD) schedule, the behavior must be performed continuously for a varying, unpredictable period of time.
For example, the rat must run in the wheel for an average of 60 seconds to earn one pellet of food, with the required time varying between 1 second and 120 seconds on any particular trial (a VD 60-sec schedule).
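A minimal sketch of the FD contingency. The text does not say what happens when the behavior is interrupted; this sketch assumes any interruption resets the clock, and the `FixedDuration` class is my own illustration.

```python
class FixedDuration:
    """Reinforce after the behavior has been performed continuously
    for `duration` time units (an FD schedule)."""
    def __init__(self, duration):
        self.duration = duration
        self.elapsed = 0

    def tick(self, behaving):
        """Advance one time unit; `behaving` = behavior occurred this unit."""
        if behaving:
            self.elapsed += 1
            if self.elapsed >= self.duration:
                self.elapsed = 0
                return True
        else:
            self.elapsed = 0  # assumed: any interruption resets the clock
        return False

# FD 60-sec: the rat must run for 60 consecutive seconds for one pellet.
# 59 seconds of running, a pause, then 60 more: only the second run pays off.
fd = FixedDuration(60)
trace = ([fd.tick(True) for _ in range(59)]
         + [fd.tick(False)]
         + [fd.tick(True) for _ in range(60)])
print(trace.count(True))  # 1
```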
How do FD and VD schedules differ from FI and VI schedules?
On a duration schedule (FD or VD), reinforcement is contingent upon responding continuously throughout a period of time; on an interval schedule (FI or VI), reinforcement is contingent upon only the first response after the period of time has elapsed.
Although duration schedules are sometimes useful in modifying certain human behaviors, such as studying, they are in some ways rather imprecise compared to the four basic schedules.
With FR schedules, for example, one knows precisely what was done to achieve the reinforcer, namely, a certain number of responses.
On an FD schedule, however, what constitutes “continuous performance of behavior” during the interval could vary widely.
—Ex: Julie’s son might read only a few pages during his two-hour study session or charge through several chapters; in either case, he would receive the reinforcer of being allowed to watch television.
—Remember that reinforcing the mere performance of an activity, with no regard to level of performance, might undermine a person’s intrinsic interest in that activity. This danger obviously applies to duration schedules; one therefore needs to be cautious in their use.
response-rate schedule,
On a response-rate schedule, reinforcement is directly contingent upon the organism’s rate of response.
three types of response-rate schedules.
— differential reinforcement of high rates (DRH)
—differential reinforcement of low rates (DRL)
—differential reinforcement of paced responding (DRP)
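As one illustration of a response-rate contingency: a DRL schedule is conventionally arranged so that a response is reinforced only if a minimum time has elapsed since the previous response. This sketch assumes that arrangement; the `DRL` class and the numbers are mine, not from the text.

```python
class DRL:
    """Differential reinforcement of low rates: a response is reinforced
    only if at least `gap` time units have passed since the previous
    response (a common DRL arrangement)."""
    def __init__(self, gap):
        self.gap = gap
        self.last = None  # time of the previous response

    def respond(self, t):
        ok = self.last is None or (t - self.last) >= self.gap
        self.last = t  # premature responses still reset the clock
        return ok

# DRL 10: responding too quickly postpones the reinforcer
drl = DRL(10)
print(drl.respond(0))   # True  -- first response
print(drl.respond(5))   # False -- too soon; clock resets
print(drl.respond(16))  # True  -- 11 units since the last response
```

A DRH arrangement would invert the logic, reinforcing only when responses come fast enough.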