Ch. 8 Flashcards

1
Q

Extinction

A

Extinction is the nonreinforcement of a previously reinforced response, the result of which is a decrease in the future strength of that response.

As with the classical conditioning version of extinction, the term extinction in operant conditioning refers to both a procedure and a process.

The procedure of extinction is the nonreinforcement of a previously reinforced response; the process of extinction is the resultant decrease in response strength.

2
Q

An error students sometimes make is to use the term extinction when referring ___

A

to any decrease in the strength of a behavior.

This is incorrect.

The term extinction should be used only when referring to a decrease in the future strength of a behavior that is caused by the removal of the reinforcer for that behavior.

A behavior can be weakened or eliminated in several ways, extinction being only one of them.

3
Q

A behavior can be weakened or eliminated in several ways, extinction being only one of them.

For example, we can get a rat to stop pressing a lever for food by

A

(1) no longer giving it food when it presses the lever,

(2) giving it a shock when it presses the lever, or

(3) giving it lots of free food ahead of time so that it is no longer hungry.

Only the first is an example of extinction; the second is an example of positive punishment; and the third is an example of an abolishing operation (specifically, a satiation procedure), which reduces the value of the reinforcer.

We could also stop the rat from pressing the lever by simply removing the lever from the chamber, in the same way that parents might take away a child’s iPad to stop them from playing with it. That too is not an example of extinction because it simply prevents the behavior from occurring.

4
Q

Side Effects of Extinction

A

When an extinction procedure is implemented, it is often accompanied by certain side effects.

It is important to be aware of these side effects because they can mislead one into believing that an extinction procedure is not having an effect when in fact it is.

Note, too, that these side effects can be inadvertently strengthened if one suddenly gives in and provides the subject with the sought-after reinforcer.

5
Q

Extinction Burst

A

The implementation of an extinction procedure does not always result in an immediate decrease in responding. Instead, one often finds an extinction burst, a temporary increase in the frequency and intensity of responding when extinction is first implemented.

6
Q

Increase in Variability

A

An extinction procedure can also result in an increase in the variability of a behavior.

For example, a rat whose lever pressing no longer produces food might vary the manner in which it presses the lever. If the rat typically pressed the lever with its right paw, it might now try pressing it with its left paw.

7
Q

Emotional Behavior

A

Extinction is often accompanied by emotional behavior.

The hungry pigeon that suddenly finds that key pecking no longer produces food soon becomes agitated (as evidenced, for example, by quick jerky movements and wing flapping).

Likewise, people often become upset when confronted by a candy machine that does not deliver the goods.

Such emotional responses are what we typically refer to as frustration.

8
Q

Aggression.

A

One type of emotional behavior that is particularly common during an extinction procedure is aggression.

In fact, extinction procedures have been used to study aggressive behavior in animals.

For example, research has shown that a pigeon whose key pecking for food is placed on extinction will reliably attack another pigeon (or model of a pigeon) that happens to be nearby.

Extinction-induced aggression (also called frustration-induced aggression) is also common in humans.

People often become angry with those who block them from obtaining an important goal.

9
Q

Resurgence

A

A rather unusual side effect of extinction is resurgence, the reappearance during extinction of other behaviors that had previously (sometime in the past) been effective in obtaining reinforcement.

Resurgence resembles the psychoanalytic concept of regression, which is the reappearance of immature behavior in reaction to frustration or conflict.

Thus, someone whose partner largely ignores them might begin spending increasing amounts of time at their parents’ house. Faced with the lack of reinforcement in a romantic relationship, they return to a setting that once provided a rich source of reinforcement.

10
Q

Depression.

A

Extinction can also lead to depressive-like symptoms.

For example, Klinger, Barta, and Kemble (1974) had rats run down an alleyway for food and immediately followed this with an assessment of the rats’ activity level in an open field test.

Thus, each session consisted of two phases:

(1) running down an alleyway for food, followed by

(2) placement in an open area that the rats could freely explore.

When extinction was implemented on the alleyway task, activity in the open field test first increased to above normal (a sort of generalized extinction burst), then decreased to below normal, followed by a return to normal.

11
Q

Resistance to Extinction

A

is the extent to which responding persists after an extinction procedure has been implemented.

A response that is very persistent is said to have high resistance to extinction, while a response that disappears quickly is said to have low resistance to extinction.

12
Q

partial reinforcement effect

A

The schedule of reinforcement is the most important factor influencing resistance to extinction.

According to the partial reinforcement effect, behavior that has been maintained on an intermittent (partial) schedule of reinforcement will extinguish more slowly than behavior that has been maintained on a continuous schedule.

Resistance to extinction is particularly strong when behavior has been maintained on a variable interval (VI) or variable ratio (VR) schedule.

One way of thinking about the partial reinforcement effect is that the less frequent and less predictable the reinforcer, the longer it takes the person or animal to “discover” that reinforcement is no longer available.

A less mentalistic interpretation is that there is a much greater contrast between a CRF schedule and extinction than between a VR 100 schedule and extinction.
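One rough way to see this contrast interpretation is to compare how long a run of nonreinforced responses would look "normal" under each schedule. The sketch below is a minimal Python illustration, not a model from the text: it assumes a VR 100 schedule can be approximated by reinforcing each response with probability .01 (a random-ratio simplification), and the function name and numbers are hypothetical.

```python
import random

def longest_unreinforced_run(p_reinforce, n_responses=10_000):
    """Longest run of consecutive nonreinforced responses during training,
    where each response is reinforced with probability p_reinforce
    (1.0 approximates CRF; 0.01 roughly approximates a VR 100 schedule)."""
    longest = run = 0
    for _ in range(n_responses):
        if random.random() < p_reinforce:
            run = 0                      # reinforcer delivered; the run resets
        else:
            run += 1                     # another nonreinforced response
            longest = max(longest, run)
    return longest

random.seed(1)
print("CRF   :", longest_unreinforced_run(1.0))    # 0: every response was reinforced
print("VR 100:", longest_unreinforced_run(0.01))   # typically several hundred
```

Under CRF, the very first nonreinforced response differs from everything experienced during training, whereas under VR 100 long dry spells are routine, so the switch to extinction is far less discriminable.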

13
Q

The partial reinforcement effect helps account for certain types of __

A

annoying or maladaptive behaviors that are difficult to eliminate.

Dogs that beg for food are often extremely persistent.

Paradoxically, as mentioned earlier, this is sometimes the result of previously unsuccessful attempts at extinction. Imagine, for example, that all family members agree to stop feeding the dog at the dinner table.

If one person nevertheless slips the dog a morsel when it is making a particularly big fuss, the begging will become both more intense and more persistent.

This means that the next attempt at extinction will be even more difficult.

Of course, the partial reinforcement effect also suggests a possible solution to this problem.

If behavior that has been continuously reinforced is less resistant to extinction, it might help to first spend several days reinforcing each instance of begging.

Then, when extinction is implemented, the dog’s tendency to beg might extinguish more rapidly.

14
Q

History of reinforcement

A

In general, the more reinforcers that an individual has received for a behavior, the greater the resistance to extinction.

A child who has only recently picked up the habit of whining for candy should stop whining relatively quickly when the behavior is placed on extinction, as opposed to a child who has been at it for several months.

From a practical perspective, this means it is much easier to extinguish an unwanted behavior, such as whining for candy, when it first becomes evident (hence the saying, “nip it in the bud”).

There is, however, a limit to the extent to which further reinforcers will produce increased resistance to extinction.

Furomoto (1971), for example, found that resistance to extinction for key pecking in pigeons reached its maximum after about 1,000 reinforcers.

15
Q

Magnitude of the Reinforcer

A

The magnitude of the reinforcer can also affect resistance to extinction.

For example, large-magnitude reinforcers sometimes result in greater resistance to extinction than small-magnitude reinforcers.

Thus, lever pressing might take longer to extinguish following a training period in which each reinforcer consisted of a large pellet of food than if the reinforcer were a small pellet of food.

Lever pressing might also take longer to extinguish if the reinforcer was a highly preferred food item than if it were a less-preferred food item.

From a practical perspective, this means that a dog’s behavior of begging at the dinner table might extinguish more easily if you first spend several days feeding it small bites of less-preferred morsels.

Unfortunately, one problem with this strategy is that the effect of reinforcer magnitude on resistance to extinction is not entirely consistent.

In fact, researchers sometimes find that smaller reinforcers result in greater resistance to extinction.

16
Q

Degree of Deprivation

A

Not surprisingly, the degree to which an organism is deprived of a reinforcer also affects resistance to extinction.

In general, the greater the level of deprivation, the greater the resistance to extinction.

A rat that is only slightly hungry will cease lever pressing more quickly than a rat that is very hungry. This suggests yet another strategy for extinguishing a dog’s tendency to beg at the table: feed the dog before the meal.

17
Q

Previous Experience with Extinction

A

When sessions of extinction are alternated with sessions of reinforcement, the greater the number of exposures to extinction, the quicker the behavior will extinguish during subsequent exposures.

For example, if a rat experiences several sessions of extinction randomly interspersed with several sessions of reinforcement, it will eventually learn to stop lever pressing soon after the start of an extinction session.

The rat has learned that if it has not received reinforcement soon after the start of a session, it is likely that no reinforcement will be forthcoming for the remainder of the session.

This also leads to the prediction that people who have been through relationship breakups on numerous occasions will more quickly get over such breakups, when compared to people who have had fewer breakup experiences.

18
Q

Distinctive Signal for Extinction

A

Extinction occurs more quickly when there is a distinctive stimulus that signals the onset of extinction.

Such a stimulus is called a discriminative stimulus for extinction.

19
Q

In general, a behavior that has been reinforced many times is likely to be ___ to extinguish.

Resistance to extinction is generally greater when the behavior that is being extinguished has been reinforced with a ___ -magnitude reinforcer, though the opposite effect has also been found.

In general, there is a(n) ___ relationship between resistance to extinction and the organism’s level of deprivation for the reinforcer.

Previous experience with extinction, as well as a distinctive signal for extinction, tends to produce a(n) ___ in resistance to extinction.

A
  1. more difficult
  2. high
  3. direct
  4. decrease
20
Q

Spontaneous Recovery

A

is the reappearance of an extinguished response, despite the continued absence of reinforcement, following a rest period after extinction.

With repeated extinction sessions, each recovery is usually weaker and more readily extinguished than the previous one.

Following several extinction sessions, we will eventually reach the point at which spontaneous recovery does not occur (apart from a few tentative lever presses every once in a while), and the behavior will have essentially been eliminated.

Skinner (1950) proposed that spontaneous recovery may be a function of discriminative stimuli (SDs) associated with the start of the session.

21
Q

Differential Reinforcement of Other Behavior

A

is the reinforcement of any behavior other than the target behavior that is being extinguished.

DRO procedures tend to be more effective than simple extinction procedures because the target behavior is weakened both by

(1) the lack of reinforcement for that behavior and

(2) the reinforcement of alternative behaviors that come to replace it.

DRO procedures are not only efficient but can also reduce many of the unwanted side effects of extinction, such as frustration and aggression.

As a general rule, therefore, whenever one attempts to extinguish an unwanted behavior, one should also provide plenty of positive reinforcement for more appropriate behavior.
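As a minimal sketch of this kind of contingency (Python), assuming for illustration that every occurrence of a non-target behavior is reinforced; the behavior names are hypothetical:

```python
def dro_contingency(observed_behaviors, target="whining"):
    """Apply a simple DRO rule: the target behavior is placed on extinction
    (never reinforced), while any other behavior is reinforced."""
    outcomes = []
    for behavior in observed_behaviors:
        if behavior == target:
            outcomes.append((behavior, "no reinforcer"))   # extinction of the target
        else:
            outcomes.append((behavior, "reinforcer"))      # reinforcement of an alternative
    return outcomes

session = ["whining", "asking politely", "whining", "playing quietly"]
for behavior, outcome in dro_contingency(session):
    print(f"{behavior:>15} -> {outcome}")
```

In practice the alternative behaviors might well be reinforced on some schedule rather than every time, but the sketch captures the two-part logic described above.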

22
Q

functional communication training (or differential reinforcement of functional communication).

A

the behavior of clearly and appropriately communicating one’s desires is differentially reinforced.

23
Q

Stimulus Control

A

A behavior is said to be under stimulus control when the presence of the discriminative stimulus reliably affects the probability of the behavior (or reliably “evokes” the behavior).

24
Q

stimulus generalization

A

is the tendency for an operant response to be emitted in the presence of a stimulus that is similar to an SD.

In general, the more similar the stimulus, the stronger the response.

25
Q

generalization gradient,

A

The tendency to generalize across different stimuli can be depicted in a generalization gradient, which represents the strength of responding in the presence of stimuli that are similar to the SD and that vary along a continuum.

A flat gradient indicates more generalization, while a steep gradient indicates less generalization.
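As a hypothetical illustration (Python) of steep versus flat gradients, assuming a bell-shaped fall-off in responding around a 1000-Hz training tone; the tone values, the Gaussian shape, and the widths are assumptions made only for this example:

```python
import math

def response_strength(test_hz, trained_hz=1000.0, width=100.0):
    """Responding to a test tone, as a proportion of responding to the trained
    tone, using an assumed bell-shaped (Gaussian) generalization gradient."""
    return math.exp(-((test_hz - trained_hz) ** 2) / (2 * width ** 2))

tones = [800, 900, 1000, 1100, 1200]
steep = [round(response_strength(t, width=60), 2) for t in tones]   # little generalization
flat = [round(response_strength(t, width=250), 2) for t in tones]   # much generalization
print("tones:", tones)
print("steep:", steep)   # responding falls off quickly away from the trained tone
print("flat :", flat)    # responding stays high for similar tones
```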

26
Q

stimulus discrimination

A

As in classical conditioning, the opposite of stimulus generalization in operant conditioning is stimulus discrimination, the tendency for an operant response to be emitted more in the presence of one stimulus than another.

More generalization means less discrimination, and less generalization means more discrimination.

Thus, a steep gradient indicates weak generalization and strong discrimination across stimuli, whereas a flat gradient indicates strong generalization and weak discrimination across stimuli.

27
Q

Discrimination training

A

as applied to operant conditioning, involves the reinforcement of responding in the presence of one stimulus (the SD) and not another stimulus.

The latter is called a discriminative stimulus for extinction (SΔ), which is a stimulus that signals the absence of reinforcement.

28
Q

The ability to discriminate

A

can also be tested using a matching-to-sample procedure in which an animal is first shown a sample stimulus and then is required to select that stimulus out of a group of alternative stimuli.

The extent to which the animal is able to select the correct stimulus is regarded as an indicator of its ability to discriminate between the two stimuli.
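A bare-bones sketch of the procedure's logic (Python); the random "chooser" below is only a stand-in for the animal's selection, and the stimuli are arbitrary:

```python
import random

STIMULI = ("red", "green")

def matching_to_sample_trial():
    """One trial: present a sample, then the comparison stimuli, and record
    whether the selected comparison matches the sample."""
    sample = random.choice(STIMULI)
    comparisons = random.sample(STIMULI, k=len(STIMULI))   # shuffled comparison stimuli
    selection = random.choice(comparisons)                 # stand-in for the animal's choice
    return selection == sample

random.seed(0)
accuracy = sum(matching_to_sample_trial() for _ in range(200)) / 200
print(accuracy)   # about .5 for a random chooser; accuracy well above chance indicates discrimination
```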

29
Q

The Peak Shift Effect

A

Following discrimination training, the peak of a generalization gradient will shift from the SD to a stimulus that is further removed from the SΔ (Hanson, 1959).

This constitutes an exception to the general principle that the strongest response in a generalization gradient occurs in the presence of the original SD.

With discrimination training, the gradient drops off more sharply on the side toward the SΔ, which simply means that there is strong discrimination between the SΔ and the SD.

Before discrimination training, the strongest response occurs to the SD (the 2000-Hz tone). Following discrimination training, the strongest response shifts away from the SD to a stimulus that lies in a direction opposite to the SΔ (in this case, it shifts to a 2200-Hz tone). This shift in the peak of the generalization gradient is what is called the peak shift effect.

One explanation for the peak shift effect is that during discrimination training, subjects respond in terms of the relative, rather than the absolute, values of the stimuli.
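Whatever the explanation, the shift itself can be illustrated numerically. The Python sketch below generates a post-discrimination gradient by subtracting an inhibitory gradient centered on the SΔ from an excitatory gradient centered on the SD; this gradient-interaction picture is offered only as a way to visualize the shift, not as the account given above, and the SΔ frequency, gradient widths, and weighting are all assumptions.

```python
import math

def gauss(x, center, width):
    """Bell-shaped gradient centered on a given stimulus value (arbitrary units)."""
    return math.exp(-((x - center) ** 2) / (2 * width ** 2))

SD_HZ, S_DELTA_HZ = 2000.0, 1800.0   # the SΔ value is chosen only for illustration
tones = range(1600, 2601, 100)

# Net tendency to respond = excitation around the SD minus weighted inhibition around the SΔ.
net = {t: gauss(t, SD_HZ, 200) - 0.6 * gauss(t, S_DELTA_HZ, 200) for t in tones}
peak = max(net, key=net.get)
print(peak)   # 2100 with these numbers: the peak shifts away from the SΔ side of the SD
```

The exact peak depends entirely on the assumed gradients, but it lands on the side of the SD opposite to the SΔ, which is the signature of the peak shift effect.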

30
Q

peak shift effect example

A

Suppose that “Mr. Shallow” categorizes potential dates entirely on the basis of how extraverted versus introverted they are.

Jackie, with whom he had a very boring relationship, was an introvert (an SΔ), while Dana, with whom he had a wonderfully exciting relationship, was an extravert (an SD).

He then moves to a new city and begins scrolling through dating websites to find someone new.

According to the peak shift effect, he will likely seek out a woman who is even more extraverted than Dana.

31
Q

multiple schedule

A

A multiple schedule consists of two or more independent schedules presented in sequence, each resulting in reinforcement and each having a distinctive SD.

Note that a multiple schedule differs from a chained schedule in that a chained schedule requires that all of the component schedules be completed before the sought-after reinforcer is delivered.

On a multiple schedule (as well as a chained schedule), stimulus control is demonstrated when the subject responds differently in the presence of the SDs associated with the different schedules.
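A minimal way to see the difference in when the reinforcer arrives (Python); the component names are arbitrary and the sketch ignores the actual VI timing:

```python
def run_components(components, schedule_type):
    """Step through a sequence of component schedules and note when food is delivered:
    'multiple' pays off after every component, 'chained' only after the last one."""
    events = []
    for i, component in enumerate(components):
        is_last = (i == len(components) - 1)
        if schedule_type == "multiple" or is_last:
            events.append(f"{component} complete -> food")
        else:
            events.append(f"{component} complete -> no food yet")
    return events

components = ["red key (VI 60-sec)", "green key (VI 60-sec)"]
print(run_components(components, "multiple"))   # reinforcement at the end of each component
print(run_components(components, "chained"))    # reinforcement only after the final component
```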

32
Q

Behavioral contrast

A

An interesting phenomenon that can be investigated using multiple schedules is behavioral contrast.

Behavioral contrast occurs when a change in the rate of reinforcement on one component of a multiple schedule produces an opposite change in the rate of response on another component.

In other words, as the rate of reinforcement on one component changes in one direction, the rate of response on the other component changes in the opposite direction.

There are two basic contrast effects: positive and negative.

33
Q

negative contrast effect,

A

In a negative contrast effect, an increase in the rate of reinforcement on one component produces a decrease in the rate of response on the other component.

Suppose, for example, that a pigeon first receives several sessions of exposure to a multiple VI 60-sec VI 60-sec schedule (leaving out the abbreviations):

Red key: peck (VI 60-sec) → Food / Green key: peck (VI 60-sec) → Food / Red key: peck (VI 60-sec) → Food / …

Because both schedules are the same, the pigeon responds equally on both the red key and the green key. Following this, the VI 60-sec component on the red key is changed to VI 30-sec, which provides a higher rate of reinforcement (on average, two reinforcers per minute as opposed to one reinforcer per minute):

Red key: peck (VI 30-sec) → Food / Green key: peck (VI 60-sec) → Food / Red key: peck (VI 30-sec) → Food / …

With more reinforcement now available on the red key, the pigeon will decrease its rate of response on the green key, which is associated with the unchanged VI 60-sec component. Simply put, because the first component in the sequence is now more attractive, the second component seems relatively less attractive.
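One way to put rough numbers on this (Python) is to assume, purely for illustration, that a fixed pool of responding is split between the two keys in proportion to each key's rate of reinforcement; this matching-style rule is an added assumption, not something the card itself states:

```python
def split_responding(reinf_per_min_red, reinf_per_min_green, responses_per_min=100):
    """Allocate a fixed pool of responses in proportion to each key's reinforcement rate."""
    total = reinf_per_min_red + reinf_per_min_green
    return (round(responses_per_min * reinf_per_min_red / total, 1),
            round(responses_per_min * reinf_per_min_green / total, 1))

# Phase 1: multiple VI 60-sec VI 60-sec (about one reinforcer per minute on each key)
print(split_responding(1, 1))   # (50.0, 50.0): equal responding on red and green

# Phase 2: red key changed to VI 30-sec (about two per minute); green key unchanged
print(split_responding(2, 1))   # (66.7, 33.3): responding on the unchanged green key drops
```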

34
Q

positive contrast effect,

A

In a positive contrast effect, a decrease in rate of reinforcement on one component results in an increase in rate of response on the other component.

The situation is analogous to the person whose partner has become less caring and affectionate at home; as a result, they spend more time flirting with people at work. The people at work seem relatively more attractive compared to the dud they have at home.

Positive contrast effects are also evident when the change in one component of the multiple schedule involves not a decrease in the rate of reinforcement but instead the implementation of a punisher, such as a mild electric shock.

As the one alternative suddenly becomes punishing, the remaining alternative, which is still reinforcing, is viewed as even more attractive.

This might explain what happens in some volatile relationships in which couples report strong overall feelings of affection for each other. The intermittent periods of aversiveness seem to heighten the couple’s appreciation of each other during periods of affection.

Such relationships can therefore thrive, given that the positive aspects of the relationship significantly outweigh the negative aspects.

35
Q

Remember that with positive and negative contrast

A

We are concerned with how changing the rate of reinforcement on the first component of a multiple schedule affects the rate of responding on the second component.

The rate of responding will, of course, also change on the first component because the schedule of reinforcement on that component has changed, but that is not surprising.

What is surprising is the change in response rate on the second component, even though the schedule of reinforcement in that component has remained the same.

Thus, it is the change in response rate on the second component that is the focus of concern in behavioral contrast.

36
Q

anticipatory contrast

A

in which the rate of response varies inversely with an upcoming (“anticipated”) change in the rate of reinforcement.

In other words, faced with the impending loss of reinforcement, the pigeons responded all the more vigorously for reinforcement while it was still available.

37
Q

Contrast effect example

A

The occurrence of these contrast effects indicates that behaviors should not be viewed in isolation.

Consequences for behavior in one setting can greatly affect the strength of behavior in another setting.

Consider, for example, a young girl who is increasingly neglected at home, perhaps because her parents are going through a divorce.

She might try to compensate for this circumstance by seeking more attention at school (a positive contrast effect), perhaps to the point of misbehaving.

Although her parents might blame the school for her misbehavior, she is in fact reacting to the lack of reinforcement at home.

Thus, to borrow a concept from humanistic psychology, behavior needs to be viewed in a holistic manner, with the recognition that behavior in one setting can be influenced by contingencies operating in other settings.

38
Q

While discrimination training is an effective way for establishing stimulus control, it has its limitations.

A

For example, during the process of learning to discriminate an SD from an SΔ, the subject will initially make several “mistakes” by responding in the presence of the SΔ.

Because such responses do not result in reinforcement, the subject may become frustrated and display a great deal of emotional behavior.

It would be helpful, therefore, if there were a method of discrimination training that minimized these effects.

39
Q

Errorless discrimination training

A

is a gradual training procedure that minimizes the number of errors (i.e., nonreinforced responses to the SΔ) and reduces many of the adverse effects associated with discrimination training. It involves two aspects:

(1) The SΔ is introduced early in training, soon after the animal has learned to respond appropriately to the SD, and

(2) the SΔ is presented in weak form to begin with and then gradually strengthened.

This type of discrimination training is also likely to produce behavior patterns that are difficult to modify at a later point in time.

For this reason, errorless procedures may be most useful in rote learning of basic facts, such as arithmetic and spelling, in which the substance of what is learned is unlikely to change.

With material that requires greater flexibility, however, such as that typically found in most college-level courses, errorless learning might be a significant impediment.

40
Q

fading.

A

The process of gradually altering the intensity of a stimulus.

For example, one can fade in music by presenting it faintly to begin with and gradually turning up the volume, or fade out music by presenting it loudly to begin with and gradually turning down the volume.
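Putting this together with the errorless procedure described above, a fading schedule for gradually strengthening the SΔ might be sketched as follows (Python); the intensity values and number of steps are arbitrary:

```python
def fading_schedule(start=0.05, end=1.0, steps=10):
    """Intensity values for gradually fading in the SΔ, from a weak initial
    presentation to full strength (arbitrary units)."""
    step_size = (end - start) / (steps - 1)
    return [round(start + i * step_size, 2) for i in range(steps)]

print(fading_schedule())
# [0.05, 0.16, 0.26, 0.37, 0.47, 0.58, 0.68, 0.79, 0.89, 1.0]
```

The same logic run in reverse (from full strength down to zero) describes fading out, as in the music example above.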

41
Q

targeting

A

In targeting, one trains an animal to approach and touch a particular object.

Targeting is commonly used to manage animals in zoos. By simply moving the target stick, zookeepers can lead the animals from one cage to another or position them precisely for medical examinations.

Animals can also be taught to target a point of light from a laser beam, which allows the handler to send the animal to a spot some distance away.
This can be a useful procedure for directing search-and-rescue dogs in disaster areas that are difficult for the handler to traverse.

Research is also underway to determine if dogs are capable of detecting viral infections, including Covid-19.

42
Q

Stimulus control can also be used to eliminate certain types of problem behaviors in animals.

Pryor (1999), for example, describes how she once experienced considerable difficulty in training a dolphin to wear suction cups over its eyes (as part of an intended demonstration of the dolphin’s ability to swim solely by sonar).

A

Although the cups did not hurt, the dolphin refused to wear them and would cleverly sink to the bottom of the pool for several minutes whenever it saw Pryor approaching with the cups.

Initially stumped, Pryor finally hit on the idea of reinforcing the behavior of sinking by giving the dolphin a fish whenever it did so (which, she reports, seemed to greatly surprise the dolphin).

Soon, the dolphin was sinking at high frequency to earn fish, at which point Pryor began to reinforce the behavior only after a cue had been presented.

In short order, the dolphin was sinking only on cue, meaning that the behavior was now under strong stimulus control.

Pryor found that she was able to reintroduce the suction cups and place them on the dolphin without difficulty.

In the absence of the cue for sinking, the dolphin no longer had a tendency to sink to avoid the cups.

43
Q

habit

A

A habit is an operant behavior that is under strong stimulus control and seems to automatically occur in certain settings.