Chapter 10 Flashcards

1
Q

Define and give an example of intermittent reinforcement.

A

Intermittent reinforcement is an arrangement in which a behavior is positively reinforced intermittently rather than every time it occurs. Jan’s problem-solving behavior was not reinforced after each math problem that she solved. Instead, she received reinforcement after a fixed number of problem-solving responses had occurred. On this reinforcement schedule, Jan worked at a very steady rate.

2
Q

Define and give an example of response rate.

A

Response rate refers to the number of instances of a behavior that occur in a given period of time. For example, if Jan solved six math problems in 10 minutes, her response rate was 0.6 problems per minute. In this book, rate and frequency are synonymous unless otherwise indicated. Following common usage in behavior modification, we will especially use the term “rate” when speaking of schedules of reinforcement.

3
Q

Define and give an example of schedule of reinforcement.

A

A schedule of reinforcement is a rule specifying which occurrences of a given behavior, if any, will be reinforced. For example, the rule followed in Jan’s program, that every fourth correctly solved math problem is reinforced, is a schedule of reinforcement (an FR 4 schedule).

4
Q

Define CRF and give an example that is not in this chapter.

A

The simplest schedule of reinforcement is continuous reinforcement (CRF), which is an arrangement in which each instance of a particular response is reinforced. Had Jan received reinforcement for each problem solved, we would say that she was on a CRF schedule. Many behaviors in everyday life are reinforced on a CRF schedule. Each time you turn the tap, your behavior is reinforced by water. Each time you insert and turn your key in the front door of your house or apartment, your behavior is reinforced by the door opening.

5
Q

Describe four advantages of intermittent reinforcement over CRF for maintaining behavior.

A

Intermittent schedules of reinforcement have several advantages over CRF for maintaining behavior:

(a) The reinforcer remains effective longer because satiation takes place more slowly;
(b) behavior that has been reinforced intermittently tends to take longer to extinguish (see Chapter 8);
(c) individuals work more consistently on certain intermittent schedules; and
(d) behavior that has been reinforced intermittently is more likely to persist after being transferred to reinforcers in the natural environment.

6
Q

Explain what an FR schedule is. Describe the details of two examples of FR schedules in everyday life (at least one of which is not in this chapter).

A

In a fixed-ratio (FR) schedule, a reinforcer occurs each time a fixed number of responses of a particular type are emitted. The reinforcement schedules for Jan were FR schedules. Recall that early in her program, she had to complete two math problems per reinforcement, which is abbreviated FR 2. Later she had to solve four problems per reinforcement, abbreviated FR 4. Finally, she had to make 16 correct responses, abbreviated FR 16.

Note that the schedule was increased in steps. If Jan’s responses had been put on FR 16 immediately, without the intervening FR values, her behavior might have deteriorated and appeared as though it were on extinction. This deterioration of responding from increasing an FR schedule too rapidly is called ratio strain. The optimal response requirement differs for different individuals and for different tasks. For example, Jan increased her response rate even when the FR increased to 16; other students might have shown a decrease before reaching FR 16. In general, the higher the ratio at which an individual is expected to perform, the more important it is to approach it gradually through exposure to lower ratios. The optimal ratio value that will maintain a high rate of response without producing ratio strain must be found by trial and error.
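Because the FR rule is mechanical ("reinforce every nth response"), it can be expressed as a short program. Below is a minimal Python sketch, not from the chapter, of an FR counter stepped up from FR 2 to FR 16 as in Jan’s program; the class and function names are our own illustration.

```python
class FixedRatioSchedule:
    """FR schedule: delivers a reinforcer after every `ratio` responses."""

    def __init__(self, ratio):
        self.ratio = ratio
        self.count = 0  # responses emitted since the last reinforcer

    def record_response(self):
        """Register one response; return True if it earns the reinforcer."""
        self.count += 1
        if self.count >= self.ratio:
            self.count = 0  # the counter resets after each reinforcement
            return True
        return False


# Stepping the ratio up gradually (FR 2, then FR 4, then FR 16), as in
# Jan's program, is how ratio strain is avoided.
for ratio in (2, 4, 16):
    schedule = FixedRatioSchedule(ratio)
    outcomes = [schedule.record_response() for _ in range(ratio)]
    assert outcomes[-1] and not any(outcomes[:-1])  # only the nth response pays off
```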

7
Q

What is a free-operant procedure? Give an example.

A

A free-operant procedure is one in which the individual is “free” to respond at various rates in the sense that there are no constraints on successive responses. For example, if Jan had been given a worksheet containing 12 math problems to solve, she could have worked at a rate of one problem per minute, or a rate of three per minute, or at some other rate.

8
Q

What is a discrete-trials procedure? Give an example.

A

In a discrete-trials procedure, the individual is “not free” to respond at whatever rate he or she chooses because the environment places limits on the availability of response opportunities. For example, if a parent told a teenage son, “You can use the family car after you have helped do the dishes following three evening meals,” then that would be a discrete-trials procedure. The teenager cannot do the dishes for three quick meals in the next hour, but has to wait and respond at a maximum rate of doing the dishes once per day. When we talk about the characteristic effects of schedules of reinforcement on response rate in this book, we are referring to free-operant procedures unless otherwise specified.

9
Q

What are three characteristic effects of an FR schedule?

A

When introduced gradually, FR schedules produce a high steady rate until reinforcement occurs, followed by a postreinforcement pause.

The length of the postreinforcement pause depends on the value of the FR—the higher the value, the longer the pause.

FR schedules also produce high resistance to extinction.

10
Q

*What is ratio strain?

A

Ratio strain refers to the deterioration of responding from increasing an FR schedule too rapidly.

11
Q

Explain what a VR schedule is. Describe the details of two examples of VR schedules in everyday life (at least one of which is not in this chapter). Do your examples involve a free-operant procedure or a discrete-trials procedure?

A

With a variable-ratio (VR) schedule, a reinforcer occurs after a certain number of responses of a particular type, and the number of responses required for each reinforcer changes unpredictably from one reinforcer to the next. The number of responses required varies around some mean value, and this value is specified in the designation of that particular VR schedule. Suppose, for example, that over a period of several months, a door-to-door salesperson averages one sale for every 10 houses called on. This does not mean that the salesperson makes a sale at exactly every 10th house. Sometimes a sale might have been made after calling on a total of five houses. Sometimes sales might occur at two houses in a row. And sometimes the salesperson might call on a large number of houses before making a sale. Over several months, however, a mean of 10 house calls is required to produce reinforcement. A VR schedule that requires an average of 10 responses is abbreviated VR 10.

VR, like FR, produces a high steady rate of responding. However, it also produces little or no postreinforcement pause (Schlinger et al., 2008). The salesperson can never predict exactly when a sale will occur and is likely to continue making house calls right after a sale. Three additional differences between the effects of VR and FR schedules are that (a) a VR schedule can be increased somewhat more abruptly than an FR schedule without producing ratio strain, (b) the VR values that can maintain responding are somewhat higher than comparable FR values, and (c) a VR schedule produces higher resistance to extinction than an FR schedule of the same value.
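To make the FR/VR contrast concrete, here is a small Python sketch (ours, not the book’s) of a VR 10 schedule: the required count is redrawn around a mean of 10 after each reinforcer, so no single response is predictably the one that pays off. The uniform draw is just one simple way to vary the requirement.

```python
import random


class VariableRatioSchedule:
    """VR schedule: the required response count varies around `mean_ratio`."""

    def __init__(self, mean_ratio, seed=None):
        self.mean_ratio = mean_ratio
        self.rng = random.Random(seed)
        self.count = 0
        self.required = self._draw()

    def _draw(self):
        # Uniform on 1..(2 * mean - 1), which averages out to
        # `mean_ratio` over many reinforcers.
        return self.rng.randint(1, 2 * self.mean_ratio - 1)

    def record_response(self):
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = self._draw()  # the next requirement is unpredictable
            return True
        return False


# Like the salesperson: about one sale per 10 house calls on average,
# but no particular call is predictable.
vr10 = VariableRatioSchedule(mean_ratio=10, seed=0)
sales = sum(vr10.record_response() for _ in range(10_000))
print(sales)  # roughly 1,000 reinforcers over 10,000 responses
```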

12
Q

*Describe how a VR schedule is similar procedurally to an FR schedule. Describe how it is different procedurally.

A

In both FR and VR, reinforcement occurs after a certain number of responses are emitted. In FR the number is fixed; in VR it varies.

13
Q

*What are three characteristic effects of a VR schedule?

A

(a) a high, steady rate of response; (b) little or no postreinforcement pause; and (c) high resistance to extinction.

14
Q

Describe two examples of how FR or VR might be applied in training programs. (By training program, we refer to any situation in which someone deliberately uses behavior principles to increase and maintain someone else’s behavior, such as parents to influence a child’s behavior, a teacher to influence students’ behavior, a coach to influence athletes’ behavior, an employer to influence employees’ behavior, etc.) Do your examples involve a free-operant or a discrete-trials procedure?

A

Example 1: Jennifer’s parents want her to do her chore of mowing the lawn, so they give her $10 once she has mowed the lawn three times. This is an FR 3 schedule, and a discrete-trials procedure (she can mow at most about once a week, as the lawn grows back).
Example 2: Jake had hand surgery and is relearning how to use his hand. To exercise his fingers, he turns the knob on a gumball machine. On average, 1 in 10 gumballs in the machine is black, and a black gumball earns him a toy. This is a VR 10 schedule, and a free-operant procedure.

15
Q

*What is an FI schedule?

A

A reinforcer is presented following the first instance of a specific response after a fixed period of time.

16
Q

What are two questions to ask when judging whether a behavior is reinforced on an FI schedule? What answers to those questions would indicate that the behavior is reinforced on an FI schedule?

A

When judging whether a behavior is reinforced on an FI schedule, you should ask yourself two questions:

(a) Does reinforcement require only one response after a fixed interval of time?
(b) Does responding during the interval affect anything? If you answer yes to the first question and no to the second question, your example is an FI.
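A minimal Python sketch (our illustration, not the book’s) makes both diagnostic properties explicit: responses emitted during the interval have no effect, and a single response after the interval has elapsed is all that is needed to produce the reinforcer.

```python
import time


class FixedIntervalSchedule:
    """FI schedule: the first response after `interval` seconds is reinforced."""

    def __init__(self, interval_seconds):
        self.interval = interval_seconds
        self.start = time.monotonic()

    def record_response(self):
        # Question (b): responses during the interval affect nothing.
        if time.monotonic() - self.start < self.interval:
            return False
        # Question (a): one response after the interval is all that is required.
        self.start = time.monotonic()  # the next interval begins immediately
        return True
```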

17
Q

Suppose that a professor gives an exam every Friday. The students’ studying behavior would likely resemble the characteristic pattern of an FI schedule in that studying would gradually increase as Friday approaches, and the students would show a break in studying (similar to a lengthy postreinforcement pause) after each exam. But this is not an example of an FI schedule for studying. Explain why.

A

Consider, for example, a college class in which students have a test on the same day of each week. The students’ pattern of studying likely resembles the characteristic pattern of responding on an FI schedule, in that little or no studying occurs immediately after a test and studying increases as the test day draws near. However, consider the preceding two questions. Can the students wait until just before a week has passed, make one study response, and receive a good grade? No, a good grade is contingent on studying during the one-week interval. Does responding before the interval ends affect anything? Yes, it contributes to a good grade. Therefore, this is not an example of FI, although it may resemble one in some ways.

18
Q

*What is a VI schedule?

A

A reinforcer is presented following the first instance of a specific response after an interval of time, and the length of the interval changes unpredictably from one reinforcer to the next.

19
Q

*Explain why simple interval schedules are not often used in training programs.

A

(a) FI generates relatively long pauses at the beginning of the interval;
(b) VI typically generates lower response rates than ratio schedules; and
(c) simple interval schedules require continuous monitoring of behavior.

20
Q

Explain what an FR/LH schedule is. Describe the details of an example from everyday life that is not in this chapter.

A

Suppose that a fitness instructor says to someone exercising, “If you do 30 sit-ups, then you can get a drink of water.” That would be an FR 30 schedule. Now suppose that the fitness instructor says to the person, “If you do 30 sit-ups in 2 minutes, then you can get a drink of water.” That would be an example of an FR 30 schedule with a limited hold of 2 minutes. The addition of a limited hold to a schedule is indicated by writing the abbreviation of the schedule followed by “/LH” and the value of the limited hold. The previous example would be written as FR 30/LH 2 minutes. Because ratio schedules already generate high rates of response, it is not common for a limited hold to be added to them.

21
Q

Explain what an FI/LH schedule is. Describe the details of an example that is not in this chapter. (Hint: Think of behaviors that occur at certain fixed times, such as arriving for meals, plane departures, and cooking.)

A

In the natural environment, a good approximation of an FI/LH schedule is waiting for a bus. Buses usually run on a regular schedule. An individual may arrive at the bus stop early, just before the bus is due, or as it is arriving—it makes no difference, for that person will still catch the bus. So far, this is just like a simple FI schedule. However, the bus will wait only a limited time—perhaps 1 minute. If the individual is not at the bus stop within this limited period of time, the bus goes on and the person must wait for the next one.
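The bus example maps onto a small extension of the FI sketch shown earlier: once the interval elapses, the reinforcer is “set up,” but it remains available only for the limited hold. The following Python sketch is our illustration under that reading, and it simplifies by restarting the interval whenever the hold is missed.

```python
import time


class FixedIntervalLimitedHold:
    """FI/LH: after `interval` seconds the reinforcer is set up, but it
    stays available only for `hold` seconds (like a bus that waits
    about a minute before driving on)."""

    def __init__(self, interval_seconds, hold_seconds):
        self.interval = interval_seconds
        self.hold = hold_seconds
        self.start = time.monotonic()

    def record_response(self):
        elapsed = time.monotonic() - self.start
        if elapsed < self.interval:
            return False  # too early: the bus has not arrived yet
        if elapsed <= self.interval + self.hold:
            self.start = time.monotonic()  # caught the bus; the cycle restarts
            return True
        # Too late: the hold lapsed. For simplicity this sketch restarts
        # the interval here; real buses keep running on their own clock.
        self.start = time.monotonic()
        return False
```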

22
Q

*Describe how an FI/LH schedule is procedurally similar to a simple FI schedule. Describe how it procedurally differs.

A

In both FI and FI/LH, reinforcement is programmed to occur after the first response occurring after a fixed period of time. In FI/LH, however, the reinforcer remains available only for a limited period of time, rather than indefinitely, after it has been “set up.”

23
Q

Explain what a VI/LH schedule is. Describe the details of an example that occurs in everyday life (that is not in this chapter).

A

A good approximation of behavior on a VI/LH schedule occurs when we telephone a friend whose line is busy. As long as the line is busy, we will not get through to our friend no matter how many times we dial, and we have no way of predicting how long the line will be busy. Moreover, shortly after finishing one call, our friend may receive or make another. In either case, if we do not call during one of the limited periods in which the line is free, we miss the reinforcement of talking to our friend and must wait another unpredictable period before we again have an opportunity to gain this particular reinforcer.

24
Q

Describe an example of how VI/LH might be applied in a training program.

A

We’ll explain how a VI/LH schedule works by describing an effective strategy for managing the behavior of kids on a family car trip. It’s based on the timer game, also known as the good behavior game. When the sons of one of the authors were children, family car trips were trying. With Mom and Dad in the front seat and the boys in the back seat, nonstop bickering between the boys seemed to rule the day (“You’re on my side,” “Give me that,” “Don’t touch me,” etc.). After several unpleasant car trips, Mom and Dad decided to try a variation of the timer game. First, they purchased a timer that could be set at values up to 30 minutes and produced a “ding” when the set time ran out. (Today parents could use a cell phone or other electronic device.) At the beginning of the car trip, they announced the rules to the boys:

Here’s the deal. Every time this timer goes “ding,” if you’re playing nicely, you can earn 5 extra minutes for watching late-night TV in the motel room [a powerful reinforcer for the boys in the days before there were DVD players in vehicles, or portable laptop computers]. But if you’re bickering, you lose those 5 minutes. We’ll play the game until we get there.

Thereafter, a parent set the timer to intervals ranging from 1 to 30 minutes for the duration of the trip. Because, on average, the timer was set for 15 minutes, this was a VI 15-minute schedule. Because the boys had to be behaving cooperatively at the instant that a “ding” occurred, the limited hold was zero seconds, making this a VI 15-minute/LH 0-seconds schedule. The results seemed miraculous.
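As a rough Python sketch (ours) of the timer-game contingency: intervals are drawn from 1 to 30 minutes (mean about 15), and because the hold is 0 seconds, the boys must already be behaving well at the moment of the “ding.” The `is_playing_nicely` draw is a hypothetical stand-in for the parents’ judgment.

```python
import random


def timer_game(trip_minutes, seed=None):
    """Simulate the timer game as a VI 15-min/LH 0-s schedule."""
    rng = random.Random(seed)
    tv_minutes, clock = 0, 0.0
    while clock < trip_minutes:
        clock += rng.uniform(1, 30)            # next "ding": 1-30 min, mean ~15
        is_playing_nicely = rng.random() < 0.8  # stand-in for parents' judgment
        if is_playing_nicely:                   # LH of 0 s: must be behaving
            tv_minutes += 5                     # well at the instant of the ding
        # Bickering at the ding forfeits that 5-minute opportunity.
    return tv_minutes


print(timer_game(trip_minutes=480, seed=1))  # an 8-hour trip
```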

25
Q

*For each of the photos in Figure 10.3, identify the schedule of reinforcement that appears to be operating.

A

The waiting-for-luggage photo illustrates a VI/LH schedule. The child-at-the-pegboard photo illustrates an FR schedule. Watching TV illustrates a VI/LH schedule. Taking the clothes out of the dryer illustrates an FI schedule. The rationale for these choices is given in the description below each photo.

26
Q

Explain what an FD schedule is. Describe the details of two examples of FD schedules that occur in everyday life (at least one of which is not in this chapter).

A

In a fixed-duration (FD) schedule, a reinforcer is presented only if a behavior occurs continuously for a fixed period of time. The value of the FD schedule is the amount of time that the behavior must be engaged in continuously before reinforcement occurs (e.g., if it is 1 minute, we call the schedule an FD 1-minute schedule). A number of examples of fixed-duration schedules occur in the natural environment. For instance, a worker who is paid by the hour might be considered to be on an FD schedule.

Melting solder might also be an example of behavior on an FD schedule. To melt the solder, one must hold the tip of the soldering iron on the solder for a continuous fixed period of time. If the tip is removed, the solder cools quickly and the person has to re-apply heat for the same continuous period.
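A minimal Python sketch (ours, not the chapter’s) captures the FD rule: reinforcement requires continuous responding, and any interruption resets the clock, just as lifting the soldering iron lets the solder cool.

```python
class FixedDurationSchedule:
    """FD schedule: behavior must occur continuously for `duration` ticks."""

    def __init__(self, duration_ticks):
        self.duration = duration_ticks
        self.elapsed = 0

    def tick(self, behavior_occurring):
        """Advance one time step; return True when the duration is completed."""
        if not behavior_occurring:
            self.elapsed = 0  # interruption: the clock starts over,
            return False      # like lifting the soldering iron
        self.elapsed += 1
        if self.elapsed >= self.duration:
            self.elapsed = 0  # reinforcer delivered; a new period begins
            return True
        return False


# Holding the iron on the solder for 10 consecutive ticks melts it. A VD
# schedule would simply redraw `duration` around a mean after each reinforcer.
fd = FixedDurationSchedule(duration_ticks=10)
results = [fd.tick(True) for _ in range(10)]
assert results[-1] and not any(results[:-1])
```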

27
Q

*Suppose that each time that you put bread in a toaster and push the lever down, 30 seconds passes before your toast is ready. Is this an example of an FD schedule? Why or why not? Would it be an FD schedule if (a) the catch that keeps the lever down doesn’t work or (b) the timer that releases it doesn’t work? Explain in each case.

A

It is not an FD schedule because there is no behavior that must occur continuously for 30 seconds. Rather, it is a fixed-interval schedule in that, after 30 seconds, the behavior of taking the toast from the toaster will be reinforced (the toast will be ready). It might be an FD schedule if the catch did not work, so that you had to hold the lever down continuously for 30 seconds. It would not be an FD schedule if the catch that keeps the lever down works but the timer that releases it doesn’t. Rather, in this latter case, it would be an FI/LH schedule: after 30 seconds, the response of releasing the lever would pay off, and if you waited too long, your toast would burn.

28
Q

*Explain why FD might not be a very good schedule for reinforcing study behavior.

A

It is difficult to monitor the time spent in productive studying behavior (relative to monitoring, for example, the number of problems completed or the number of pages read).

29
Q

Describe two examples of how FD might be applied in training programs.

A

For example, a physical education teacher might use an FD schedule to strengthen performance of various exercises. The children who participate in the exercise program continuously for the fixed duration could be reinforced by an opportunity to participate in a game of their choice. Any two plausible applications of FD are acceptable.

30
Q

Explain what a VD schedule is. Describe the details of an example of one that occurs in everyday life (that is not in this chapter).

A

In a variable-duration (VD) schedule, a reinforcer is presented only if a behavior occurs continuously for a period of time, and the length of the required period changes unpredictably from one reinforcer to the next. The mean required duration is specified in the designation of the VD schedule; for example, if the mean is 1 minute, the schedule is abbreviated VD 1-minute. An example of a VD schedule might be rubbing two sticks together to produce fire, because the amount of time this takes varies as a function of factors such as the size, shape, and dryness of the sticks. Another example of a VD schedule is waiting for traffic to clear before crossing a busy street.

31
Q

What are concurrent schedules of reinforcement? Describe an example.

A

When each of two or more behaviors is reinforced on different schedules at the same time, the schedules of reinforcement that are in effect are called concurrent schedules of reinforcement. For example, at home during a particular evening, a student might have the choice of watching a TV show, watching an online movie, surfing the Net, texting, doing homework, or talking on the phone.

32
Q

If an individual has an option of engaging in two or more behaviors that are reinforced on different schedules by different reinforcers, what four factors in combination are likely to determine the response that the person will make?

A

Research has indicated that, in addition to the rate of reinforcement, the factors likely to influence one’s choice among concurrently available schedules are

(a) the types of schedules that are operating;
(b) the immediacy of reinforcement;
(c) the magnitude of reinforcement (e.g., a student might choose to study for an exam worth 50% of the final grade over watching a boring TV show); and
(d) the response effort involved in the different options. Attempts have been made to extend or modify the matching law to incorporate these other factors influencing choice.
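For reference, the matching law mentioned above is commonly stated as follows (this is its standard form in the operant literature, not a quotation from this chapter): with two concurrently available schedules, B1 / (B1 + B2) = R1 / (R1 + R2), where B1 and B2 are the rates of the two behaviors and R1 and R2 are the rates of reinforcement they produce. That is, relative response rate tends to match relative reinforcement rate.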

33
Q

*Describe how intermittent reinforcement works against those who are ignorant of its effects. Give an example.

A

The inconsistent use of extinction, resulting in highly persistent undesirable behavior, is a common misuse of intermittent reinforcement. For example, if a parent sometimes ignores a child’s tantrums, and sometimes gives in, tantrums are being reinforced intermittently and will become extremely persistent.

34
Q

Name six schedules of reinforcement commonly used to develop behavior persistence (i.e., the ones described in Table 10.1).

A
  1. Fixed-ratio (FR) schedule
  2. Variable-ratio (VR) schedule
  3. Fixed-interval schedule with a limited hold (FI/LH)
  4. Variable-interval schedule with a limited hold (VI/LH)
  5. Fixed-duration (FD) schedule
  6. Variable-duration (VD) schedule
35
Q

*In general, which schedules tend to produce higher resistance to extinction (RTE), the fixed or the variable schedules (see Table 10.1)?

A

The variable schedules.