Chapter 10 Flashcards
Define and give an example of intermittent reinforcement.
Intermittent reinforcement is an arrangement in which a behavior is positively reinforced intermittently rather than every time it occurs. Jan’s problem-solving behavior was not reinforced after each math problem that she solved. Instead, she received reinforcement after a fixed number of problem-solving responses had occurred. On this reinforcement schedule, Jan worked at a very steady rate.
Define and give an example of response rate.
Response rate refers to the number of instances of a behavior that occur in a given period of time. In this book, rate and frequency are synonymous unless otherwise indicated. Following common usage in behavior modification, we will use the term “rate” especially when speaking of schedules of reinforcement.
Define and give an example of schedule of reinforcement.
A schedule of reinforcement is a rule specifying which occurrences of a given behavior, if any, will be reinforced.
Define CRF and give an example that is not in this chapter.
The simplest schedule of reinforcement is continuous reinforcement (CRF), which is an arrangement in which each instance of a particular response is reinforced. Had Jan received reinforcement for each problem solved, we would say that she was on a CRF schedule. Many behaviors in everyday life are reinforced on a CRF schedule. Each time you turn the tap, your behavior is reinforced by water. Each time you insert and turn your key in the front door of your house or apartment, your behavior is reinforced by the door opening.
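To make the rule concrete, here is a minimal Python sketch (not from the chapter; the function name is an illustrative assumption). It treats CRF as the special case in which every single response is reinforced, which is equivalent to FR 1:

    # A minimal illustrative sketch (not from the chapter): on CRF, every
    # single response is reinforced, i.e., CRF is equivalent to FR 1.
    def crf_respond():
        """Each response on a CRF schedule earns a reinforcer."""
        return True

    # Turning the tap five times: reinforced (by water) every time.
    print([crf_respond() for _ in range(5)])  # [True, True, True, True, True]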
Describe four advantages of intermittent reinforcement over CRF for maintaining behavior.
Intermittent schedules of reinforcement have several advantages over CRF for maintaining behavior:
(a) The reinforcer remains effective longer because satiation takes place more slowly;
(b) behavior that has been reinforced intermittently tends to take longer to extinguish (see Chapter 8);
(c) individuals work more consistently on certain intermittent schedules; and
(d) behavior that has been reinforced intermittently is more likely to persist after being transferred to reinforcers in the natural environment.
Explain what an FR schedule is. Describe the details of two examples of FR schedules in everyday life (at least one of which is not in this chapter).
In a fixed-ratio (FR) schedule, a reinforcer occurs each time a fixed number of responses of a particular type are emitted. The reinforcement schedules for Jan were FR schedules. Recall that early in her program, she had to complete two math problems per reinforcement, which is abbreviated FR 2. Later she had to solve four problems per reinforcement, which is abbreviated FR 4. Finally, she had to make 16 correct responses, abbreviated FR 16. Note that the schedule was increased in steps. If Jan’s responses had been put on FR 16 immediately, without the intervening FR values, her behavior might have deteriorated and appeared as though it were on extinction. This deterioration of responding from increasing an FR schedule too rapidly is called ratio strain. The optimal response requirement differs for different individuals and for different tasks. For example, Jan increased her response rate even when the FR increased to 16. Other students may have shown a decrease before reaching FR 16. In general, the higher the ratio an individual is expected to perform at, the more important it is to approach it gradually through exposure to lower ratios. The optimal ratio value that will maintain a high rate of response without producing ratio strain must be found by trial and error.
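Because an FR schedule is simply the rule “deliver a reinforcer after every nth response,” it can be sketched in a few lines of Python. The sketch below is illustrative rather than from the chapter; the class and variable names are assumptions. It also shows the stepped increase used in Jan’s program (FR 2, FR 4, FR 16):

    # An illustrative sketch (not from the chapter): an FR schedule delivers
    # a reinforcer each time a fixed number of responses has been emitted.
    class FixedRatio:
        def __init__(self, ratio):
            self.ratio = ratio  # e.g., 2 for FR 2, 16 for FR 16
            self.count = 0      # responses since the last reinforcer

        def respond(self):
            """Record one response; return True if it earns a reinforcer."""
            self.count += 1
            if self.count == self.ratio:
                self.count = 0  # the counter resets after each reinforcer
                return True
            return False

    # Stepping the ratio up gradually (FR 2 -> FR 4 -> FR 16), rather than
    # jumping straight to FR 16, is the guard against ratio strain.
    for ratio in (2, 4, 16):
        schedule = FixedRatio(ratio)
        reinforcers = sum(schedule.respond() for _ in range(32))
        print(f"FR {ratio}: {reinforcers} reinforcers in 32 responses")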
What is a free-operant procedure? Give an example.
A free-operant procedure is one in which the individual is “free” to respond at various rates in the sense that there are no constraints on successive responses. For example, if Jan had been given a worksheet containing 12 math problems to solve, she could have worked at a rate of one problem per minute, or a rate of three per minute, or at some other rate.
What is a discrete-trials procedure? Give an example.
In a discrete-trials procedure, the individual is “not free” to respond at whatever rate he or she chooses because the environment places limits on the availability of response opportunities. For example, if a parent told a teenage son, “You can use the family car after you have helped do the dishes following three evening meals,” then that would be a discrete-trials procedure. The teenager cannot do the dishes for three quick meals in the next hour, but has to wait and respond at a maximum rate of doing the dishes once per day. When we talk about the characteristic effects of schedules of reinforcement on response rate in this book, we are referring to free-operant procedures unless otherwise specified.
What are three characteristic effects of an FR schedule?
When introduced gradually, FR schedules produce a high steady rate until reinforcement occurs, followed by a postreinforcement pause.
The length of the postreinforcement pause depends on the value of the FR—the higher the value, the longer the pause.
FR schedules also produce high resistance to extinction.
*What is ratio strain?
Ratio strain refers to the deterioration of responding from increasing an FR schedule too rapidly.
Explain what a VR schedule is. Describe the details of two examples of VR schedules in everyday life (at least one of which is not in this chapter). Do your examples involve a free-operant procedure or a discrete-trials procedure?
With a variable-ratio (VR) schedule, a reinforcer occurs after a certain number of responses of a particular type, and the number of responses required for each reinforcer changes unpredictably from one reinforcer to the next. The number of responses required for each reinforcement in a VR schedule varies around some mean value, and this value is specified in the designation of that particular VR schedule. Suppose, for example, that over a period of several months, a door-to-door salesperson averages one sale for every 10 houses called on. This does not mean that the salesperson makes a sale at exactly every 10th house. Sometimes a sale might have been made after calling on a total of five houses. Sometimes sales might occur at two houses in a row. And sometimes the salesperson might call on a large number of houses before making a sale. Over several months, however, a mean of 10 house calls is required to produce reinforcement. A VR schedule that requires an average of 10 responses is abbreviated VR 10. VR, like FR, produces a high steady rate of responding. However, it also produces no or a minimal postreinforcement pause (Schlinger et al., 2008). The salesperson can never predict exactly when a sale will occur and is likely to continue making house calls right after a sale. Three additional differences between the effects of VR and FR schedules are that (a) a VR schedule can be increased somewhat more abruptly than an FR schedule without producing ratio strain; (b) the VR values that can maintain responding are somewhat higher than FR values; and (c) VR produces a higher resistance to extinction than an FR schedule of the same value.
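A VR schedule can be sketched in Python by drawing a new response requirement after each reinforcer. This sketch is illustrative, not from the chapter; the uniform draw around the mean is one simple assumption, and real schedules may use other distributions:

    import random

    # An illustrative sketch (not from the chapter): on a VR schedule, the
    # response requirement changes unpredictably after each reinforcer but
    # averages out to the mean value.
    class VariableRatio:
        def __init__(self, mean_ratio):
            self.mean = mean_ratio
            self.count = 0
            self.requirement = self._draw()

        def _draw(self):
            # Varies from 1 to (2 * mean - 1), so the long-run mean is mean_ratio.
            return random.randint(1, 2 * self.mean - 1)

        def respond(self):
            """Record one response; return True if it earns a reinforcer."""
            self.count += 1
            if self.count >= self.requirement:
                self.count = 0
                self.requirement = self._draw()  # next requirement is unpredictable
                return True
            return False

    # The door-to-door salesperson on VR 10: about 1 sale per 10 calls on average.
    schedule = VariableRatio(10)
    sales = sum(schedule.respond() for _ in range(1000))
    print(f"{sales} sales in 1000 house calls (expected around 100)")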
*Describe how a VR schedule is similar procedurally to an FR schedule. Describe how it is different procedurally.
In both FR and VR, reinforcement occurs after a certain number of responses are emitted. In FR the number is fixed; in VR it varies.
*What are three characteristic effects of a VR schedule?
(a) a high, steady rate of response; (b) no or a minimal postreinforcement pause; and (c) a high resistance to extinction.
Describe two examples of how FR or VR might be applied in training programs. (By training program, we refer to any situation in which someone deliberately uses behavior principles to increase and maintain someone else’s behavior, such as parents to influence a child’s behavior, a teacher to influence students’ behavior, a coach to influence athletes’ behavior, an employer to influence employees’ behavior, etc.) Do your examples involve a free-operant or a discrete-trials procedure?
Example 1: Jennifer’s parents want her to do her chore of mowing the lawn, so they give her $10 once she has mowed the lawn 3 times. This is an FR 3 schedule and a discrete-trials procedure.
Example 2: Jake had hand surgery and is learning to use his hand again. To help him regain the use of his fingers, he turns the knob on a gumball machine. On average, 1 in 10 gumballs in the machine is black, and a black gumball earns him a toy. This is a VR 10 schedule and a free-operant procedure.
*What is an FI schedule?
In a fixed-interval (FI) schedule, a reinforcer is presented following the first instance of a specific response after a fixed period of time has elapsed.
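As a rough Python sketch of this rule (illustrative, not from the chapter; the class name and the simple clock are assumptions), the first response after the fixed interval has elapsed since the previous reinforcer is reinforced, and earlier responses earn nothing:

    # An illustrative sketch (not from the chapter): on an FI schedule, only
    # the first response after the fixed interval has elapsed is reinforced.
    class FixedInterval:
        def __init__(self, interval):
            self.interval = interval    # e.g., 60 for FI 60 s
            self.last_reinforcer = 0.0  # time of the previous reinforcer

        def respond(self, now):
            """Record a response at time `now`; return True if reinforced."""
            if now - self.last_reinforcer >= self.interval:
                self.last_reinforcer = now  # the interval restarts here
                return True
            return False  # responses before the interval ends earn nothing

    schedule = FixedInterval(60)
    for t in (30, 59, 61, 90, 125):
        print(t, schedule.respond(t))
    # 30 False, 59 False, 61 True, 90 False, 125 True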