Chapter 8 Flashcards
Define and give an example of intermittent reinforcement.
Intermittent reinforcement: an arrangement in which a behaviour is positively reinforced only occasionally rather than every time it occurs.
Example: Jan’s problem-solving behaviour being reinforced only occasionally, rather than after each math problem she solves
Define and give an example of response rate
The number of instances of a behaviour that occur in a given period of time
Example: Jan solves 5 problems in 20 minutes
Define and give an example of schedule of reinforcement
A rule specifying which occurrences of a given behaviour, if any, will be reinforced
Example: you will get water every time you turn on your tap. This is specifically an example of a continuous schedule of reinforcement; a schedule can also be intermittent or, if the behaviour is never reinforced, extinction.
Define CRF and give an example that is not in this chapter
CRF (continuous reinforcement): the simplest schedule of reinforcement, in which each instance of a particular response is reinforced. Example: every time you turn on a light switch, the lights turn on.
Describe four advantages of intermittent reinforcement over CRF for maintaining behaviour
- Reinforcer remains effective longer because satiation takes place more slowly
- Behaviour that has been reinforced intermittently tends to take longer to extinguish
- Individuals work more consistently on certain intermittent schedules
- Behaviour that has been reinforced intermittently is more likely to persist after being transferred to reinforcers in the natural environment
Explain what an FR Schedule is. Give 2 examples of FR schedule in everyday life
Fixed ratio (FR) schedule: a reinforcer occurs each time a fixed number of responses of a particular type are emitted. If you are trying to improve behaviour, you can gradually increase the ratio, rewarding as the person’s behaviour improves.
Example 1: getting rewarded with a gummy bear after you read one page from your textbook.
Example 2 (from textbook): Jan gets rewarded after completing 2 math problems without exhibiting inattentive behaviour. Once she proves she can do that, it moves up to 4 problems, then 16, before she gets a reward.
Example 3: getting paid by the number of vegetables picked.
Example 4: having to do 20 push-ups in order to get a water break at football practice.
Example 5: each time a team member does 10 push-ups, the coach gives him a pat on the back.
Example 6: each time a team member makes a 3-point shot, the rest of the team cheers for him.
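A schedule of reinforcement is really just a rule that decides, response by response, whether a reinforcer is delivered. A minimal sketch of an FR rule in Python (the `fr_schedule` function and its names are illustrative, not from the chapter):

```python
# Minimal sketch of a fixed-ratio (FR) schedule: every `ratio`-th
# response of the target type produces a reinforcer.
def fr_schedule(ratio):
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == ratio:
            count = 0
            return True   # reinforcer delivered
        return False      # no reinforcer for this response
    return respond

# FR 10: a pat on the back after every 10th push-up
pushup = fr_schedule(10)
results = [pushup() for _ in range(20)]
# only the 10th and 20th responses are reinforced
```

Note how responses 1 through 9 produce nothing; only completing the full ratio earns the reinforcer, which is what distinguishes FR from continuous reinforcement (which would be FR 1).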
What is a free operant procedure? Give example
One in which the individual is ‘free’ to respond at various rates, in the sense that there are no constraints on successive responses.
Example: if Jan is given a worksheet containing 12 math problems, she can complete them at whatever rate she desires
What is discrete trials procedure? Give an example
The individual is not free to respond at whatever rate he or she chooses because the environment places limits on the availability of response opportunities.
Example (from text): a teenager is told he can take the car out 3 different times after he washes the dishes that day. In this situation the teenager cannot do the dishes for three quick meals in an hour; he has to wait, and can respond at a maximum rate of doing the dishes once per day.
What are three characteristic effects of an FR schedule
- FR schedules produce a consistent response rate
- The rate of responding increases with higher FR schedules.
- Post-reinforcement pause: following reinforcement, responding will temporarily stop. After the pause, responding resumes at pre-reinforcement levels.
Explain what a VR schedule is. Illustrate with two examples of VR schedules in everyday life (one not from textbook). Do your examples involve a free operant procedure or a discrete trials procedure?
Variable ratio (VR) schedule: a reinforcer occurs after a certain number of a particular response, and the number of responses required for each reinforcer changes unpredictably from one reinforcer to the next.
Example 1: a door-to-door salesperson: although they may make a sale on average once every ten houses, they could make sales at two houses in a row or go twenty houses without one. This is a free operant procedure, as the salesperson is free to work at their own pace.
Example 2: a hockey player shooting on net: they might score on every shot, or might not score in ten shots, and every ratio in between. This is also a free operant procedure.
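The difference from FR is that the required count itself is unpredictable. A sketch of a VR rule, assuming for illustration that each requirement is drawn uniformly around the mean (real VR schedules can generate requirements differently):

```python
import random

# Sketch of a variable-ratio (VR) schedule: the number of responses
# required for each reinforcer varies unpredictably around a mean.
# Drawing requirements uniformly from 1..(2*mean - 1) is an
# illustrative assumption, not the chapter's procedure.
def vr_schedule(mean, rng=random.Random(0)):
    state = {"count": 0, "required": rng.randint(1, 2 * mean - 1)}
    def respond():
        state["count"] += 1
        if state["count"] >= state["required"]:
            state["count"] = 0
            state["required"] = rng.randint(1, 2 * mean - 1)  # new, unpredictable requirement
            return True
        return False
    return respond

# VR 10: a sale on average once every ten houses, but unpredictably
door_to_door = vr_schedule(10)
sales = sum(door_to_door() for _ in range(1000))  # roughly 100 sales over 1000 houses
```

Because the next reinforcer could come at any moment, VR schedules tend to produce high, steady response rates, which matches the salesperson continuing from house to house.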
Explain what a FI/LH schedule is, and illustrate with an example that is not in this chapter.
FI/LH: a fixed interval with a limited hold: a fixed interval schedule is one in which a reinforcer is presented following the first instance of a specific response after a fixed period of time. A limited hold is a deadline for meeting the response requirement of a schedule of reinforcement. In other words it’s a fixed interval schedule with a time limit.
Example: waiting for a bus. Buses usually run on a regular schedule; if a person arrives early, or just as the bus is arriving, they will catch it, but if they arrive late they have missed the deadline and will miss the bus. (from textbook)
Explain what a VI/LH schedule is. Illustrate with two examples from everyday life (at least one of which is not in this chapter)
VI/LH: Variable interval limited hold
Variable interval schedule: one in which a reinforcer is presented following the first instance of a specific response after an interval of time, and the length of the interval changes unpredictably from one reinforcer to the next
Example 1: if you check your email approximately every 25 minutes, you can expect to be rewarded with a new email; the interval may vary from trial to trial
Variable interval schedule with limited hold: a schedule in which you are reinforced if you are doing the desired behaviour when the hold limit is reached.
In other words, you don’t know when the reward will be available, and thus must try the desired behaviour almost continuously if you want a chance of receiving the reward. If you wait too little or too long to carry out the behaviour, you may miss the limited period of time in which the reward is available.
Example from text: the timer game. A timer is set to go off on average once every fifteen minutes, but it can go off anywhere from 1 to 30 minutes after the previous ding. Parents installed the timer in their car for a car trip and told the kids that if they were fighting when the ding went off, they would lose 5 minutes of TV watching when they got to the destination, and would gain 5 minutes if they were cooperating.
Example 2 (from text): trying to call a friend whose line is busy. The period when the line is free is the limited hold. Because we don’t know when or for how long the line will be free, we must try calling repeatedly if we are to have any hope of getting through.
Example 3: trying to get your hands on the “it toy” at Christmas (think a newly released Wii or equivalent). Those toys were scarce around the city, and if you wanted one you had to constantly check stores to see if they had any in, because the stores often didn’t know until the toys arrived. There was no way to predict when the reward would be available, so you had to just keep trying.
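The timer-game contingency above can be sketched as a small simulation (times in abstract “minutes”; the interval range, trip length, and cooperation fractions are illustrative assumptions):

```python
import random

# Sketch of a VI/LH contingency like the timer game: the "ding"
# occurs at unpredictable times (variable interval), and behaviour
# is reinforced only if it is happening at that exact moment
# (a very short limited hold).
rng = random.Random(1)

def simulate(minutes, cooperating_fraction):
    """Count reinforcers earned when the child cooperates during a
    randomly chosen `cooperating_fraction` of the trip's minutes."""
    cooperating = set(rng.sample(range(minutes),
                                 int(minutes * cooperating_fraction)))
    t, reinforcers = 0, 0
    while True:
        t += rng.randint(1, 30)      # next ding: 1-30 min later, mean ~15
        if t >= minutes:
            break
        if t in cooperating:         # reinforced only if behaving at the ding
            reinforcers += 1
    return reinforcers

# Near-continuous cooperation earns more reinforcers than occasional
always = simulate(600, 0.95)
rarely = simulate(600, 0.10)
```

This illustrates why VI/LH sustains near-continuous behaviour: since the ding cannot be predicted, only behaving most of the time reliably earns the reward.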
Give two examples of how VI/LH might be applied in training programs
- A variation of the timer game (mentioned in the previous question) can be used to get consistently good behaviour from children in classrooms, on car trips, or during any longer-lasting event
- Because VI/LH is a great schedule for maintaining near-continuous behaviour, it could be applied to any such program. You could give a person trying to quit smoking a beeper; if they are not smoking when the beeper goes off, they get rewarded. This is another modification of the timer game.
Explain what an FD schedule is. Illustrate with 2 examples of FD schedule that occur in everyday life (at least one of which is not in this chapter)
Fixed duration (FD) schedule: one in which a reinforcer is presented only if a behaviour occurs continuously for a fixed period of time.
Example 1 (from text): being paid by the hour.
Example 2 (from text): melting metal. Heat must be applied continuously for a certain amount of time if the metal is to melt; stopping at any point will allow it to cool.
Example 3: driving. Your foot has to be on the pedal for the duration of the time you want to be moving; if you remove it at any point you will stop.
What are concurrent schedules of reinforcement? Give an example.
Concurrent schedules of reinforcement: the schedules of reinforcement that are in effect when each of two or more behaviours is reinforced on different schedules at the same time.
Example: watching TV while talking on the phone and surfing the net. You could be rewarded for any of these behaviours at any point in time; thus the three reinforcement schedules overlap and are concurrent.