Schedules of Reinforcement Flashcards

1
Q

Now that we have discussed
reinforcement . . .
* It is time to discuss how reinforcements can and
should be delivered
* If you were going to reinforce your puppy for going
to the bathroom outside, how would you do it?
* Would you give him a doggie treat every time?
Some of the time?
* Would you keep doing it the same way or would
you change your method as you go along?

A

Now that we have discussed
reinforcement . . .
* It is time to discuss how reinforcements can and
should be delivered
* If you were going to reinforce your puppy for going
to the bathroom outside, how would you do it?
* Would you give him a doggie treat every time?
Some of the time?
* Would you keep doing it the same way or would
you change your method as you go along?

2
Q

Schedules of reinforcement

A
  • A schedule of reinforcement is the response
    requirement that must be met in order to obtain
    reinforcement.
  • Each particular kind of reinforcement schedule
    tends to produce a particular pattern and rate of
    performance
  • In other words, it is what you have to do to get the
    reward!
  • Example: Does a dog have to roll over just once to get a
    reward, or does he have to roll over more than once
    before he’s given his reward?
3
Q

Continuous

A
  • Continuous – A continuous reinforcement schedule
    (CRF) is one in which each specified response is
    reinforced
  • Examples: every time the dog rolls over he gets a treat;
    every time a child hangs up her coat she gets praised
  • Useful for strengthening newly learned behaviors or
    when using shaping procedures to train a behavior.
  • Leads to rapid increases in the rate of the behavior
    (begins to occur very frequently).
  • Not very common in a natural environment.
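The CRF rule is simple enough to state as code. A minimal Python sketch (the function name and list-of-responses representation are illustrative, not from the deck):

```python
def continuous_reinforcement(responses):
    """CRF: every specified response is reinforced (equivalent to FR1)."""
    return [True for _ in responses]

# Every roll-over earns a treat
print(continuous_reinforcement(["roll over", "roll over", "roll over"]))
# → [True, True, True]
```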
4
Q
  • Intermittent
A
  • Intermittent
  • An intermittent reinforcement schedule is one in which
    only some responses are reinforced (not every
    response)
  • Example: every third time the dog rolls over he gets
    reinforced.
  • Useful for maintaining behaviors that are already
    established
  • They can be based on the number of responses made
    (ratio) or the time between reinforcement (interval)
  • They can also be fixed or variable.
5
Q

Types of intermittent schedules

A
  • Ratio Schedules – reinforcement given after a
    number of nonreinforced responses
  • Fixed Ratio
  • Variable Ratio
  • Interval Schedules – reinforcement given for a
    response that occurs after a certain amount of
    time has passed
  • Fixed Interval
  • Variable Interval
6
Q
  • Fixed ratio schedule (FR) -
A
  • Fixed ratio schedule (FR) - reinforcement is given after a
    fixed number of nonreinforced responses (predictable)
  • Examples:
  • FR4 schedule - a salesperson receives a bonus after
    every 4 sales
  • FR1 schedule - take a break after reading a chapter in
    the text
  • FR50 schedule - a rat receives a food pellet after every
    50 bar presses.
  • “piecework” - paid by number of pieces sewn together
  • Schedules can be dense (e.g., FR5) or lean (e.g., FR50)
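The FR rule is just a counter over responses. A minimal Python sketch (names are hypothetical, not from the deck):

```python
def fixed_ratio(n, num_responses):
    """FR-n: every n-th response is reinforced (predictable)."""
    return [(i + 1) % n == 0 for i in range(num_responses)]

# FR4: a salesperson's bonus arrives after every 4 sales
print(fixed_ratio(4, 8))
# → [False, False, False, True, False, False, False, True]
```

Note that `fixed_ratio(1, ...)` reinforces every response, which is exactly the continuous (CRF) schedule.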
7
Q

Fixed Ratio

A

Fixed Ratio
* Characteristic pattern
* Short pause following each reinforcer
* Higher ratio requirements produce longer pauses after
reinforcement
* e.g., FR50 has longer break before responding again than
FR25
* Can stretch the reinforcement ratio (e.g., FR1, FR2, FR4,
FR6, FR10)
* Ratio strain – when the requirement increases too quickly,
behavior becomes erratic or disrupted
* Movement from “dense” to “lean” schedules should be done gradually.

8
Q

Variable ratio

A
  • Variable Ratio (VR): Reinforcer given after a variable
    number of nonreinforced responses (less predictable)
  • VR10 schedule, on average every 10 responses are
    reinforced but number of responses might vary
    between 1 and 20
  • Examples
  • VR6 schedule - a gambling machine pays off every 6
    spins on average, but payoff trial cannot be predicted
  • VR50 schedule - a food pellet is dispensed on average
    every 50 bar-presses, but exact trial cannot be
    predicted
  • Salesperson working on commission
  • Characteristic pattern:
  • High and steady rate of response
  • Little or no postreinforcer pausing (especially when
    minimum requirement is low)
  • Behaviors on this type of schedule tend to be very
    persistent
  • This includes unwanted behaviors like begging,
    gambling
  • “Stretching the ratio” (or “schedule thinning”) means
    starting out with a very dense, rich reinforcement
    schedule and gradually decreasing the amount of
    reinforcement
  • The spouse, gambler, or child must work harder
    and harder to get the reinforcer
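The VR rule differs from FR only in that the requirement is redrawn after each reinforcer. A sketch in Python, assuming the per-reinforcer requirement is drawn uniformly from 1 to 2n−1 so it averages n (one of several ways to generate a VR schedule; the names are illustrative):

```python
import random

def variable_ratio(mean_n, num_responses, rng=None):
    """VR-n: reinforce after a varying number of responses, averaging n."""
    rng = rng or random.Random()
    outcomes, count = [], 0
    requirement = rng.randint(1, 2 * mean_n - 1)
    for _ in range(num_responses):
        count += 1
        if count >= requirement:          # requirement met: reinforce
            outcomes.append(True)
            count = 0
            requirement = rng.randint(1, 2 * mean_n - 1)  # redraw
        else:
            outcomes.append(False)
    return outcomes
```

On a VR10 schedule, 1000 responses earn roughly 100 reinforcers, but the trial on which each one arrives cannot be predicted.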
9
Q

Fixed interval

A
  • Fixed Interval (FI): Reinforcement obtained on first
    response after a fixed, predictable period of time
  • Example
  • FI 2min – a rat receives food on the first lever press
    following a 2-minute interval
  • FI 75min – glancing at the clock during class: after the
    75-minute interval, you are rewarded by being
    allowed to leave.
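In code, an FI schedule just checks whether the fixed interval has elapsed since the last reinforcer. A sketch (timestamps in arbitrary units; restarting the clock at reinforcement rather than at availability is a simplification, and the names are illustrative):

```python
def fixed_interval(interval, response_times):
    """FI: the first response after each fixed interval is reinforced."""
    outcomes, available_at = [], interval
    for t in sorted(response_times):
        if t >= available_at:             # interval has elapsed
            outcomes.append(True)
            available_at = t + interval   # clock restarts at reinforcement
        else:
            outcomes.append(False)
    return outcomes

# FI 2min: lever presses at minutes 1, 2.5, 3, and 5
print(fixed_interval(2, [1, 2.5, 3, 5]))  # → [False, True, False, True]
```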
10
Q

FI

A
  • Characteristic Pattern:
  • “scallop pattern” – FI schedules produce an
    upwardly curved rate of responding with increased
    rate of responding as the interval nears its end
  • Example: study more and more as a test
    approaches.
  • Noticeable post-reinforcement pause
  • Example: don’t study much after a test has just
    occurred
11
Q

Variable interval

A
  • Variable Interval (VI): Reinforcer given for the first response
    after a varying, unpredictable amount of time
  • VI 30 sec schedule- on average the first response after every
    30 seconds is reinforced but the time of reinforcement
    might vary between 1 sec & 1 min
  • Examples
  • VI 2min - a food pellet is dispensed on the first bar-press
    following a 2 minute interval (on average) but exact time
    bar-press cannot be predicted
  • VI 15min – Hilary’s boyfriend, Michael, gets out of school
    and turns on his phone some time between 3:00 and 3:30
    (the average is after 15 minutes) – the “reward” of his
    answering his phone puts her calling behavior on a VI
    schedule, so she calls every few minutes until he answers
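A VI schedule is the same availability check with the interval redrawn after each reinforcer. A sketch, assuming intervals drawn uniformly from 0 to twice the mean so they average `mean_interval` (one common way to produce the stated average; names are illustrative):

```python
import random

def variable_interval(mean_interval, response_times, rng=None):
    """VI: first response after a varying interval (averaging mean_interval)."""
    rng = rng or random.Random()
    outcomes = []
    available_at = rng.uniform(0, 2 * mean_interval)
    for t in sorted(response_times):
        if t >= available_at:
            outcomes.append(True)
            available_at = t + rng.uniform(0, 2 * mean_interval)  # redraw
        else:
            outcomes.append(False)
    return outcomes
```

Hilary's VI 15min calling pattern would correspond to something like `variable_interval(15, times_of_her_calls)`: only the first call after Michael's phone comes on is "reinforced".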
12
Q

VI

A
  • Characteristic Pattern:
  • Moderate steady rate of response
  • little or no post-reinforcement pause
  • Example: Presses of the “redial” button on the
    telephone are sustained at a steady rate when you
    are trying to reach your parents and get a “busy”
    signal on the other end of the line
13
Q

Schedules of Reinforcement

A
  • Fixed-ratio
  • FIXED NUMBER of
    responses
  • Variable-ratio
  • VARYING NUMBER of
    responses
  • Usually an average
14
Q

Schedules of Reinforcement

A
  • Fixed-interval
  • FIXED TIME has passed since last reward
  • Variable-interval
  • VARYING AMOUNTS OF TIME has passed
    since last reward
15
Q

F-R: Fixed Number
V-R: Varying Number
F-I: Fixed Time
V-I: Varying Time

A

F-R: Fixed Number
V-R: Varying Number
F-I: Fixed Time
V-I: Varying Time

16
Q
  • Students’ visits to the university library show a decided increase
    in rate as the time of final examinations approaches.
    (Fixed Interval schedules - produce an accelerated rate of response as the
    time of reinforcement approaches FI-16 week schedule)
  • Every time you put money in the vending machine you receive
    your candy bar.
  • (Fixed Ratio 1 - schedule — same as continuous reinforcement!!)
  • Fred has a boss who checks on his work periodically (usually
    roughly every 2 hours). Because Fred doesn’t know exactly when
    the next ‘check-up’ might come, he generally works hard at all
    times in order to be ready.
  • (Variable Interval 2-hour schedule)
  • You have to email your friend Bob about 3 times before he’ll
    email you back. After your third email on average, though, he
    usually responds.
  • (Variable Ratio 3 schedule)
A
  • Students’ visits to the university library show a decided increase
    in rate as the time of final examinations approaches.
    (Fixed Interval schedules - produce an accelerated rate of response as the
    time of reinforcement approaches FI-16 week schedule)
  • Every time you put money in the vending machine you receive
    your candy bar.
  • (Fixed Ratio 1 - schedule — same as continuous reinforcement!!)
  • Fred has a boss who checks on his work periodically (usually
    roughly every 2 hours). Because Fred doesn’t know exactly when
    the next ‘check-up’ might come, he generally works hard at all
    times in order to be ready.
  • (Variable Interval 2-hour schedule)
  • You have to email your friend Bob about 3 times before he’ll
    email you back. After your third email on average, though, he
    usually responds.
  • (Variable Ratio 3 schedule)
17
Q

Limited Hold (/LH)

A

A deadline built into the schedule
* Once a reinforcer becomes available, a response
produces it only for a finite time afterward.
– FI/LH
– VI/LH

18
Q

Limited Hold

A
  • Short limited holds – similar results to ratio
    schedules
  • For small FIs, FI/LH produces results similar to FR
    schedules
  • Variable Interval, Limited Hold (VI/LH) – similar results
    to VR schedules
  • Used when you want ratio-like behavior but are unable
    to count each instance of the behavior
19
Q

Duration Schedule

A
  • Reinforcement occurs after the behavior has been
    engaged in for a continuous period of time.
  • Fixed Duration (FD) – the period is fixed
  • Variable Duration (VD) – the period varies unpredictably
  • Used only when the target behavior can be measured
    continuously
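A duration schedule only needs to check how long each bout of behavior was sustained. A minimal FD sketch, assuming the continuously measured behavior is logged as (start, end) intervals (an illustrative representation, not from the deck):

```python
def fixed_duration(period, behavior_bouts):
    """FD: reinforce each bout in which the behavior was engaged in
    continuously for at least `period` time units."""
    return [(end - start) >= period for start, end in behavior_bouts]

# FD 5: only the second bout (10 to 16) lasts long enough
print(fixed_duration(5, [(0, 3), (10, 16)]))  # → [False, True]
```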