Schedules of Reinforcement and Choice Behaviours Flashcards
schedules of reinforcement
a rule indicating what has to be done for the reinforcer to be delivered, i.e., which occurrence of the instrumental response is followed by the reinforcer
- reinforcer delivery can depend on
1. presence of certain stimuli
2. passage of time
3. number of responses
4. etc.
- produce predictable patterns of behavior
- influence how instrumental responses are learned and maintained by reinforcement
Why are schedules of reinforcement important?
determine:
- rate of instrumental behavior
- pattern of instrumental behavior
- persistence of instrumental behavior
schedules of reinforcement: schedule effects
- highly relevant to the motivation of behavior
- whether a person is industrious or lazy has little to do with personality
- it has more to do with the reinforcement schedule in effect
in the real world, instrumental responses _____ get reinforced each time they occur.
what is the name of this concept?
rarely
intermittent schedules of reinforcement
simple schedule of reinforcement
- a single factor determines which occurrence of the instrumental response is reinforced
- e.g. how many responses have occurred
- e.g. how much time has passed before the target response can be reinforced
schedules of reinforcement: ratio
- reinforcement depends only on the number of responses the subject performs; time is irrelevant
- the reinforcer is delivered each time the set number of responses is reached
ratio schedules: CRF (continuous reinforcement)
- each response results in delivery of the reinforcer
- often part of contingency management programs in drug-addiction rehabilitation
- e.g. clean urine sample = money reward
- e.g. entering the correct ATM PIN lets you withdraw cash
- this is the only schedule where reinforcement is NOT intermittent
ratio schedules: partial/intermittent
- responding is reinforced only some of the time
- e.g. you enter the correct ATM PIN but receive an "out of order" message
cumulative record
- a way of representing how a response is repeated over time
- shows the total (cumulative) number of responses that have occurred up to a particular point in time (plotted in the sketch below)
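A cumulative record is easy to build by hand or in code. The sketch below uses made-up response timestamps (a hypothetical session, not data from the deck) just to show how the curve works: it steps up by one at every response, so a steep slope means fast responding and a flat stretch means a pause.

```python
# Hypothetical sketch: plotting a cumulative record from made-up response times.
# Steeper slope = faster responding; flat stretches = pauses.
import matplotlib.pyplot as plt

# assumed response timestamps (seconds into the session) - not real data
response_times = [2, 3, 4, 5, 9, 10, 11, 12, 20, 21, 22, 30]

# the cumulative count simply goes 1, 2, 3, ... at each response
cumulative_counts = list(range(1, len(response_times) + 1))

plt.step(response_times, cumulative_counts, where="post")
plt.xlabel("time in session (s)")
plt.ylabel("cumulative number of responses")
plt.title("cumulative record")
plt.show()
```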
ratio run
high and steady rate of responding that completes each ratio requirement
ratio strain
- if the ratio requirement is suddenly increased (e.g., from FR 120 to FR 500), the animal is likely to pause periodically before completing the ratio requirement
- in extreme cases, ratio strain may be so great that the animal stops responding altogether
avoiding ratio strain
be careful not to raise the ratio requirement too quickly when approaching the desired FR response requirement
what is the best FR ratio when strengthening a new response?
CRF - FR1
disadvantages of FR1
- satiation and reduced effort, because the response requirement is so easy
- time- and resource-consuming, since a reinforcer must be delivered for every response
what is the best approach regarding fixed ratio schedules in order to learn a B?
- move from a low ratio requirement (a dense schedule) to a high ratio requirement (a lean schedule)
- this should be done gradually to avoid ratio strain or burnout (see the sketch below)
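As a rough illustration only (the step sizes and the "raise by at most ~50% per step" rule are assumptions, not values from the deck), thinning a schedule can be sketched as raising the requirement in small jumps until the target ratio is reached:

```python
# Hypothetical sketch: thinning a ratio schedule from dense (CRF / FR 1)
# to lean (FR 50) in gradual steps to avoid ratio strain.
target = 50
requirement = 1
steps = [requirement]
while requirement < target:
    # raise the requirement by at most ~50% per step (assumed rule of thumb)
    requirement = min(target, max(requirement + 1, int(requirement * 1.5)))
    steps.append(requirement)
print(steps)  # [1, 2, 3, 4, 6, 9, 13, 19, 28, 42, 50]
```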
At higher ratios, you can _________ responding to a higher/faster level.
At higher ratios, you can increase responding to a higher/faster level.
fixed-ratio schedule
- the reinforcer is earned after a specific, predictable number of responses in a sequence
- e.g. 10 responses per reinforcer = FR 10
- e.g. dialing a phone number: each digit entered is a response, reaching the person is the reinforcer (see the sketch below)
- e.g. being paid per item manufactured in a factory
- e.g. delivering a quota of 50 flyers (responses) to get paid (reinforcer) = FR 50
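The rule behind a fixed-ratio schedule can be written as a short program. This is only a sketch of the logic, with made-up names and numbers: a counter goes up with each response, and the reinforcer is delivered every time the counter reaches the fixed requirement.

```python
# Hypothetical sketch of fixed-ratio logic: every Nth response is reinforced.

def fixed_ratio(requirement: int):
    """Return a respond() function that reports whether each response
    earns the reinforcer on an FR schedule."""
    count = 0

    def respond() -> bool:
        nonlocal count
        count += 1
        if count == requirement:
            count = 0      # ratio completed, counter resets
            return True    # reinforcer delivered
        return False

    return respond

# FR 10: only every 10th response is reinforced
press = fixed_ratio(10)
outcomes = [press() for _ in range(30)]
print(outcomes.count(True))  # 3 reinforcers for 30 responses
```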
fixed-ratio schedule: cumulative record
- the total number of responses that have occurred up to a particular point in time or within a specific interval
- a complete visual record of when and how frequently the subject responded during a session
fixed-ratio schedule: post-reinforcement pause
- a zero rate of responding that typically occurs just after reinforcement on FR schedules
- controlled by the upcoming ratio requirement (number of responses)
- would be better named a pre-ratio pause: seeing an intimidating task ahead makes you pause
variable ratio (VR)
- the amount of effort (number of responses) required to earn the reinforcer varies unpredictably from trial to trial (see the sketch below)
- e.g. a pigeon must make 10 responses in trial 1, 13 in trial 2, 7 in trial 3
- predictable pauses in responding are less likely on VR than on FR schedules
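The same logic can be sketched for a variable-ratio schedule (again a hypothetical illustration with made-up numbers): the requirement is redrawn at random after every reinforcer, so the schedule is described only by its average (e.g. VR 10) and no single response predicts the payoff.

```python
# Hypothetical sketch of variable-ratio logic: the number of responses
# needed for each reinforcer is drawn at random around a mean (e.g. VR 10).
import random

def variable_ratio(mean_requirement: int, spread: int = 5):
    requirement = random.randint(mean_requirement - spread,
                                 mean_requirement + spread)
    count = 0

    def respond() -> bool:
        nonlocal count, requirement
        count += 1
        if count >= requirement:
            count = 0
            # reinforcer delivered; draw a new, unpredictable requirement
            requirement = random.randint(mean_requirement - spread,
                                         mean_requirement + spread)
            return True
        return False

    return respond

peck = variable_ratio(10)                       # VR 10
reinforcers = sum(peck() for _ in range(1000))  # ~100 reinforcers on average
print(reinforcers)
```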
Real life examples of VR?
- gambling
- fishing