Chapter 6: Reinforcement and Choice Flashcards
Intrinsic reinforcer
reinforcing value is obtained while engaging in the behaviour itself
intrinsically motivating
social contact
exercise
Extrinsic reinforcer
things that are provided as a result of the behaviour to encourage more behaviour in the future
ex. reading in children
the only way to teach kids to read is to get them to read, and this usually involves enticing them with social reinforcement (saying “good job”) or other kinds of external reinforcement
More reward does not always mean more ____________. Why?
Reinforcement
ex. a bonus for making more parts (i.e., over 50)
the only difference was between the group that got no bonus and the groups that did; the groups that received different bonus amounts did not differ from each other, because they were all equally reinforced
aversives can __________ behaviour
reinforce
the aversiveness drives the behaviour!
Continuous Reinforcement
Behaviour is reinforced every time it occurs
Ratio Schedules
Reinforcer is given after the animal makes the required number of responses
Fixed ratio (FR):
fixed ratio between the number of responses made and reinforcers delivered (e.g., FR 10)
• Key elements: postreinforcement pause, ratio run, and ratio strain (when the requirement jumps, e.g. from 1 peck to 100 pecks, subjects tend to stop responding)
see graph slide 9
Cumulative Record
Based on the old cumulative recorder device (paper fed out at a constant speed; the pen steps up with each response)
the slope shows the rate of responding across time!
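As a rough illustration (not from the slides), the logic of a cumulative record can be sketched in a few lines of Python; `cumulative_record` and its arguments are invented names for this sketch:

```python
# A cumulative record is a running count of responses over time:
# time on the x-axis, cumulative responses on the y-axis, so a
# steeper slope means a higher rate of responding.
def cumulative_record(response_times, until):
    """Return (time, cumulative response count) pairs for each second up to `until`."""
    return [(t, sum(1 for r in response_times if r <= t))
            for t in range(until + 1)]

record = cumulative_record([1, 2, 3, 7, 8], until=8)
print(record[-1])  # (8, 5): five total responses by second 8
```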
Variable Ratio (VR):
Different number of responses are required for the delivery of each reinforcer
Value is equal to the average number of responses made to receive a reinforcer (e.g., VR 5)
Responding is based on the average and the minimum of the VR schedule
ex. gambling
see graph slide 19 (steep slope: responding at a high rate, with no postreinforcement pause!)
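The FR/VR contingencies above can be sketched as a small simulation (a toy illustration, not course material; `fr_schedule` and `vr_schedule` are invented names):

```python
import random

def fr_schedule(n):
    """Fixed ratio: a reinforcer is delivered after exactly n responses (e.g., FR 10)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def vr_schedule(mean, minimum=1):
    """Variable ratio: the requirement varies per reinforcer but averages `mean` (e.g., VR 5)."""
    required = random.randint(minimum, 2 * mean - minimum)
    count = 0
    def respond():
        nonlocal count, required
        count += 1
        if count >= required:
            count = 0
            required = random.randint(minimum, 2 * mean - minimum)
            return True
        return False
    return respond

# Usage: 100 pecks on FR 10 earn exactly 10 reinforcers.
peck = fr_schedule(10)
print(sum(peck() for _ in range(100)))  # 10
```

On the VR schedule only the average is predictable, which is why responding stays high and steady, as in gambling.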
Interval Schedules
Responses are only reinforced if the response occurs after a certain time interval.
Fixed interval (FI):
a response is reinforced only if it occurs after a set amount of time has elapsed since the last reinforcer (responses made during the interval don’t matter)
Key elements: fixed-interval scallop, limited hold
i.e. on FI 10 s the animal has to wait 10 seconds; once 10 seconds have elapsed, the first peck at the key will gain it the reward!
ex. cramming before tests
see graph slide 16 (low rates of responding, scalloped responding, postreinforcement pause!)
Variable interval (VI):
responses are reinforced if they occur after a variable interval of time
see graph slide 19
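The FI/VI contingencies can be sketched the same way (again a toy illustration; `fi_schedule` and `vi_schedule` are invented names), with responses stamped by time rather than counted:

```python
import random

def fi_schedule(interval):
    """Fixed interval: only the first response made at least `interval` seconds
    after the last reinforcer is reinforced; earlier responses have no effect."""
    last_reward = 0.0
    def respond(t):
        nonlocal last_reward
        if t - last_reward >= interval:
            last_reward = t
            return True   # reinforcer delivered
        return False
    return respond

def vi_schedule(mean_interval):
    """Variable interval: the required wait varies but averages `mean_interval`."""
    last_reward = 0.0
    wait = random.uniform(0, 2 * mean_interval)
    def respond(t):
        nonlocal last_reward, wait
        if t - last_reward >= wait:
            last_reward = t
            wait = random.uniform(0, 2 * mean_interval)
            return True
        return False
    return respond

# Usage: pecking once per second on FI 10 earns one reward per 10 s --
# pecking faster would not earn any more.
peck = fi_schedule(10)
print(sum(peck(t) for t in range(1, 101)))  # 10
```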
Reynolds 1975
Ratio and Interval Schedules Compared
• Compared rates of key pecking of pigeons on VR and VI schedules
• Opportunities for reinforcement were made identical for each bird
• The VI bird could receive reward when the VR bird was within one response of its reward
With equivalent rate of reinforcement, variable ratio schedules produce a higher rate of responding than variable interval schedules
Variable schedules produce _________ responding compared to Fixed
Variable schedules produce steadier responding compared to Fixed
fixed = post reinforcement pause
Ratio schedules produce ________ of responding than Interval
Ratio schedules produce higher rates of responding than Interval
Source of Differences Between Ratio and Interval Schedules:
Differential reinforcement of inter-response times (IRTs: the time between successive responses)
Ratio schedules reinforce shorter IRTs
Interval schedules reinforce longer IRTs
Source of Differences Between Ratio and Interval Schedules: Feedback function
More feedback (reinforcement) comes with more responding on Ratio schedules; not so for Interval Schedules (different jobs differ on this aspect)
Intermittent Schedules
Fewer reinforcers needed
More resistant to extinction
Variable (ratio/interval) schedules are especially resistant to extinction
Differential reinforcement of high rates (DRH)
Minimum number of responses per interval
Differential reinforcement of low rates (DRL)
Maximum number of responses per interval
used to reduce a behaviour you don’t want to eliminate entirely, but want to occur less often
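The DRH/DRL criteria can be sketched as a per-interval count check (a toy sketch; the function name and parameters are invented):

```python
def reinforced_intervals(response_times, interval, min_count=None, max_count=None):
    """Split a session into fixed-length intervals and mark each interval
    as reinforced if its response count meets the criterion:
      DRH: at least `min_count` responses per interval (high rates reinforced)
      DRL: at most `max_count` responses per interval (low rates reinforced)
    """
    end = max(response_times, default=0)
    n_bins = int(end // interval) + 1
    counts = [0] * n_bins
    for t in response_times:
        counts[int(t // interval)] += 1
    rewarded = []
    for c in counts:
        ok = True
        if min_count is not None and c < min_count:
            ok = False
        if max_count is not None and c > max_count:
            ok = False
        rewarded.append(ok)
    return rewarded

# DRH 5-per-10-s: responding once per second passes every interval.
fast = list(range(30))  # 10 responses in each 10-s interval
print(reinforced_intervals(fast, 10, min_count=5))  # [True, True, True]
```

The same slow response pattern that fails the DRH criterion would pass a DRL one, which is why DRL suits behaviours you only want to make less frequent.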