Chapter 7 - Schedules Of Reinforcement Flashcards
Continuous reinforcement schedule
One in which each specified response is reinforced
Continuous reinforcement schedule example
Each time a dog rolls over on command, it gets a treat
Intermittent reinforcement schedule
One in which only some responses are reinforced
Intermittent reinforcement schedule example
Only occasionally does the rat's lever press result in food
Intermittent schedule types
(1) Fixed ratio, (2) variable ratio, (3) fixed interval, (4) variable interval
Fixed ratio schedule
A fixed number of responses is required for reinforcement
Dense versus lean fixed ratio
A dense fixed ratio schedule reinforces after a small number of responses (e.g., every 5 responses), while a lean fixed ratio schedule reinforces only after a large number of responses (e.g., every 100 responses)
Fixed ratio schedule leads to
(1) A high rate of responding, (2) post-reinforcement pauses, (3) ratio strain
Variable ratio schedule
A varying, unpredictable number of responses is required for reinforcement
Variable ratio schedule leads to
A high, steady rate of responding with little or no post-reinforcement pause
Fixed interval schedule
Reinforcement is contingent upon the first response after a fixed period of time
Fixed interval schedule leads to
An erratic pattern of responding, with a long pause after reinforcement followed by increasingly frequent responding as the time of the next reinforcer approaches
Variable interval schedule
Reinforcement occurs with the first response after a varying amount of time
Variable interval schedule leads to
A moderate steady rate of responding
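The four intermittent schedules differ only in the rule that decides when a response earns the reinforcer. The sketch below is a minimal, illustrative Python simulation of those rules; it is not from the chapter, and the class names, the 1 to 2N ranges used for the variable schedules, and the tick/respond methods are assumptions made for the example.

import random

# Illustrative sketch: each schedule is just a rule deciding whether
# the current response is reinforced.

class FixedRatio:
    """Reinforce every Nth response (e.g., FR 5)."""
    def __init__(self, n):
        self.n = n
        self.responses = 0

    def respond(self):
        self.responses += 1
        if self.responses >= self.n:
            self.responses = 0
            return True          # reinforcer delivered
        return False

class VariableRatio:
    """Reinforce after a varying number of responses averaging N (e.g., VR 5)."""
    def __init__(self, n):
        self.n = n
        self.requirement = random.randint(1, 2 * n - 1)
        self.responses = 0

    def respond(self):
        self.responses += 1
        if self.responses >= self.requirement:
            self.responses = 0
            self.requirement = random.randint(1, 2 * self.n - 1)
            return True
        return False

class FixedInterval:
    """Reinforce the first response after a fixed time (e.g., FI 30 s)."""
    def __init__(self, seconds):
        self.seconds = seconds
        self.elapsed = 0.0

    def tick(self, dt):
        self.elapsed += dt       # advance the clock

    def respond(self):
        if self.elapsed >= self.seconds:
            self.elapsed = 0.0
            return True
        return False

class VariableInterval:
    """Reinforce the first response after a varying time averaging N seconds."""
    def __init__(self, seconds):
        self.seconds = seconds
        self.elapsed = 0.0
        self.wait = random.uniform(0, 2 * seconds)

    def tick(self, dt):
        self.elapsed += dt

    def respond(self):
        if self.elapsed >= self.wait:
            self.elapsed = 0.0
            self.wait = random.uniform(0, 2 * self.seconds)
            return True
        return False

# Example: an FR 5 schedule reinforces only every 5th response.
fr5 = FixedRatio(5)
print([fr5.respond() for _ in range(10)])
# [False, False, False, False, True, False, False, False, False, True]

Running the FR 5 example shows the defining property of a fixed ratio schedule: exactly every fifth response is reinforced, the predictability that is associated with post-reinforcement pauses.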
Theories of reinforcement
(1) Drive reduction theory, (2) Premack principle, (3) response deprivation hypothesis, (4) behavioral bliss point approach
Drive reduction theory
An event is reinforcing to the extent that it is associated with a reduction in some type of physiological drive
Drive reduction theory example
Food deprivation produces a hunger drive, which causes the animal to seek food. When food is obtained, the hunger drive is reduced, which strengthens the behavior that preceded the drive reduction
Premack principle
The idea that reinforcers can often be viewed as behaviors rather than stimuli; a high-probability behavior can be used to reinforce a low-probability behavior
Premack Principle example
Rather than saying that lever pressing was reinforced by food, we could say that lever pressing was reinforced by the act of eating food
Response deprivation hypothesis
A behavior can serve as a reinforcer when (1) access to the behavior is restricted and (2) its frequency falls below its preferred level of occurrence
Response deprivation hypothesis rat example
If a rat that normally runs in a wheel for one hour a day is restricted to only 15 minutes of running a day, running falls below its preferred level; the rat is deprived of running, and the opportunity to run can then serve as a reinforcer (the rat will work for additional running time)
Behavioral bliss point approach
An organism with free access to alternative activities will distribute its behavior in such a way as to maximize overall reinforcement
Behavioral bliss point approach example
A rat given free access to a running wheel and a maze reaches its behavioral bliss point by spending one hour a day running on the wheel and two hours a day exploring the maze
Schedules of reinforcement
The response requirement that must be met to obtain reinforcement