Schedules of Reinforcement Flashcards
What is the difference between continuous and partial reinforcement?
Continuous reinforcement reinforces every response; partial reinforcement reinforces only some responses. Continuous reinforcement produces:
1. more rapid learning
2. consequences that are easier to perceive
3. more rapid extinction
Define the ratio (partial) reinforcement schedule
a certain number of responses must be made before reinforcement is delivered
Define the interval (partial) reinforcement schedule
some amount of time must elapse between reinforcements
Define the fixed (partial) reinforcement schedule
reinforcement occurs after a fixed number of responses or fixed time interval
Define the variable (partial) reinforcement schedule
reinforcement occurs after an average number of responses or passage of time
*occurs randomly, so take an average
define a fixed ratio schedule (FR)
reinforcement given after a fixed number of responses
*every X responses, produces an outcome
define a variable ratio schedule (VR)
reinforcement given after a variable number of responses, centered around an average
*every X responses, produces an outcome, BUT X changes with each reinforcer
define post-reinforcement pause
in an FR schedule, it is the pause in responding that follows each reward.
*the higher the ratio, the longer the pause after each reward
provide examples of a fixed ratio (FR) schedule
- a rat must press a lever five times to obtain one food pellet
- factory workers who get paid a flat fee for every 100 pieces they turn out
- migrant farm workers who get paid a fixed amount for every bushel of apples picked
provide examples of a variable ratio (VR) schedule
- gambling / slot machines
- video games
- sports
define a fixed interval schedule (FI)
first correct response after a fixed time interval is reinforced
*after Y seconds, 1 response produces 1 outcome
*behavior before the interval expires has no consequence
define a variable interval schedule (VI)
reinforcement given for first correct response after a variable time interval, centered around an average
*after Y seconds, 1 response produces 1 outcome, but Y changes after each outcome. behavior before the interval expires has no consequence
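The four schedule definitions above can be sketched as simple response-by-response rules. This is a minimal Python sketch under illustrative assumptions (uniform random draws stand in for the "variable" schedules, and the interval schedules take a caller-supplied clock); the function names are made up for this example:

```python
# Minimal simulation of the four partial reinforcement schedules.
# Each factory returns a respond() function that reports whether a
# single response earns a reinforcer. Illustrative sketch only.
import random

def fixed_ratio(n):
    """FR-n: every n-th response produces an outcome."""
    state = {"count": 0}
    def respond():
        state["count"] += 1
        if state["count"] >= n:
            state["count"] = 0
            return True
        return False
    return respond

def variable_ratio(mean_n):
    """VR-mean_n: reinforced after X responses; X changes each time, averaging mean_n."""
    state = {"count": 0, "target": random.randint(1, 2 * mean_n - 1)}
    def respond():
        state["count"] += 1
        if state["count"] >= state["target"]:
            state["count"] = 0
            state["target"] = random.randint(1, 2 * mean_n - 1)
            return True
        return False
    return respond

def fixed_interval(t):
    """FI-t: the first response after t time units is reinforced."""
    state = {"last": 0.0}
    def respond(now):
        if now - state["last"] >= t:
            state["last"] = now
            return True
        return False  # responding before the interval expires has no consequence
    return respond

def variable_interval(mean_t):
    """VI-mean_t: like FI, but the required wait changes each time, averaging mean_t."""
    state = {"last": 0.0, "wait": random.uniform(0, 2 * mean_t)}
    def respond(now):
        if now - state["last"] >= state["wait"]:
            state["last"] = now
            state["wait"] = random.uniform(0, 2 * mean_t)
            return True
        return False
    return respond

# FR-5 rat: every fifth lever press earns a pellet.
press = fixed_ratio(5)
print([press() for _ in range(10)])  # [False]*4 + [True], twice
```

Note how the ratio schedules count responses while the interval schedules only consult the clock, which is exactly why extra responding pays off under VR but not under VI.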
in an FR schedule, what is the behavior?
steady responding until reinforcement
in a VR schedule, what is the behavior?
constant and high rate of responding
in an FI schedule, what is the behavior?
- little/no responding at the beginning of the interval
- responding increases to a rapid rate as the interval's expiration approaches
in a VI schedule, what is the behavior?
steady but low rate of responding
provide an example of a fixed interval (FI) schedule
- a student in detention from 3 to 4. After arriving, there is little point in checking his watch for the first 15-20 minutes. It becomes worth checking once he estimates 30-40 minutes have passed, just in case time is flying by faster than he thinks. As the time gets closer to an hour, he checks his watch more and more frequently, not wanting to stay a moment longer than necessary.
- watching clock for appointment
provide an example of a variable interval (VI) schedule
checking for email
define the matching law of choice behavior
response rates on concurrent schedules tend to match the relative rate of reinforcement each schedule provides
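The matching law is usually written as Herrnstein's equation, where B_i is the response rate allocated to schedule i and R_i is the reinforcement rate that schedule delivers:

```latex
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
```

For example, if schedule 1 delivers three times as many reinforcers per hour as schedule 2, the law predicts that about three quarters of responses will go to schedule 1.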
Why are VR and VI so different?
VR: more responses = more reinforcers (got to play to win!)
VI: more responses does not equal more reinforcers (only need to check in)
Rank the 4 schedules from hardest to extinguish to the easiest
Variable ratio
Variable interval
Fixed interval
Fixed ratio