Lecture 5 Flashcards
The response requirement that must be met to obtain reinforcement.
What is this?
Schedule of reinforcement
What type of reinforcement is this?
Each individual response is reinforced; e.g., each lever press is rewarded.
Continuous reinforcement schedule
What type of reinforcement schedule lies between continuous reinforcement and extinction?
E.g. A slot machine where you only win some of the time
Intermittent
What type of reinforcement schedule doesn’t offer reinforcements?
What is the consequence of it?
Extinction; it decreases the likelihood of the behavior.
What are the two kinds of intermittent reinforcement schedules?
Give examples with a slot machine.
Interval schedules = When (time)
E.g. a slot machine that pays every 10 mins
Ratio schedules = How often (number)
E.g. A slot machine that pays every 100 pulls
What is a fixed ratio reinforcement schedule? What is the formula?
What does this type of schedule produce in terms of responding?
What do higher ratios produce?
Give an example.
5pts
- A fixed number of responses is required for each reinforcement
- FRn, where n = number of responses required
- Produces rapid rates of responding
- Higher ratios lead to longer post-reinforcement pauses
Ex- A kid gets paid $1 for every 5 newspapers delivered (FR5)
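The FRn rule above can be sketched as a tiny simulation (Python; the function name is my own, not from the lecture):

```python
def fixed_ratio(n, responses):
    # FRn: every n-th response earns one reinforcer,
    # so `responses` presses yield responses // n reinforcers.
    return responses // n

# FR5: $1 per 5 newspapers delivered
print(fixed_ratio(5, 23))  # 23 papers -> 4 dollars
```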
What is a variable ratio schedule?
What does this type of schedule produce in terms of responding?
Is there a post-reinforcement pause?
Give an example.
5pts
- A varying, unpredictable number of responses is required
- Reinforcement depends on an average number of responses
- Rate of response is high and steady
- little to no post-reinforcement pause
- Dating apps and gambling
- On a variable ratio 10 (VR10) schedule, a rat has to make an average of 10 lever presses for each food pellet
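The VR10 idea can be illustrated with a short simulation (Python; a sketch that assumes the response requirement is drawn uniformly around the average, which is one simple way to model "unpredictable"):

```python
import random

def vr_requirements(mean_n, num_reinforcers, seed=0):
    # VR schedule: each reinforcer needs a random number of responses
    # drawn from 1 .. 2*mean_n - 1, whose average is mean_n.
    rng = random.Random(seed)
    return [rng.randint(1, 2 * mean_n - 1) for _ in range(num_reinforcers)]

reqs = vr_requirements(10, 1000)
print(sum(reqs) / len(reqs))  # close to 10 on average
```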
Reinforcement is contingent upon the
first response after a fixed, predictable
period of time.
What type of reinforcement schedule is this?
Give an example.
Fixed interval reinforcement schedule.
FI 60 min: a patient on a morphine drip will only receive morphine when pressing the dispenser after 60 minutes have elapsed.
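The FI contingency can be sketched as code (Python; function and variable names are my own, not from the lecture):

```python
def fixed_interval(press_times, interval):
    # FI schedule: a reinforcer becomes available `interval` units after
    # the previous reinforcement; the first press after that is reinforced.
    reinforced = []
    available_at = interval
    for t in sorted(press_times):
        if t >= available_at:
            reinforced.append(t)
            available_at = t + interval
    return reinforced

# Pressing every minute for 3 hours on an FI 60-min pump:
print(fixed_interval(range(1, 181), 60))  # [60, 120, 180]
```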
Reinforcement is contingent upon
the first response after a varying,
unpredictable (average) period of
time.
Produces a moderate, steady rate of response with little or no post-reinforcement pause.
What type of reinforcement schedule is this?
Give an example.
Variable interval schedules
Ex:
Trial 1: Lever press, 5 sec pause, reward
Trial 2: Lever press, 12 sec pause
Trial 3: Lever press, 7 sec pause, reward etc.
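The varying waits in the trials above can be modeled with a sketch (Python; the wait is drawn uniformly around the average, an assumption made here for illustration):

```python
import random

def variable_interval(press_times, mean_interval, seed=3):
    # VI schedule: after each reinforcement, a new unpredictable wait
    # (averaging mean_interval) must elapse before the next press pays off.
    rng = random.Random(seed)
    reinforced = []
    available_at = rng.uniform(0, 2 * mean_interval)
    for t in sorted(press_times):
        if t >= available_at:
            reinforced.append(t)
            available_at = t + rng.uniform(0, 2 * mean_interval)
    return reinforced

# A rat pressing once per second for 100 s on a VI 8-s schedule
hits = variable_interval(range(1, 101), 8)
print(len(hits))  # varies with the seed; roughly 100 / 8 on average
```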
Reinforcement is contingent on performing a
behavior continuously throughout a period of time.
What type of schedule is this?
Duration schedule
What is the difference between duration and interval schedules?
Duration: Requires continuous responding
Interval schedules: Requires a certain amount of time to pass before the response is rewarded
What is differential reinforcement of high and low rates (otherwise known as response rate schedules)?
What does it depend on?
Give examples.
- Depends on the organism's rate of response
- High rate: a high rate of response is reinforced
  Ex- winning a race
- Low rate: a low rate of response is reinforced
  Ex- being praised for eating your food slowly
- The reinforcer is delivered independently of any response.
- A response is not required for the reinforcer to be obtained.
- 'Free' reinforcer
What type of schedule is this? What are the 2 types?
Give an example.
Non-contingent schedule –> Fixed time and variable time
- FT (fixed time): A reinforcer is delivered after a fixed period of time regardless of the organism's behavior
- Ex: Receiving a gift on your birthday every year
Non-contingent reinforcement may account for
some forms of…
- Behaviors may be accidentally reinforced by the
coincidental presentation of reinforcement
What type of behavior is this?
Superstitious behavior
A fox runs after a rabbit in the hopes of securing
his dinner. However, he doesn’t always succeed.
What type of reinforcement schedule is involved?
A variable ratio (VR) schedule