Lecture 5 Flashcards
A response requirement that must be met in order to obtain reinforcement.
What is this?
Schedule of reinforcement
What type of reinforcement is this?
Each individual response is reinforced; e.g., each lever press is rewarded.
Continuous reinforcement schedule
What type of reinforcement schedule lies between continuous reinforcement and extinction?
E.g. A slot machine where you only win some of the time
Intermittent
What type of reinforcement schedule doesn’t offer reinforcements?
What is the consequence of it?
Extinction; it decreases the likelihood of the behavior
What are the two kinds of intermittent reinforcement schedules?
Give examples with a slot machine.
Interval schedules = When (time)
E.g. a slot machine that pays every 10 mins
Ratio schedules = How often (number)
E.g. A slot machine that pays every 100 pulls
What is a fixed ratio reinforcement? What is the formula?
What does this type of schedule produce in terms of responding?
What do higher ratios produce?
Give an example.
5pts
- A fixed number of responses is required for each reinforcement
- FRn, where n = number of responses required
- Produces rapid rates of responding
- Higher ratios lead to longer post-reinforcement pauses
Ex: A kid gets paid $1 for every 5 newspapers delivered (FR5)
What is a variable ratio schedule?
What does this type of schedule produce in terms of responding?
Is there a post-reinforcement pause?
Give an example.
5pts
- A varying, unpredictable number of responses is required
- Reinforcement depends on an average number of responses
- Rate of response is high and steady
- little to no post-reinforcement pause
- Dating apps and gambling
- On a variable ratio 10 (VR10) schedule, a rat must make an average of 10 lever presses for each food pellet
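The two ratio rules above can be sketched as a short simulation. This is not from the lecture, just a minimal Python illustration: FR delivers a reinforcer on exactly every n-th response, while VR delivers after an unpredictable count that only averages n (the uniform draw here is an assumed distribution for demonstration).

```python
import random

def fixed_ratio(n):
    """FR-n: a reinforcer is delivered after every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def variable_ratio(mean, rng=random):
    """VR-mean: a reinforcer follows an unpredictable number of responses
    averaging `mean` (drawn uniformly from 1..2*mean-1; the distribution
    is an illustrative assumption, not part of the definition)."""
    state = {"count": 0, "target": rng.randint(1, 2 * mean - 1)}
    def respond():
        state["count"] += 1
        if state["count"] >= state["target"]:
            state["count"] = 0
            state["target"] = rng.randint(1, 2 * mean - 1)
            return True
        return False
    return respond

press = fixed_ratio(5)   # the FR5 newspaper example
print([press() for _ in range(10)])
# -> [False, False, False, False, True, False, False, False, False, True]
```

Note the predictability difference: with FR the fifth press always pays, which is what allows a post-reinforcement pause; with VR the very next press might pay, which is why responding stays high and steady.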
Reinforcement is contingent upon the
first response after a fixed, predictable
period of time.
What type of reinforcement schedule is this?
Give an example.
Fixed interval reinforcement schedule.
FI-60 min: a patient on a morphine drip will only receive morphine when pressing the dispenser after 60 minutes have elapsed.
Reinforcement is contingent upon
the first response after a varying,
unpredictable (average) period of
time.
Produces a moderate, steady rate of response with little or no post-reinforcement pause.
What type of reinforcement schedule is this?
Give an example.
Variable interval schedules
Ex:
Trial 1: Lever press, 5 sec pause, reward
Trial 2: Lever press, 12 sec pause, reward
Trial 3: Lever press, 7 sec pause, reward, etc.
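The interval rule can be sketched the same way. A minimal Python illustration (not from the lecture): on a fixed interval schedule, only the first response made after the interval has elapsed since the last reinforcer is reinforced; responses made earlier earn nothing.

```python
def fixed_interval(interval):
    """FI schedule: the first response after `interval` time units have
    elapsed (since the last reinforcer) is reinforced."""
    last = 0.0
    def respond(t):
        nonlocal last
        if t - last >= interval:
            last = t          # timer restarts at the reinforced response
            return True       # first response after the interval: reinforced
        return False          # too early: no reinforcer
    return respond

press = fixed_interval(60)    # the FI-60 min morphine-drip example
print(press(10))   # -> False (only 10 min elapsed)
print(press(65))   # -> True  (first press after 60 min)
print(press(70))   # -> False (timer restarted at t = 65)
```

A variable interval version would simply redraw the required wait around an average after each reinforcer, which is why VI responding is steady rather than paced around a predictable deadline.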
Reinforcement is contingent on performing a
behavior continuously throughout a period of time.
What type of schedule is this?
Duration schedule
What is the difference between duration and interval schedules?
Duration: Requires continuous responding
Interval schedules: Requires a certain amount of time to pass before the response is rewarded
What is differential reinforcement of high and low rates (otherwise known as response rate schedules)?
What does it depend on?
Give examples.
- Depends on the organism's rate of response
- High rate: a high rate of response is reinforced
Ex: winning a race
- Low rate: a low rate of response is reinforced
Ex: being praised for eating your food slowly
- The reinforcer is delivered independently of any response.
- A response is not required for the reinforcer to be obtained.
- 'Free' reinforcer
What type of schedule is this? What are the 2 types?
Give an example.
Non-contingent schedule –> Fixed time and variable time
- FT (fixed time): a reinforcer is delivered after a fixed period of time, regardless of the organism's behavior
- Ex: Receiving a gift on your birthday every year
Non-contingent reinforcement may account for
some forms of…
- Behaviors may be accidentally reinforced by the
coincidental presentation of reinforcement
What type of behavior is this?
Superstitious behavior
A fox runs after a rabbit in the hopes of securing
his dinner. However, he doesn’t always succeed.
What type of reinforcement schedule is involved?
Response rate with variable ratio
You have to sit in the waiting room for approximately
60 minutes before your doctor’s appointment.
a. Intermittent VI60
b. Intermittent VR60
c. Continuous
d. Response rate
a. intermittent VI60
Forever 21 has its annual sale at the same time every year:
(A) Fixed ratio
(B) Fixed interval
(C) Variable ratio
(D) Variable interval
(B) Fixed interval
Overview:
What are the 4 intermittent schedules?
What are the 3 simple schedules?
What are the 3 complex schedules?
Fixed interval, variable interval, fixed ratio, variable ratio
Duration, response rate, non-contingent
Conjunctive, adjusting, chained
A type of complex schedule in which
the requirements of two or more
simple schedules must be met before
a reinforcer is delivered.
What type of schedule is this?
What type of schedule does it require?
Give an example.
3pts
Conjunctive schedule
- Requires both a ratio and an interval schedule
Ex: In order to get paid, you have to work a certain number of hours a week (FD-40 hrs) and deliver a certain number of papers (FR-50).
- A sequence of two or more simple schedules, each of which has its own SD and the last of which results in a terminal reinforcer.
- Must be completed in a particular order
What type of complex schedule is this?
Give an example.
Chained schedule
Ex: Paying for items at the grocery store.
- Line up at the cash –> variable interval
- Wait for the prompt on the debit machine –> fixed interval
An increase in the strength and/or efficiency of responding as one draws near to the goal.
What concept is this?
Goal gradient effect
- Training the final link first and the initial link last, in order to make the chain more effective.
- The sight of each stimulus is both a secondary reinforcer for the previous behavior and a discriminative stimulus for the next behavior.
What type of chaining is this?
Backward chaining
An event is reinforcing to the extent that it is
associated with a reduction in some type of
physiological drive.
What theory is this?
What is a limitation?
Drive reduction theory
Not all behaviors appear to be associated with a reduction in a physiological drive.
What is the Premack Principle?
What does it focus on?
Give an example.
3pts
A high-probability behavior (the reinforcer) can be used to reinforce a low-probability behavior.
- Focus is on how a reinforcer can increase the future likelihood of a behavior
Example: Lever pressing (low probability behavior) reinforced by eating food (high probability behavior)
What is the response deprivation hypothesis?
Explain what is happening in this example:
- If given free access, Noah might play on his iPhone for 3-hrs per night.
- If his access to his phone is restricted to only 15 minutes per day, he will be unable to reach his preferred level
4pts
A behavior can serve as a reinforcer when:
1. access to the behavior is restricted, and
2. its frequency falls below the preferred level of occurrence
- Noah will be in a state of deprivation with regard to playing on his phone
- He will now be willing to work to obtain additional time on his phone
For variable intervals/ratios, what does the word variable imply?
- An approximation; an average
What does an adjusting schedule depend on?
Performance