7 - Schedules and theories Flashcards
What is a schedule of reinforcement?
- A response requirement that must be met to obtain reinforcement
→ indicates what exactly has to be done for the reinforcer to be delivered
- Example:
→ How many lever presses are required for the food pellet to be presented?
→ Different response requirements can have dramatically different effects on behavior
What is a continuous reinforcement schedule (CRF)?
- Refers to reinforcement being administered to each instance of a response
- Each specified response is reinforced
- Example:
→ Each time a rat presses the lever, it obtains a food pellet
→ Each time you press ‘power’, the light turns on
- Useful when a behaviour is first being shaped or strengthened
→ e.g.: when first conditioning a rat, reinforcement should be given for every lever press; when a child is first learning to brush their teeth before bed, you want to praise them each time they do so
What is an intermittent (partial) reinforcement schedule?
- Lies between continuous reinforcement and extinction
- When and how often reinforcement occurs affects learning
→ not reinforcing everything, only at certain times or instances
→ e.g.: the rat gets a food pellet for only some lever presses
→ e.g.: not every date you have goes well
- In theory, behaviour reinforced intermittently persists even in the absence of praise
- Two basic kinds of schedules:
1) When (time) = interval schedules
2) How often (number) = ratio schedules
Name the 4 types of intermittent reinforcement schedules.
→ fixed ratio (FR)
→ variable ratio (VR)
→ fixed interval (FI)
→ variable interval (VI)
What is a fixed ratio reinforcement?
- A fixed predictable number of responses is required for each reinforcement
- These schedules are designated FRn where n= the number of responses required
- Examples:
→ On a fixed ratio 5 schedule (FR 5), a rat has to press the lever 5 times to obtain food.
- Real-life example of an FR schedule:
→ A kid gets paid $1 for every 5 newspapers delivered (FR5)
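The FR rule can be sketched as a simple response counter (a hypothetical illustration, not part of the course material):

```python
def fr_reinforced(n, response_count):
    """Fixed ratio (FR-n): reinforcement is delivered on every n-th response."""
    return response_count > 0 and response_count % n == 0

# FR 5: which of the first 15 lever presses earn a food pellet?
reinforced = [r for r in range(1, 16) if fr_reinforced(5, r)]
print(reinforced)  # → [5, 10, 15]
```

Setting n = 1 reinforces every single response, which is why FR 1 is the same as a CRF schedule (see the next card).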
An FR-__ is the same as a CRF schedule.
1
What is the expected response to fixed ratio schedules?
- These schedules usually produce a rapid rate of responding, each run followed by a short post-reinforcement pause (visible in the fixed ratio line of a cumulative record)
- Example:
→ On an FR 25 schedule, a rat rapidly emits 25 lever presses, eats the food pellet it receives, and then sniffs around (pauses) before emitting more lever presses
- A post-reinforcement pause is a short pause following the attainment of each reinforcer
→ i.e. taking a break when having studied for a while
→ Higher ratio requirements produce longer post-reinforcement pauses
→ e.g.: taking a 20-min break after running for 40 min vs. taking a 5-min break after running for 20 min
In a fixed-ratio schedule, what happens when the schedule of reinforcement changes?
- Moving from a lower ratio requirement to a higher one should be done gradually
→ e.g.: once lever pressing is well established on a CRF schedule, the requirement can be gradually increased to FR 2, then FR 5, then FR 10
- The increase should be gradual to avoid ratio strain or burnout
→ if you raise the requirement too high, there may be a breakdown in the rat’s behaviour (extinction)
→ some students effectively put themselves on high fixed ratio schedules, which can lead to ratio strain during university studies
What is a variable ratio schedule?
- Reinforcement is contingent upon a varying, unpredictable number of responses
- Reinforcement follows an average number of responses
- Example:
→ On a variable ratio 5 (VR5) schedule, a rat has to emit an average of 5 lever presses for each food pellet
→ Trial 1: 5 presses = 1 pellet
→ Trial 2: 15 presses = 1 pellet
- VR schedules generally produce a high and steady rate of response with little or no post-reinforcement pause
→ Each response has the potential to be rewarded
→ the rate of responding does not level off at any point
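The unpredictability of a VR schedule can be sketched by drawing each trial's response requirement at random, so that only the average matches the schedule's value (hypothetical numbers for illustration):

```python
import random

def vr_requirements(mean, trials, rng):
    """Variable ratio (VR-mean): each trial requires an unpredictable number
    of responses, drawn so the requirements average out to `mean`."""
    return [rng.randint(1, 2 * mean - 1) for _ in range(trials)]

rng = random.Random(42)          # seeded for reproducibility
reqs = vr_requirements(5, 1000, rng)  # VR 5
# any single trial may require 1 to 9 presses, but across many trials
# the mean requirement sits close to 5
print(round(sum(reqs) / len(reqs), 2))
```

Because the next reinforcer could arrive after any response, every press "has the potential to be rewarded," which is what sustains the high, steady rate.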
Give a real-life example of a VR schedule.
- Can facilitate abusive relationships
→ in clinical psychology, a therapist may point out that early on, both partners deliver continuous positive reinforcement, which strengthens the relationship; as the relationship progresses, the reinforcement sometimes becomes intermittent, and the process becomes very problematic when one person stays on a continuous schedule of positive feedback while the other shifts to an intermittent one
→ this creates an imbalance: the person on the continuous schedule stays in the relationship because of the intermittent reinforcement they keep seeking
- A dog doing tricks is also on a VR schedule
→ the dog is rewarded after a varying number of tricks, with differing times in between
How do VR schedules link to maladaptive behaviour?
- Each response has the potential to be rewarded
- Gambling
→ The unpredictable nature of gambling results in a very high rate of behavior
What is a fixed-interval schedule?
- Reinforcement is contingent upon the first response after a fixed, predictable period of time
- Example:
→ FI 30-sec schedule: the first lever press after a 30-second interval has elapsed results in a food pellet
- Produces an upwardly curved (scalloped) pattern: a post-reinforcement pause followed by a gradually increasing rate of response as the interval draws to a close
→ by the end of the interval the rat is emitting a high rate of response, so the reinforcer is obtained as soon as it becomes available
→ the rat learns that a certain amount of time must pass before the reinforcer becomes accessible, and thus increases its rate as that time approaches
- This pattern minimizes efforts that won't be rewarded
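The FI rule above can be sketched as a clock that makes the reinforcer available only once the interval has elapsed (a hypothetical helper, not from the course material):

```python
def fi_reinforced_times(interval, response_times):
    """Fixed interval (FI): reinforce the first response emitted after
    `interval` seconds have elapsed since the previous reinforcer."""
    reinforced = []
    available_at = interval          # reinforcer first becomes available here
    for t in sorted(response_times):
        if t >= available_at:
            reinforced.append(t)
            available_at = t + interval  # clock restarts after delivery
    return reinforced

# FI 30-sec: lever presses at 10, 25, 31, 40 and 65 seconds
print(fi_reinforced_times(30, [10, 25, 31, 40, 65]))  # → [31, 65]
```

Note that the presses at 10, 25 and 40 seconds earn nothing, which is exactly the wasted effort the scalloped response pattern minimizes.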
What is a variable interval schedule?
- Reinforcement is contingent upon the first response after a varying, unpredictable (average) period of time
→ with a rat, the first lever press after an average interval of 30 seconds results in a pellet, with individual intervals varying between 1 and 60 seconds
→ the number of seconds that must pass before a food pellet becomes accessible can thus vary greatly, but the average will always be 30 seconds
- Produces a moderate, steady rate of response with little or no post-reinforcement pause
- A steady rate of response ensures you will receive the reinforcer as soon as it becomes available
Which of the 4 schedules produce a moderate, steady rate of response with little or no post-reinforcement pause?
Variable interval schedule
What is a duration schedule?
- Reinforcement is contingent on performing a behaviour continuously throughout a period of time
- Reinforcing the mere performance of an activity with no regard to level of performance can undermine a person’s intrinsic interest in that activity
- e.g.: allowing a kid to watch TV, but only if they studied for 2 hours (FD2)
→ but this does not take into account what was actually accomplished during those 2 hours
How do duration schedules differ from interval schedules?
- Duration schedules require continuous responding for a certain amount of time before the reward is delivered
→ there are also variable duration (VD) schedules, in which the behaviour must be performed continuously for a varying, unpredictable period of time
→ e.g.: to reward a rat for running on a VD schedule, you might require it to run continuously for anywhere between 1 and 120 seconds, with treats delivered after an average of 60 seconds of running
- Interval schedules only require that a certain amount of time pass before a response is rewarded
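The contrast can be sketched with two toy checks: an interval schedule only cares when a response occurs, while a duration schedule inspects the entire stretch of behaviour (both functions are hypothetical illustrations):

```python
def interval_ok(response_time, interval):
    """Interval schedule: the first response after `interval` seconds is
    rewarded, regardless of what happened during the wait."""
    return response_time >= interval

def duration_ok(activity_log, required_seconds):
    """Duration schedule (FD): reward only if the behaviour was performed
    continuously; `activity_log` holds one True/False entry per second."""
    return len(activity_log) >= required_seconds and all(activity_log)

# a rat that paused halfway through a 60-second run:
log = [True] * 30 + [False] + [True] * 29
print(duration_ok(log, 60))  # → False (the pause breaks the requirement)
print(interval_ok(61, 60))   # → True (a single response at 61 s earns the reward)
```

The key design difference: the interval check ignores everything before the response, whereas the duration check fails on any single gap in the log.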