PSYC*2330 Chapter 6: Schedules of Reinforcement and Choice Behaviour Flashcards
What is a schedule of reinforcement?
A program, or rule, that determines how and when the occurrence of a response will be followed by the delivery of a reinforcer
T or F: Schedule effects are highly relevant to motivation of behaviour.
True
Which is more important in determining how motivated a person is: their personality, or the schedule of reinforcement in effect?
The schedule of reinforcement
What is a cumulative record?
A graphical representation of how a response is repeated over time
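A minimal sketch, using hypothetical response timestamps, of how a cumulative record pairs each moment in time with the total number of responses made so far; a steeper curve reflects a higher response rate.

```python
# Build a cumulative record from hypothetical response times (seconds).
response_times = [1.2, 1.9, 2.4, 5.0, 5.3, 5.8, 9.1]

cumulative = []
count = 0
for t in response_times:
    count += 1                  # each response adds one step to the record
    cumulative.append((t, count))

for t, n in cumulative:
    print(f"t = {t:4.1f} s -> {n} responses so far")
```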
What are simple schedules of reinforcement?
Schedules in which a single factor determines which instrumental response is reinforced
What is continuous reinforcement?
A type of reinforcement schedule in which every occurrence of the instrumental response is reinforced
What is intermittent/partial reinforcement?
A type of reinforcement schedule in which the instrumental response is only reinforced occasionally
What are ratio schedules of reinforcement?
Schedules in which reinforcement depends on the number of responses the participant performs
What is a fixed-ratio schedule of reinforcement?
Schedules in which a fixed number of responses leads to reinforcement
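A minimal sketch of the rule an FR schedule follows, assuming a hypothetical FR 5 requirement: every fifth response produces the reinforcer, regardless of how much time has passed.

```python
FR_REQUIREMENT = 5              # hypothetical ratio requirement
responses_since_reinforcer = 0

def record_response():
    """Register one response; return True if it earns a reinforcer."""
    global responses_since_reinforcer
    responses_since_reinforcer += 1
    if responses_since_reinforcer >= FR_REQUIREMENT:
        responses_since_reinforcer = 0   # ratio completed, start the next run
        return True
    return False

# Ten responses in a row: the 5th and 10th are reinforced.
print([record_response() for _ in range(10)])
```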
Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady and moderate rate of responding with brief, predictable pauses?
Fixed-ratio
What is the post-reinforcement pause?
A pause in responding just after reinforcement in an FR schedule
What is the post-reinforcement pause also known as?
The pre-ratio pause
What is a ratio run?
The high and steady rate of responding that completes each ratio requirement
What is ratio strain?
A disruption in responding after a ratio requirement is increased too quickly
What is a variable-ratio schedule of reinforcement?
Schedules in which the number of responses to obtain a reinforcer varies from one reinforcement to the next
T or F: The VR schedule is labelled based on the number of responses required for the first reinforcement.
False. It is labelled based on the average number of responses required per reinforcer.
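A minimal sketch of a VR rule, assuming a hypothetical VR 5 schedule and one simple way of varying the requirement around its average; the specific generator is an assumption, not a fixed convention.

```python
import random

def new_requirement(mean=5):
    # Draw a requirement that averages `mean` responses per reinforcer
    # (one simple generator among many possible ones).
    return random.randint(1, 2 * mean - 1)

requirement = new_requirement()
count = 0

def record_response():
    """Register one response; return True if it completes the current ratio."""
    global requirement, count
    count += 1
    if count >= requirement:
        count = 0
        requirement = new_requirement()  # next requirement is unpredictable
        return True
    return False

print([record_response() for _ in range(20)])
```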
Which simple schedule of intermittent reinforcement has a cumulative record that shows a steep and steady rate of responding with no pauses?
Variable-ratio
What are interval schedules of reinforcement?
Schedules in which a response is reinforced only if it occurs after a certain amount of time has passed
What is a fixed-interval schedule of reinforcement?
Schedules in which the amount of time that has to pass to obtain a reinforcer is constant
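A minimal sketch of an FI rule, assuming a hypothetical FI 30 s value: the first response made after the interval has elapsed since the last reinforcer is reinforced; earlier responses have no programmed consequence.

```python
FI_INTERVAL = 30.0           # seconds (hypothetical value)
last_reinforcer_time = 0.0

def record_response(now):
    """Register a response at time `now`; return True if it is reinforced."""
    global last_reinforcer_time
    if now - last_reinforcer_time >= FI_INTERVAL:
        last_reinforcer_time = now
        return True
    return False

# 10 s and 25 s come too early; 31 s is reinforced; 40 s is only 9 s into the next interval.
print([record_response(t) for t in (10.0, 25.0, 31.0, 40.0)])
```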
What term refers to the gradual increase in the rate of responding that occurs between successive reinforcements on an FI schedule?
The fixed-interval scallop
Which simple schedule of intermittent reinforcement has a cumulative record that shows a slower response rate immediately after reinforcement, which gradually increases as the time for the next reinforcer approaches?
Fixed-interval
T or F: Performance on FI schedules of reinforcement reflects temporal awareness in animals.
True
What is a variable-interval schedule of reinforcement?
Schedules in which the amount of time that has to pass to obtain a reinforcer varies from one reinforcer to the next
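A minimal sketch of a VI rule, assuming a hypothetical VI 30 s value and exponentially distributed intervals (an assumption about how the varying intervals are generated).

```python
import random

def new_interval(mean=30.0):
    # Exponential waits are one common way to vary intervals around a mean.
    return random.expovariate(1.0 / mean)

interval = new_interval()
last_reinforcer_time = 0.0

def record_response(now):
    """Register a response at time `now`; return True if it is reinforced."""
    global interval, last_reinforcer_time
    if now - last_reinforcer_time >= interval:
        last_reinforcer_time = now
        interval = new_interval()   # next wait differs from the last one
        return True
    return False

print([record_response(t) for t in (5.0, 20.0, 45.0, 60.0, 90.0)])
```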
Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady rate of responding with no pauses, but is less steep than that of a VR schedule?
Variable-interval
What type of schedule of reinforcement involves increasing response requirements for reinforcer delivery over successive sessions?
Progressive Ratio
What term refers to the last completed ratio in an escalating series on a progressive-ratio schedule?
The breaking point
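A minimal sketch of a progressive-ratio rule, assuming a hypothetical fixed step of 5 added after each reinforcer; real PR schedules may escalate differently. Responding would be expected to stop at the breaking point as the requirement grows.

```python
STEP = 5            # hypothetical escalation step
requirement = STEP
count = 0

def record_response():
    """Register one response; return True if it completes the current ratio."""
    global requirement, count
    count += 1
    if count >= requirement:
        count = 0
        requirement += STEP   # escalate the requirement for the next reinforcer
        return True
    return False

# The 5th, 15th (5 + 10), and 30th (5 + 10 + 15) responses are reinforced.
results = [record_response() for _ in range(30)]
print([i + 1 for i, r in enumerate(results) if r])
```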
What does it mean when there’s a limited hold on a variable-interval schedule of reinforcement?
In order for a response to be reinforced, it must occur before the end of the limited hold period
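A minimal sketch of a limited hold added to an interval schedule, with hypothetical values: the set-up reinforcer stays available only for a short window, and a reinforcer that is not collected in time is lost.

```python
INTERVAL = 30.0       # seconds until the reinforcer is "set up" (hypothetical)
LIMITED_HOLD = 5.0    # seconds the set-up reinforcer remains available (hypothetical)

setup_time = INTERVAL  # time at which the current reinforcer becomes available

def record_response(now):
    """Return True only if the response falls inside the limited-hold window."""
    global setup_time
    while now > setup_time + LIMITED_HOLD:
        setup_time += INTERVAL        # a missed reinforcer is lost; time the next one
    if now >= setup_time:
        setup_time = now + INTERVAL   # collected; start timing the next interval
        return True
    return False

# 32 s falls inside the 30-35 s window (reinforced); 70 s misses the 62-67 s window.
print(record_response(32.0), record_response(70.0))
```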
T or F: Response rate is not simply a function of how many reinforcers can be earned.
True
When a VI pigeon was yoked to a VR pigeon, so that both could earn reinforcers at the same rate, which one showed the higher rate of responding?
The VR pigeon
Do ratio schedules reinforce long or short inter-response times? Why?
Short, because the faster the ratio requirement is completed, the sooner the reinforcer is delivered
Do interval schedules reinforce long or short inter-response times? Why?
Long, because the longer the time between responses, the more likely it is that the interval has elapsed and the response will be reinforced
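A minimal sketch of why interval schedules favour long inter-response times, assuming a hypothetical VI 30 s schedule with exponentially distributed intervals timed from the last reinforced response: the longer the pause before the next response, the higher the estimated probability that the programmed interval has already elapsed.

```python
import random

MEAN_INTERVAL = 30.0   # hypothetical VI 30 s
TRIALS = 100_000

def prob_reinforced(irt):
    """Estimate P(interval has elapsed) for a response made `irt` seconds
    after the previous reinforced response."""
    hits = sum(random.expovariate(1.0 / MEAN_INTERVAL) <= irt for _ in range(TRIALS))
    return hits / TRIALS

for irt in (2, 10, 30, 60):
    print(f"IRT = {irt:2d} s -> P(reinforced) ~ {prob_reinforced(irt):.2f}")
```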
Does explaining why VR schedules result in higher response rates than VI schedules in terms of reinforcement of inter-response times take a molecular or molar approach?
Molecular
What is the relationship between response rates and reinforcement rates calculated over an entire experimental session or an extended period of time?
A feedback function
T or F: Reinforcement is considered to be the feedback of responding.
True
Is the response rate directly related to the reinforcement rate in ratio or interval schedules?
Ratio
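A minimal sketch of the contrasting molar feedback functions, with hypothetical parameter values: on a ratio schedule the reinforcement rate rises in direct proportion to the response rate, whereas on an interval schedule it is capped by the programmed interval no matter how fast the animal responds.

```python
RATIO = 10            # hypothetical ratio requirement (responses per reinforcer)
INTERVAL_MIN = 0.5    # hypothetical interval in minutes (at most 2 reinforcers/min)

def ratio_feedback(responses_per_min):
    # Reinforcement rate is directly proportional to response rate.
    return responses_per_min / RATIO

def interval_feedback(responses_per_min):
    # Approximation: reinforcement rate cannot exceed 1 per programmed interval.
    return min(responses_per_min, 1.0 / INTERVAL_MIN)

for rate in (5, 20, 60, 120):
    print(f"{rate:3d} resp/min -> ratio: {ratio_feedback(rate):5.1f}/min, "
          f"interval: {interval_feedback(rate):3.1f}/min")
```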