PSYC*2330 Chapter 6: Schedules of Reinforcement and Choice Behaviour Flashcards
What is a schedule of reinforcement?
A program, or rule, that determines how and when the occurrence of a response will be followed by the delivery of a reinforcer
T or F: Schedule effects are highly relevant to motivation of behaviour.
True
Which is more important in determining how motivated a person is: their personality or the schedule of reinforcement in effect?
The schedule of reinforcement
What is a cumulative record?
A graphical representation of how a response is repeated over time
What are simple schedules of reinforcement?
Schedules in which a single factor determines which instrumental response is reinforced
What is continuous reinforcement?
A type of reinforcement schedule in which every occurrence of the instrumental response is reinforced
What is intermittent/partial reinforcement?
A type of reinforcement schedule in which the instrumental response is only reinforced occasionally
What are ratio schedules of reinforcement?
Schedules in which reinforcement depends on the number of responses the participant performs
What is a fixed-ratio schedule of reinforcement?
Schedules in which a fixed number of responses leads to reinforcement
Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady and moderate rate of responding with brief, predictable pauses?
Fixed-ratio
What is the post-reinforcement pause?
A pause in responding just after reinforcement in an FR schedule
What is the post-reinforcement pause also known as?
The pre-ratio pause
What is a ratio run?
The high and steady rate of responding that completes each ratio requirement
What is ratio strain?
A disruption in responding after a ratio requirement is increased too quickly
What is a variable-ratio schedule of reinforcement?
Schedules in which the number of responses to obtain a reinforcer varies from one reinforcement to the next
T or F: The VR schedule is labelled based on the number of responses required for the first reinforcement.
False. Labelled based on the average number of responses per reinforcer.
Which simple schedule of intermittent reinforcement has a cumulative record that shows a steep and steady rate of responding with no pauses?
Variable-ratio
What are interval schedules of reinforcement?
Schedules in which a response is reinforced only if it occurs after a certain amount of time has passed
What is a fixed-interval schedule of reinforcement?
Schedules in which the amount of time that has to pass to obtain a reinforcer is constant
What term refers to the gradual increase in the rate of responding that occurs between successive reinforcements on an FI schedule?
The fixed-interval scallop
Which simple schedule of intermittent reinforcement has a cumulative record that shows a slower response rate immediately after reinforcement that gradually increases as the interval elapses?
Fixed-interval
T or F: Performance on FI schedules of reinforcement reflects temporal awareness in animals.
True
What is a variable-interval schedule of reinforcement?
Schedules in which the amount of time that has to pass to obtain a reinforcer varies from one reinforcer to the next
Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady rate of responding with no pauses, but is less steep than that of a VR schedule?
Variable-interval
What type of schedule of reinforcement involves increasing response requirements for reinforcer delivery over successive sessions?
Progressive Ratio
What term refers to the last completed ratio in an escalating/progressive-ratio series?
The breaking point
What does it mean when there’s a limited hold on a variable-interval schedule of reinforcement?
In order for a response to be reinforced, it must occur before the end of the limited hold period
T or F: Response rate is not simply a function of how many reinforcers can be earned.
True
When a pigeon on a VI schedule was yoked to a pigeon on a VR schedule (so both earned reinforcers at the same rate), which showed the higher rate of responding?
The VR pigeon
Do ratio schedules reinforce long or short inter-response times? Why?
Short, because the faster the ratio is completed, the sooner reinforcement is delivered
Do interval schedules reinforce long or short inter-response times? Why?
Long, because the more time that passes between responses, the more likely the interval has elapsed and the response will be reinforced
Does explaining why VR schedules result in higher response rates than VI schedules in terms of reinforcement of inter-response times take a molecular or molar approach?
Molecular
What term describes the relationship between response rates and reinforcement calculated over an entire experimental session (an extended period of time)?
A feedback function
T or F: Reinforcement is considered to be the feedback of responding.
True
Is the response rate directly related to the reinforcement rate in ratio or interval schedules?
Ratio
Which type of schedule has an increasing linear feedback function with no theoretical limit, ratio or interval?
Ratio
In which type of schedule is there an upper limit to the number of reinforcers that can be earned regardless of increased rates of responding, ratio or interval?
Interval
Does explaining why VR schedules result in higher response rates than VI schedules in terms of feedback functions take a molecular or molar approach?
Molar
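The contrast between ratio and interval feedback functions can be sketched numerically. The schedule values below (FR 10, VI 60 s) and the simplified function forms are invented for illustration, not taken from the chapter:

```python
# Illustrative feedback functions (simplified): how reinforcement rate
# depends on response rate under ratio vs. interval schedules.

def ratio_feedback(responses_per_min, ratio=10):
    """Ratio schedules: reinforcement rate rises linearly with response rate, no ceiling."""
    return responses_per_min / ratio

def interval_feedback(responses_per_min, interval_s=60):
    """Interval schedules: at most one reinforcer per interval, however fast the responding."""
    max_per_min = 60.0 / interval_s
    return min(responses_per_min, max_per_min)

# Doubling the response rate doubles reinforcement on the ratio schedule...
print(ratio_feedback(30), ratio_feedback(60))        # 3.0 6.0
# ...but leaves the interval schedule at its ceiling of 1 reinforcer/min.
print(interval_feedback(30), interval_feedback(60))  # 1.0 1.0
```

This is the molar point in numbers: only on the ratio schedule does responding faster keep paying off.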
What is the simplest method of studying choice behaviour?
Concurrent schedules
Which type of reinforcement schedules have the purpose of observing how behaviour is distributed across available options?
Concurrent schedules
Which type of reinforcement schedules allow the participant to choose between two or more simple reinforcement schedules that are available simultaneously?
Concurrent schedules
What does the relative rate of responding in a concurrent schedule describe?
How often there is a response towards each of the available alternative simple schedules
How is relative rate of responding calculated for each alternative?
Divide the rate of responding on that alternative by the total rate of responding across all alternatives
In a concurrent schedule with two alternatives, what does a relative rate of responding of 0.5 indicate?
Equal distribution of responding on either alternative
In a concurrent schedule with two alternatives, what does a relative rate of responding of 1 indicate?
All responding was allocated to one alternative
T or F: The reinforcement schedule in effect for each alternative in a concurrent schedule has no effect on the relative rates of responding and the relative rates of reinforcement.
False. The reinforcement schedule has enormous influence over both.
What is the formula [BL/(BL+BR)] used to calculate?
The relative rate of responding
In a concurrent schedule with two alternatives, what does a relative rate of reinforcement of 0.5 indicate?
Equal distribution of reinforcement on either alternative
In a concurrent schedule with two alternatives, what does a relative rate of reinforcement of 1 indicate?
All reinforcements were earned from one alternative
What is the formula [rL/(rL+rR)] used to calculate?
The relative rate of reinforcement
What does the matching law state?
The relative rate of responding on a particular response alternative equals the relative rate of reinforcement for that alternative
The formulas [BL/(BL+BR)]=[rL/(rL+rR)] and [BL/BR]=[rL/rR] are used to indicate what?
The matching law
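The relative-rate formulas and the matching law above can be sketched with hypothetical session totals (the response and reinforcer counts below are invented for illustration):

```python
def relative_rate(left, right):
    """Relative rate for the left alternative: BL/(BL+BR) or rL/(rL+rR)."""
    return left / (left + right)

# Hypothetical session: 750 responses on the left key, 250 on the right,
# earning 30 and 10 reinforcers respectively.
BL, BR = 750, 250
rL, rR = 30, 10

rel_responding = relative_rate(BL, BR)     # 0.75
rel_reinforcement = relative_rate(rL, rR)  # 0.75

# The matching law: relative rate of responding equals relative rate of reinforcement.
print(rel_responding == rel_reinforcement)  # True
```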
The generalized matching law formula adds what two parameters to the matching law?
- Response bias
- Sensitivity
What is it called when there is a deviation from perfect matching in which there is less sensitivity to the relative rate of reinforcement than predicted by the matching law?
Undermatching
How is the matching law accommodated to account for undermatching?
By making the exponent s less than 1
What are three variables that can influence the sensitivity parameter in the generalized matching law equation?
- Species
- Effort/difficulty of switching between alternatives
- How the schedule alternatives are constructed
In a concurrent schedule of reinforcement, what is the result of the reinforcer for one response being more attractive than the reinforcer for the other?
A response bias
In a concurrent schedule of reinforcement, what is the result of response effort being different between alternatives?
A response bias
How is the matching law accommodated to account for higher responding on one alternative due to a response bias?
By inputting a higher value for b
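A minimal sketch of the generalized matching law, BL/BR = b(rL/rR)^s, with b the response-bias parameter and s the sensitivity parameter. The reinforcement rates and parameter values below are invented for illustration:

```python
def generalized_matching(rL, rR, b=1.0, s=1.0):
    """Predicted response ratio BL/BR = b * (rL/rR) ** s."""
    return b * (rL / rR) ** s

# Perfect matching: b = 1, s = 1
print(generalized_matching(30, 10))         # 3.0
# Undermatching: s < 1 pulls responding toward indifference
print(generalized_matching(30, 10, s=0.5))  # ~1.73
# Response bias: b > 1 inflates responding on the left alternative
print(generalized_matching(30, 10, b=2.0))  # 6.0
```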
Is the matching law a mechanistic or descriptive law of nature?
A descriptive law of nature
What reinforcement schedule allows for the study of choice with commitment?
Concurrent-chain schedules
What are the two stages in a concurrent-chain and chain schedule of reinforcement?
- The choice/initial link
- The terminal link
What occurs during the choice link in a concurrent-chain schedule of reinforcement?
Participant chooses between two schedule alternatives by making one of two responses
What occurs during the terminal link in a concurrent-chain schedule of reinforcement?
Participant has the opportunity for reinforcement based on the schedule chosen during the choice link
T or F: In a concurrent-chain schedule, once a choice is made in the choice link, no alternative is available except to complete the terminal link.
True
In a concurrent-chain schedule, when does the response chain “reset”?
When the response required by the terminal link is completed
In a concurrent-chain schedule, is the reinforcer for the choice link a primary or secondary reinforcer?
Secondary
In a concurrent-chain schedule, is the reinforcer for the terminal link a primary or secondary reinforcer?
Primary
T or F: The reinforcer for the choice link is conditioned by the cue (ex. key-light) associated with the terminal link schedule.
True
T or F: Response patterns in a concurrent-chain schedule depend only on the terminal link schedule, not the initial link schedule.
False. Response patterns depend on both the terminal link schedule and the initial link schedule.
When the overall rate of reinforcement is the same, do participants prefer variable schedules or fixed schedules?
Variable schedules
What is the difference between concurrent-chain schedules and chain schedules?
- Concurrent-chain schedules: The response made in the initial link determines the type of responding necessary for reinforcement in the terminal link
- Chain schedules: The response made in the initial link provides access to the terminal link, but does not impact the type of responding needed in the terminal link
In a chain schedule of reinforcement, what must occur to access the terminal link?
The participant must complete the response requirement on the initial link
In a chain schedule of reinforcement, what is the result of completing the response requirement on the terminal link?
Access to a primary reinforcer
T or F: In chain schedules, some rats will continue responding on the initial link to obtain the cocaine available in the terminal link, even when responding during the initial link is punished with shock.
True
T or F: Rats with extensive cocaine exposure have lower breaking points when completing a progressive ratio schedule for cocaine delivery.
False. They have higher breaking points.
Does the subjectivity of choice increase or decrease when the options vary on more than one dimension?
Increases
Does the value of a reward increase or decrease as a function of the delay needed to wait to obtain it?
Decreases
T or F: An individual’s delay discounting rate is closely linked to personality traits.
True
According to the hyperbolic decay function, is the value of a reinforcer directly or inversely related to the reward magnitude?
Directly related
According to the hyperbolic decay function, is the value of a reinforcer directly or inversely related to the reward delay?
Inversely related
The equation [V]=[M/(1+kD)] represents what function?
The hyperbolic decay function
In the hyperbolic decay function, what does k represent?
The discounting rate parameter
According to the hyperbolic decay function, the value of the reinforcer is directly related to the magnitude (V=M) when the delay (D) is equal to what?
V=M when D=0
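The hyperbolic decay function V = M/(1 + kD) can be sketched directly; the magnitudes, delays, and k values below are invented for illustration:

```python
def reward_value(M, D, k=0.1):
    """Hyperbolic decay: V = M / (1 + k*D)."""
    return M / (1 + k * D)

# With no delay, value equals magnitude (V = M when D = 0)
print(reward_value(100, 0))        # 100.0
# Value shrinks as the delay grows
print(reward_value(100, 10))       # 50.0
# A steeper discounting rate k (impulsive choice) devalues delayed rewards faster
print(reward_value(100, 10, k=1))  # ~9.09
```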
T or F: The longer delayed a reinforcer, the larger its value.
False. The longer delayed a reinforcer, the smaller its value.
In the study by Kim et al., did American or Korean students show steeper delay discounting?
American students discounted more steeply than Korean students
T or F: Steep delay discounting is considered a measure of impulsive choice.
True
Do those with substance use disorders discount rewards more or less rapidly than those without?
More rapidly