PSYC*2330 Chapter 6: Schedules of Reinforcement and Choice Behaviour Flashcards

1
Q

What is a schedule of reinforcement?

A

A program, or rule, that determines how and when the occurrence of a response will be followed by the delivery of a reinforcer

2
Q

T or F: Schedule effects are highly relevant to motivation of behaviour.

A

True

3
Q

What is more important in terms of how motivated a person is, their personality, or the schedule of reinforcement in effect?

A

The schedule of reinforcement

4
Q

What is a cumulative record?

A

A graphical representation of how a response is repeated over time

5
Q

What are simple schedules of reinforcement?

A

Schedules in which a single factor determines which instrumental response is reinforced

6
Q

What is continuous reinforcement?

A

A type of reinforcement schedule in which every occurrence of the instrumental response is reinforced

7
Q

What is intermittent/partial reinforcement?

A

A type of reinforcement schedule in which the instrumental response is only reinforced occasionally

8
Q

What are ratio schedules of reinforcement?

A

Schedules in which reinforcement depends on the number of responses the participant performs

9
Q

What is a fixed-ratio schedule of reinforcement?

A

Schedules in which a fixed number of responses leads to reinforcement

10
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady and moderate rate of responding with brief, predictable pauses?

A

Fixed-ratio

11
Q

What is the post-reinforcement pause?

A

A pause in responding just after reinforcement in an FR schedule

12
Q

What is the post-reinforcement pause also known as?

A

The pre-ratio pause

13
Q

What is a ratio run?

A

The high and steady rate of responding that completes each ratio requirement

14
Q

What is ratio strain?

A

A disruption in responding after a ratio requirement is increased too quickly

15
Q

What is a variable-ratio schedule of reinforcement?

A

Schedules in which the number of responses to obtain a reinforcer varies from one reinforcement to the next

16
Q

T or F: The VR schedule is labelled based on the number of responses required for the first reinforcement.

A

False. Labelled based on the average number of responses per reinforcer.

17
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a steep and steady rate of responding with no pauses?

A

Variable-ratio

18
Q

What are interval schedules of reinforcement?

A

Schedules in which a response is reinforced only if it occurs after a certain amount of time has passed

19
Q

What is a fixed-interval schedule of reinforcement?

A

Schedules in which the amount of time that has to pass to obtain a reinforcer is constant

20
Q

What term refers to the gradual increase in the rate of responding that occurs between successive reinforcements on an FI schedule?

A

The fixed interval scallop

21
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a slower response rate immediately after reinforcement, but gradually increases between trials?

A

Fixed-interval

22
Q

T or F: Performance on FI schedules of reinforcement reflects temporal awareness in animals.

A

True

23
Q

What is a variable-interval schedule of reinforcement?

A

Schedules in which the amount of time that has to pass to obtain a reinforcer varies from one reinforcer to the next

24
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady rate of responding with no pauses, but is less steep than that of a VR ?

A

Variable-interval

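The four simple schedules defined in the cards above differ only in the rule used to decide whether the current response earns a reinforcer. Below is a minimal Python sketch of those rules (not from the chapter; the FR 10, VR 10, and 60-second interval values are hypothetical, and the VR rule is approximated probabilistically):

```python
import random

def fixed_ratio(responses_since_reinforcer, n=10):
    # FR n: the nth response since the last reinforcer is reinforced
    return responses_since_reinforcer >= n

def variable_ratio(n=10):
    # VR n: on average every nth response is reinforced (approximated here
    # by giving each response a 1-in-n chance of reinforcement)
    return random.random() < 1 / n

def fixed_interval(seconds_since_reinforcer, t=60):
    # FI t: the first response made after t seconds have elapsed is reinforced
    return seconds_since_reinforcer >= t

def variable_interval(seconds_since_reinforcer, required_wait):
    # VI t: same rule, but required_wait varies around an average of t seconds
    return seconds_since_reinforcer >= required_wait

print(fixed_ratio(10))      # True: the 10th response completes FR 10
print(fixed_interval(45))   # False: only 45 of the required 60 seconds have passed
```
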
25
Q

What type of schedule of reinforcement involves increasing response requirements for reinforcer delivery over successive sessions?

A

Progressive Ratio

26
Q

What term refers to the last completed ratio in an escalating series/progressive ratio schedule?

A

The breaking point

27
Q

What does it mean when there's a limited hold on a variable-interval schedule of reinforcement?

A

In order for a response to be reinforced, it must occur before the end of the limited hold period

28
Q

T or F: Response rate is not simply a function of how many reinforcers can be earned.

A

True

29
Q

When a VI pigeon was yoked to a VR pigeon, which bird showed the higher rate of responding?

A

The VR pigeon

30
Q

Do ratio schedules reinforce long or short inter-response times? Why?

A

Short, because the faster the ratio requirement is completed, the sooner the reinforcer is delivered

31
Q

Do interval schedules reinforce long or short inter-response times? Why?

A

Long, because the more time that passes between responses, the more likely it is that the interval has elapsed and the next response will be reinforced

32
Q

Does explaining why VR schedules result in higher response rates than VI schedules in terms of reinforcement of inter-response times take a molecular or molar approach?

A

Molecular

33
Q

What is the relationship between response rates and reinforcement calculated over an entire experimental session/an extended period of time?

A

A feedback function

34
Q

T or F: Reinforcement is considered to be the feedback of responding.

A

True

35
Q

Is the response rate directly related to the reinforcement rate in ratio or interval schedules?

A

Ratio

36
Q

Which type of schedule has an increasing linear feedback function with no theoretical limit, ratio or interval?

A

Ratio

37
Q

In which type of schedule is there an upper limit to the number of reinforcers that can be earned regardless of increased rates of responding, ratio or interval?

A

Interval

38
Q

Does explaining why VR schedules result in higher response rates than VI schedules in terms of feedback functions take a molecular or molar approach?

A

Molar

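The molar point in the cards above can be made concrete with a small sketch of the two kinds of feedback function. The values below (an FR 10 and an interval schedule that allows at most 1 reinforcer per minute) are hypothetical:

```python
def ratio_feedback(responses_per_min, ratio_requirement=10):
    # Ratio schedules: reinforcement rate grows linearly with response rate,
    # with no theoretical limit
    return responses_per_min / ratio_requirement

def interval_feedback(responses_per_min, max_reinforcers_per_min=1.0):
    # Interval schedules: reinforcement rate is capped regardless of how
    # quickly the participant responds
    return min(responses_per_min, max_reinforcers_per_min)

for rate in (10, 50, 100):
    print(rate, ratio_feedback(rate), interval_feedback(rate))
# The ratio output keeps climbing (1.0, 5.0, 10.0 reinforcers/min), while the
# interval output stays at 1.0, which is the molar explanation for why VR
# schedules support higher response rates than VI schedules.
```
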
39
Q

What is the simplest method of studying choice behaviour?

A

Concurrent schedules

40
Q

Which type of reinforcement schedules have the purpose of observing how behaviour is distributed across available options?

A

Concurrent schedules

41
Q

Which type of reinforcement schedules allow the participant to choose between two or more simple reinforcement schedules that are available simultaneously?

A

Concurrent schedules

42
Q

What does the relative rate of responding in a concurrent schedule describe?

A

How often the participant responds on each of the available simple-schedule alternatives

43
Q

How is the relative rate of responding calculated for each alternative?

A

Divide the rate of responding on that alternative by the total rate of responding across all alternatives

44
Q

In a concurrent schedule with two alternatives, what does a relative rate of responding of 0.5 indicate?

A

Responding was distributed equally across the two alternatives

45
Q

In a concurrent schedule with two alternatives, what does a relative rate of responding of 1 indicate?

A

All responding was allocated to one alternative

46
Q

T or F: The reinforcement schedule in effect for each alternative in a concurrent schedule has no effect on the relative rates of responding and the relative rates of reinforcement.

A

False. The reinforcement schedules have enormous influence over both.

47
Q

What is the formula B_L / (B_L + B_R) used to calculate?

A

The relative rate of responding

48
Q

In a concurrent schedule with two alternatives, what does a relative rate of reinforcement of 0.5 indicate?

A

Reinforcers were earned equally from the two alternatives

49
Q

In a concurrent schedule with two alternatives, what does a relative rate of reinforcement of 1 indicate?

A

All reinforcers were earned from one alternative

50
Q

What is the formula r_L / (r_L + r_R) used to calculate?

A

The relative rate of reinforcement

51
Q

What does the matching law state?

A

The relative rate of responding on a particular response alternative equals the relative rate of reinforcement for that alternative

52
Q

The formulas B_L / (B_L + B_R) = r_L / (r_L + r_R) and B_L / B_R = r_L / r_R are used to indicate what?

A

The matching law

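A minimal worked example of the matching law, using hypothetical response and reinforcer counts:

```python
B_L, B_R = 75, 25   # responses on the left and right alternatives (hypothetical)
r_L, r_R = 30, 10   # reinforcers earned on the left and right alternatives (hypothetical)

relative_responding = B_L / (B_L + B_R)      # 0.75
relative_reinforcement = r_L / (r_L + r_R)   # 0.75

print(relative_responding, relative_reinforcement)
# The matching law predicts these two proportions will be (approximately) equal,
# which is the case here: 0.75 and 0.75.
```
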
53
Q

The generalized matching law formula adds what two parameters to the matching law?

A

- Response bias
- Sensitivity

54
Q

What is it called when there is a deviation from perfect matching in which there is less sensitivity to the relative rate of reinforcement than predicted by the matching law?

A

Undermatching

55
Q

How is the matching law accommodated to account for undermatching?

A

By making the exponent s less than 1

56
Q

What are three variables that can influence the sensitivity parameter in the generalized matching law equation?

A

- Species
- Effort/difficulty of switching between alternatives
- How the schedule alternatives are constructed

57
Q

In a concurrent schedule of reinforcement, what is the result of the reinforcer for one response being more attractive than the reinforcer for the other?

A

A response bias

58
Q

In a concurrent schedule of reinforcement, what is the result of response effort being different between alternatives?

A

A response bias

59
Q

How is the matching law accommodated to account for higher responding on one alternative due to a response bias?

A

By inputting a higher value for b

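Putting the last few cards together: the generalized matching law can be written as B_L / B_R = b * (r_L / r_R)^s, where b is the response-bias parameter and s is the sensitivity parameter. A minimal sketch with hypothetical parameter values:

```python
def response_ratio(r_L, r_R, b=1.0, s=1.0):
    # Predicted ratio of left to right responding under the generalized matching law
    return b * (r_L / r_R) ** s

print(response_ratio(30, 10))          # 3.0   -> perfect matching (b = 1, s = 1)
print(response_ratio(30, 10, s=0.8))   # ~2.41 -> undermatching (s < 1)
print(response_ratio(30, 10, b=2.0))   # 6.0   -> response bias toward the left alternative
```
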
60
Q

Is the matching law a mechanistic or descriptive law of nature?

A

A descriptive law of nature

61
Q

What reinforcement schedule allows for the study of choice with commitment?

A

Concurrent-chain schedules

62
Q

What are the two stages in concurrent-chain and chain schedules of reinforcement?

A

- The choice/initial link
- The terminal link

63
Q

What occurs during the choice link in a concurrent-chain schedule of reinforcement?

A

The participant chooses between two schedule alternatives by making one of two responses

64
Q

What occurs during the terminal link in a concurrent-chain schedule of reinforcement?

A

The participant has the opportunity for reinforcement based on the schedule chosen during the choice link

65
Q

T or F: In a concurrent-chain schedule, once a choice is made in the choice link, no alternative is available except to complete the terminal link.

A

True

66
Q

In a concurrent-chain schedule, when does the response chain "reset"?

A

When the response required by the terminal link is completed

67
Q

In a concurrent-chain schedule, is the reinforcer for the choice link a primary or secondary reinforcer?

A

Secondary

68
Q

In a concurrent-chain schedule, is the reinforcer for the terminal link a primary or secondary reinforcer?

A

Primary

69
Q

T or F: The reinforcer for the choice link is conditioned by the cue (ex. key-light) associated with the terminal link schedule.

A

True

70
Q

T or F: Response patterns in a concurrent-chain schedule depend only on the terminal link schedule, not the initial link schedule.

A

False. Response patterns depend on both the terminal link schedule and the initial link schedule.

71
Q

When the overall rate of reinforcement is the same, do participants prefer variable schedules or fixed schedules?

A

Variable schedules

72
Q

What is the difference between concurrent-chain schedules and chain schedules?

A

- Concurrent-chain schedules: the response made in the initial link determines the type of responding necessary for reinforcement in the terminal link
- Chain schedules: the response made in the initial link provides access to the terminal link, but does not impact the type of responding needed in the terminal link

73
Q

In a chain schedule of reinforcement, what must occur to access the terminal link?

A

The participant must complete the response requirement on the initial link

74
Q

In a chain schedule of reinforcement, what is the result of completing the response requirement on the terminal link?

A

Access to a primary reinforcer

75
Q

T or F: In chain schedules, some rats will continue responding on the initial link to obtain cocaine in the terminal link, even when responding during the initial link is punished with shock.

A

True

76
Q

T or F: Rats with extensive cocaine exposure have lower breaking points when completing a progressive ratio schedule for cocaine delivery.

A

False. They have higher breaking points.

77
Q

Does the subjectivity of choice increase or decrease when the options vary on more than one dimension?

A

Increases

78
Q

Does the value of a reward increase or decrease as a function of the delay before it is obtained?

A

Decreases

79
Q

T or F: An individual's delay discounting rate is closely linked to personality traits.

A

True

80
Q

According to the hyperbolic decay function, is the value of a reinforcer directly or inversely related to the reward magnitude?

A

Directly related

81
Q

According to the hyperbolic decay function, is the value of a reinforcer directly or inversely related to the reward delay?

A

Inversely related

82
Q

The equation V = M / (1 + kD) represents what function?

A

The hyperbolic decay function

83
Q

In the hyperbolic decay function, what does k represent?

A

The discounting rate parameter

84
Q

According to the hyperbolic decay function, the value of the reinforcer equals its magnitude (V = M) when the delay (D) is equal to what?

A

V = M when D = 0

85
Q

T or F: The longer delayed a reinforcer, the larger its value.

A

False. The longer a reinforcer is delayed, the smaller its value.

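A minimal sketch of the hyperbolic decay function V = M / (1 + kD), using hypothetical values for the magnitude M, the discounting rate k, and the delay D:

```python
def reward_value(M, D, k=0.1):
    # Hyperbolic decay: value of a reinforcer of magnitude M delayed by D,
    # with discounting rate parameter k
    return M / (1 + k * D)

for delay in (0, 5, 20, 60):
    print(delay, round(reward_value(M=10, D=delay), 2))
# 0 -> 10.0 (V = M when D = 0), 5 -> 6.67, 20 -> 3.33, 60 -> 1.43:
# the longer the delay, the smaller the value; a larger k makes the drop
# steeper (steeper discounting = more impulsive choice).
```
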
86
Q

In the study by Kim et al., did American or Korean students show steeper delay discounting?

A

American students discounted more steeply than Korean students

87
Q

T or F: Steep delay discounting is considered a measure of impulsive choice.

A

True

88
Q

Do those with substance use disorders discount rewards more or less rapidly than those without?

A

More rapidly