PSYC*2330 Chapter 6: Schedules of Reinforcement and Choice Behaviour Flashcards

1
Q

What is a schedule of reinforcement?

A

A program, or rule, that determines how and when the occurrence of a response will be followed by the delivery of a reinforcer

2
Q

T or F: Schedule effects are highly relevant to motivation of behaviour.

A

True

3
Q

What is more important in determining how motivated a person is: their personality or the schedule of reinforcement in effect?

A

The schedule of reinforcement

4
Q

What is a cumulative record?

A

A graphical representation of how a response is repeated over time

5
Q

What are simple schedules of reinforcement?

A

Schedules in which a single factor determines which instrumental response is reinforced

6
Q

What is continuous reinforcement?

A

A type of reinforcement schedule in which every occurrence of the instrumental response is reinforced

7
Q

What is intermittent/partial reinforcement?

A

A type of reinforcement schedule in which the instrumental response is only reinforced occasionally

8
Q

What are ratio schedules of reinforcement?

A

Schedules in which reinforcement depends on the number of responses the participant performs

9
Q

What is a fixed-ratio schedule of reinforcement?

A

Schedules in which a fixed number of responses leads to reinforcement

10
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady and moderate rate of responding with brief, predictable pauses?

A

Fixed-ratio

11
Q

What is the post-reinforcement pause?

A

A pause in responding just after reinforcement in an FR schedule

12
Q

What is the post-reinforcement pause also known as?

A

The pre-ratio pause

13
Q

What is a ratio run?

A

The high and steady rate of responding that completes each ratio requirement

14
Q

What is ratio strain?

A

A disruption in responding after a ratio requirement is increased too quickly

15
Q

What is a variable-ratio schedule of reinforcement?

A

Schedules in which the number of responses required to obtain a reinforcer varies from one reinforcer to the next

16
Q

T or F: The VR schedule is labelled based on the number of responses required for the first reinforcement.

A

False. Labelled based on the average number of responses per reinforcer.

17
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a steep and steady rate of responding with no pauses?

A

Variable-ratio

18
Q

What are interval schedules of reinforcement?

A

Schedules in which a response is reinforced only if it occurs after a certain amount of time has passed (e.g., since the previous reinforcer)

19
Q

What is a fixed-interval schedule of reinforcement?

A

Schedules in which the amount of time that has to pass to obtain a reinforcer is constant

20
Q

What term refers to the gradual increase in the rate of responding that occurs between successive reinforcements on an FI schedule?

A

The fixed interval scallop

21
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a slower response rate immediately after reinforcement, but gradually increases between trials?

A

Fixed-interval

22
Q

T or F: Performance on FI schedules of reinforcement reflects temporal awareness in animals.

A

True

23
Q

What is a variable-interval schedule of reinforcement?

A

Schedules in which the amount of time that has to pass to obtain a reinforcer varies from one reinforcer to the next

24
Q

Which simple schedule of intermittent reinforcement has a cumulative record that shows a steady rate of responding with no pauses, but is less steep than that of a VR ?

A

Variable-interval

25
Q

What type of schedule of reinforcement involves increasing response requirements for reinforcer delivery over successive sessions?

A

Progressive Ratio

26
Q

What term refers to the last ratio completed in an escalating series on a progressive-ratio schedule?

A

The breaking point

27
Q

What does it mean when there’s a limited hold on a variable-interval schedule of reinforcement?

A

Once the reinforcer becomes available, it remains available only for a limited time; a response must occur within that limited-hold period to be reinforced

28
Q

T or F: Response rate is not simply a function of how many reinforcers can be earned.

A

True

29
Q

When a pigeon on a VI schedule was yoked to a pigeon on a VR schedule, so that both received reinforcers at the same rate, which pigeon showed the higher rate of responding?

A

The VR pigeon

30
Q

Do ratio schedules reinforce long or short inter-response times? Why?

A

Short because the faster the ratio is completed, the faster reinforcement will be provided

31
Q

Do interval schedules reinforce long or short inter-response times? Why?

A

Long, because the longer the time between responses, the more likely it is that the interval has elapsed and the next response will be reinforced

32
Q

Does explaining why VR schedules result in higher response rates than VI schedules in terms of reinforcement of inter-response times take a molecular or molar approach?

A

Molecular

33
Q

What term describes the relationship between response rate and reinforcement rate calculated over an entire experimental session or other extended period of time?

A

A feedback function

34
Q

T or F: Reinforcement is considered to be the feedback of responding.

A

True

35
Q

Is the response rate directly related to the reinforcement rate in ratio or interval schedules?

A

Ratio

36
Q

Which type of schedule has an increasing linear feedback function with no theoretical limit, ratio or interval?

A

Ratio

37
Q

In which type of schedule is there an upper limit to the number of reinforcers that can be earned regardless of increased rates of responding, ratio or interval?

A

Interval
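
Note: the contrast in the last two cards can be illustrated with a minimal Python sketch (the FR 10 requirement and 60-s interval below are assumed example values, not taken from the lecture):

```python
def ratio_feedback(responses_per_hour, ratio_requirement=10):
    """Ratio schedule: reinforcers earned grow linearly with responding, with no ceiling."""
    return responses_per_hour / ratio_requirement

def interval_feedback(responses_per_hour, interval_seconds=60):
    """Interval schedule: a rough approximation in which reinforcers earned
    cannot exceed the number of intervals that elapse in an hour."""
    max_reinforcers_per_hour = 3600 / interval_seconds
    # Responding faster than the schedule sets up reinforcers earns nothing extra.
    return min(responses_per_hour, max_reinforcers_per_hour)

for rate in (30, 60, 120, 240):
    print(rate, ratio_feedback(rate), interval_feedback(rate))
```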

38
Q

Does explaining why VR schedules result in higher response rates than VI schedules in terms of feedback functions take a molecular or molar approach?

A

Molar

39
Q

What is the simplest method of studying choice behaviour?

A

Concurrent schedules

40
Q

Which type of reinforcement schedule is used to observe how behaviour is distributed across available options?

A

Concurrent schedules

41
Q

Which type of reinforcement schedule allows the participant to choose between two or more simple reinforcement schedules that are available simultaneously?

A

Concurrent schedules

42
Q

What does the relative rate of responding in a concurrent schedule describe?

A

How responding is distributed among the available simple-schedule alternatives (i.e., the proportion of total responses made on each one)

43
Q

How is relative rate of responding calculated for each alternative?

A

Divide the rate of responding on that alternative by the total rate of responding across all alternatives
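
Worked example (with assumed counts, not from the lecture): if a pigeon makes 75 responses on the left key and 25 on the right, the relative rate of responding on the left is B_L/(B_L + B_R) = 75/(75 + 25) = 0.75.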

44
Q

In a concurrent schedule with two alternatives, what does a relative rate of responding of 0.5 indicate?

A

Equal distribution of responding on either alternative

45
Q

In a concurrent schedule with two alternatives, what does a relative rate of responding of 1 indicate?

A

All responding was allocated to one alternative

46
Q

T or F: The reinforcement schedule in effect for each alternative in a concurrent schedule has no effect on the relative rates of responding and the relative rates of reinforcement.

A

False. The reinforcement schedule has enormous influence over both.

47
Q

What is the formula B_L/(B_L + B_R) used to calculate?

A

The relative rate of responding

48
Q

In a concurrent schedule with two alternatives, what does a relative rate of reinforcement of 0.5 indicate?

A

Equal distribution of reinforcement on either alternative

49
Q

In a concurrent schedule with two alternatives, what does a relative rate of reinforcement of 1 indicate?

A

All reinforcements were earned from one alternative

50
Q

What is the formula r_L/(r_L + r_R) used to calculate?

A

The relative rate of reinforcement

51
Q

What does the matching law state?

A

The relative rate of responding on a particular response alternative equals the relative rate of reinforcement for that alternative

52
Q

The formulas B_L/(B_L + B_R) = r_L/(r_L + r_R) and B_L/B_R = r_L/r_R are used to indicate what?

A

The matching law
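
Worked example (with assumed values): if the left alternative yields 30 reinforcers per hour and the right yields 10, then r_L/(r_L + r_R) = 30/40 = 0.75, and the matching law predicts B_L/(B_L + B_R) = 0.75, i.e., roughly 75% of all responses should be made on the left alternative.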

53
Q

The generalized matching law formula adds what two parameters to the matching law?

A
  • Response bias
  • Sensitivity
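
For reference, the standard form of the generalized matching law referred to in the following cards is B_L/B_R = b (r_L/r_R)^s, where b is the response-bias parameter and s is the sensitivity parameter; perfect matching corresponds to b = 1 and s = 1.
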
54
Q

What is it called when there is a deviation from perfect matching in which there is less sensitivity to the relative rate of reinforcement than predicted by the matching law?

A

Undermatching

55
Q

How is the matching law accommodated to account for undermatching?

A

By making the exponent s less than 1

56
Q

What are three variables that can influence the sensitivity parameter in the generalized matching law equation?

A
  • Species
  • Effort/difficulty of switching between alternatives
  • How the schedule alternatives are constructed
57
Q

In a concurrent schedule of reinforcement, what is the result of the reinforcer for one response being more attractive than the reinforcer for the other?

A

A response bias

58
Q

In a concurrent schedule of reinforcement, what is the result of response effort being different between alternatives?

A

A response bias

59
Q

How is the matching law accommodated to account for higher responding on one alternative due to a response bias?

A

By inputting a higher value for b

60
Q

Is the matching law a mechanistic or descriptive law of nature?

A

A descriptive law of nature

61
Q

What reinforcement schedule allows for the study of choice with commitment?

A

Concurrent-chain schedules

62
Q

What are the two stages in a concurrent-chain and chain schedule of reinforcement?

A
  • The choice/initial link
  • The terminal link
63
Q

What occurs during the choice link in a concurrent-chain schedule of reinforcement?

A

Participant chooses between two schedule alternatives by making one of two responses

64
Q

What occurs during the terminal link in a concurrent-chain schedule of reinforcement?

A

Participant has the opportunity for reinforcement based on the schedule chosen during the choice link

65
Q

T or F: In a concurrent-chain schedule, once a choice is made in the choice link, no alternative is available except to complete the terminal link.

A

True

66
Q

In a concurrent-chain schedule, when does the response chain “reset”?

A

When the response required by the terminal link is completed

67
Q

In a concurrent-chain schedule, is the reinforcer for the choice link a primary or secondary reinforcer?

A

Secondary

68
Q

In a concurrent-chain schedule, is the reinforcer for the terminal link a primary or secondary reinforcer?

A

Primary

69
Q

T or F: The reinforcer for the choice link is the cue (e.g., a key light) associated with the terminal-link schedule, which serves as a conditioned reinforcer.

A

True

70
Q

T or F: Response patterns in a concurrent-chain schedule depend only on the terminal link schedule, not the initial link schedule.

A

False. Response patterns depend on both the terminal link schedule and the initial link schedule.

71
Q

When the overall rate of reinforcement is the same, do participants prefer variable schedules or fixed schedules?

A

Variable schedules

72
Q

What is the difference between concurrent-chain schedules and chain schedules?

A
  • Concurrent-chain schedules: The response made in the initial link determines the type of responding necessary for reinforcement in the terminal link
  • Chain schedules: The response made in the initial link provides access to the terminal link, but does not impact the type of responding needed in the terminal link
73
Q

In a chain schedule of reinforcement, what must occur to access the terminal link?

A

The participant must complete the response requirement on the initial link

74
Q

In a chain schedule of reinforcement, what is the result of completing the response requirement on the terminal link?

A

Access to the primary reinforcer

75
Q

T or F: In chain schedules, some rats will continue responding on the initial link in order to reach the cocaine available in the terminal link, even when responding during the initial link is punished with shock.

A

True

76
Q

T or F: Rats with extensive cocaine exposure have lower breaking points when completing a progressive ratio schedule for cocaine delivery.

A

False. They have higher breaking points.

77
Q

Does the subjectivity of choice increase or decrease when the options vary on more than one dimension?

A

Increases

78
Q

Does the value of a reward increase or decrease as a function of the delay before it can be obtained?

A

Decreases

79
Q

T or F: An individual’s delay discounting rate is closely linked to personality traits.

A

True

80
Q

According to the hyperbolic decay function, is the value of a reinforcer directly or inversely related to the reward magnitude?

A

Directly related

81
Q

According to the hyperbolic decay function, is the value of a reinforcer directly or inversely related to the reward delay?

A

Inversely related

82
Q

The equation V = M/(1 + kD) represents what function?

A

The hyperbolic decay function

83
Q

In the hyperbolic decay function, what does k represent?

A

The discounting rate parameter

84
Q

According to the hyperbolic decay function, the value of the reinforcer equals the reward magnitude (V = M) when the delay (D) is equal to what?

A

V=M when D=0
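
Note: a minimal Python sketch of the hyperbolic decay function from the previous cards (the magnitude M = 10 and the two k values below are assumed example values, not from the lecture):

```python
def reward_value(M, k, D):
    """Hyperbolic decay function: V = M / (1 + k*D)."""
    return M / (1 + k * D)

# Value of a reward of magnitude 10 at increasing delays,
# for a shallow (k = 0.1) and a steep (k = 1.0) discounter.
for k in (0.1, 1.0):
    print([round(reward_value(10, k, D), 2) for D in (0, 5, 10, 30)])
```

At D = 0 the value equals the magnitude, and a larger k drives the value down faster as delay grows, which is why steep discounting is treated as a measure of impulsive choice in the later cards.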

85
Q

T or F: The longer delayed a reinforcer, the larger its value.

A

False. The longer delayed a reinforcer, the smaller its value.

86
Q

In the study by Kim et al., did American or Korean students show steeper delay discounting?

A

American students discounted more than Korean students

87
Q

T or F: Steep delay discounting is considered a measure of impulsive choice.

A

True

88
Q

Do those with substance use disorders discount rewards more or less rapidly than those without?

A

More rapidly