Chapter 6 Flashcards

1
Q

concurrent-chain schedule of reinforcement

A

A complex reinforcement procedure in which the participant is permitted to choose during the first link which of several simple reinforcement schedules will be in effect in the second link. Once a choice has been made, the rejected alternatives become unavailable until the start of the next trial. Concurrent-chain schedules allow for the study of choice with commitment.

2
Q

concurrent schedule

A

A complex reinforcement procedure in which the participant can choose any one of two or more simple reinforcement schedules that are available simultaneously. Concurrent schedules allow for the measurement of direct choice between simple schedule alternatives.

3
Q

continuous reinforcement (CRF)

A

A schedule of reinforcement in which every occurrence of the instrumental response produces the reinforcer.

4
Q

cumulative record

A

A graphical representation of how a response is repeated over time, with the passage of time represented by the horizontal distance (or x axis), and the total or cumulative number of responses that have occurred up to a particular point in time represented by the vertical distance (or y axis).
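
A minimal Python sketch of how a cumulative record can be plotted from a list of response timestamps (the timestamps below are hypothetical illustration data):

import matplotlib.pyplot as plt

# Hypothetical response times (seconds) recorded in one session.
response_times = [2.1, 5.4, 6.0, 9.8, 10.2, 14.7, 15.1, 15.9, 21.3]

# Cumulative count: the y value at each response is the total number of
# responses made up to that point in time.
cumulative_counts = list(range(1, len(response_times) + 1))

plt.step(response_times, cumulative_counts, where="post")
plt.xlabel("Time (s)")                # horizontal distance = passage of time
plt.ylabel("Cumulative responses")    # vertical distance = total responses so far
plt.show()

A steep slope in the resulting plot corresponds to a high rate of responding; a flat segment corresponds to a pause.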

5
Q

delay discounting

A

Decrease in the value of a reinforcer as a function of how long one has to wait to obtain it.
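
One commonly used formalization of this decrease is the hyperbolic discounting function, where V is the discounted value of the reinforcer, A its undelayed value, D the delay, and k a discounting-rate parameter:

V = A / (1 + k * D)

The larger the value of k, the more steeply the reinforcer loses value as the delay grows.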

6
Q

fixed-interval scallop

A

The gradually increasing rate of responding that occurs between successive reinforcements on a fixed-interval schedule.

7
Q

fixed-interval schedule (FI)

A

A reinforcement schedule in which the reinforcer is delivered for the first response that occurs after a fixed amount of time following the last reinforcer or the beginning of the trial.

8
Q

fixed-ratio schedule (FR)

A

A reinforcement schedule in which a fixed number of responses must occur in order for the next response to be reinforced.

9
Q

intermittent reinforcement

A

A schedule of reinforcement in which only some of the occurrences of the instrumental response are reinforced. The instrumental response is reinforced occasionally, or intermittently. Also called partial reinforcement.

10
Q

inter-response time (IRT)

A

The interval between one response and the next. IRTs can be differentially reinforced in the same fashion as other aspects of behavior, such as response force or response variability.

11
Q

interval schedule

A

A reinforcement schedule in which a certain amount of time is required to set up the reinforcer. A response is reinforced only if it occurs after the reinforcer has been set up.

12
Q

limited hold

A

A restriction on how long a reinforcer remains available. In order for a response to be reinforced, it must occur before the end of the limited-hold period.

13
Q

matching law

A

A rule for instrumental behavior, proposed by R. J. Herrnstein, which states that the relative rate of responding on a particular response alternative equals the relative rate of reinforcement for that response alternative.
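
Expressed as an equation for two response alternatives, with B1 and B2 the rates of responding on the two alternatives and R1 and R2 the rates of reinforcement earned on them:

B1 / (B1 + B2) = R1 / (R1 + R2)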

14
Q

melioration

A

A mechanism for achieving matching by responding so as to improve the local rates of reinforcement for response alternatives.

15
Q

partial reinforcement

A

Same as intermittent reinforcement.

16
Q

post-reinforcement pause

A

A pause in responding that typically occurs after the delivery of the reinforcer on FR and FI schedules of reinforcement.

17
Q

ratio run

A

The high and invariant rate of responding observed after the post-reinforcement pause on FR schedules. The ratio run ends when the ratio requirement has been completed and the participant is reinforced.

18
Q

ratio schedule

A

A schedule in which reinforcement depends only on the number of responses the participant performs, irrespective of when those responses occur.

19
Q

ratio strain

A

Disruption of responding that occurs on ratio schedules when the response requirement is increased too rapidly.

20
Q

schedule of reinforcement

A

A program, or rule, that determines how and when the occurrence of a response will be followed by the delivery of the reinforcer.
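
A minimal Python sketch of how such rules can be expressed, using one ratio-based and one interval-based example (illustrative code only; the function names are made up):

import random

def fixed_ratio(n):
    """FR n: deliver the reinforcer on every nth response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count >= n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def variable_interval(mean_seconds):
    """VI: reinforce the first response made after a variable interval sets up the reinforcer."""
    setup_time = random.expovariate(1 / mean_seconds)
    def respond(t):
        nonlocal setup_time
        if t >= setup_time:   # reinforcer has been set up
            setup_time = t + random.expovariate(1 / mean_seconds)
            return True
        return False
    return respond

# Example: on an FR 10 schedule, the 10th, 20th, and 30th responses are reinforced.
fr10 = fixed_ratio(10)
print(sum(fr10() for _ in range(30)))   # prints 3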

21
Q

undermatching

A

Less sensitivity to the relative rate of reinforcement than predicted by the matching law.
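
In the generalized matching law, sensitivity to the reinforcement ratio is captured by an exponent s (with b a bias term):

log(B1 / B2) = s * log(R1 / R2) + log(b)

Perfect matching corresponds to s = 1; undermatching corresponds to s < 1, meaning that response ratios are less extreme than the reinforcement ratios would predict.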

22
Q

variable-interval schedule (VI)

A

A reinforcement schedule in which reinforcement is provided for the first response that occurs after a variable amount of time from the last reinforcer or the start of the trial.

23
Q

variable-ratio schedule (VR)

A

A reinforcement schedule in which the number of responses required to produce the reinforcer varies from trial to trial. The value of the schedule refers to the average number of responses required for reinforcement.