Chapter 6 Flashcards

0
Q

Schedule effects

A

The distinctive rate and pattern of responding associated with a particular schedule of reinforcement

1
Q

Schedules of reinforcement

A

The manner in which reinforcers for behavior are delivered

2
Q

Continuous reinforcement

A

Schedule in which a target behavior is reinforced every time it occurs
Simplest
Rare in natural environment
Good to start with for training

3
Q

Intermittent schedules

A

Behavior is reinforced on some occasions but not others

4
Q

Fixed ratio schedule

A

A behavior is reinforced after it has occurred a fixed number of times

5
Q

Post reinforcement pauses

A

Drop-offs in the target behavior observed after reinforcement has been delivered on fixed ratio schedules
The more work required for each reinforcement, the longer the pause

6
Q

Run rate

A

The rate at which behavior occurs once it has resumed, following reinforcement

7
Q

Variable ratio schedules

A

The number of behaviors required to earn reinforcement varies around an average
Typically produces more behavior in a given span of time than a comparable fixed ratio schedule, because the organism cannot predict when it will next be reinforced (see the sketch below)
Common in the natural environment
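A minimal Python sketch may make the fixed ratio / variable ratio contrast concrete. The function names, the closure style, and the uniform draw around the mean are illustrative assumptions, not a standard laboratory implementation.

```python
import random

def fixed_ratio(n):
    """FR-n: reinforce every n-th response (hypothetical helper for illustration)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count >= n:
            count = 0
            return True   # deliver a reinforcer
        return False
    return respond

def variable_ratio(mean_n):
    """VR-mean_n: reinforce after a response count that varies around mean_n."""
    count = 0
    required = random.randint(1, 2 * mean_n - 1)  # crude uniform spread around the mean
    def respond():
        nonlocal count, required
        count += 1
        if count >= required:
            count = 0
            required = random.randint(1, 2 * mean_n - 1)
            return True
        return False
    return respond

# FR-5 pays off exactly every 5th response; VR-5 pays off unpredictably,
# on average every 5th response, so the next reinforcer can't be anticipated.
fr5, vr5 = fixed_ratio(5), variable_ratio(5)
print([fr5() for _ in range(15)])
print([vr5() for _ in range(15)])
```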

8
Q

Fixed interval schedules

A

Target behavior is reinforced the first time it occurs after a specific interval of time

9
Q

Post reinforcement pauses (fixed interval schedules)

A

Usually almost no target behavior occurs immediately after reinforcement has been received, followed by a steady increase to a high rate of behavior by the end of the interval

10
Q

Variable interval schedules

A

Length of the interval during which performance is not reinforced varies around some average
Research shows that animals have some preference for particular types of reinforcement schedules even when, over the long run, the amount of work required is about the same
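As with the ratio schedules above, a small sketch can show how an interval schedule only "arms" reinforcement once time has passed, and how the variable version hides when that happens. The names and timing values are assumptions for illustration.

```python
import random

def fixed_interval(interval_s):
    """FI: the first response made at least interval_s seconds after the last
    reinforcer is reinforced; earlier responses earn nothing."""
    last_reinforcer = 0.0
    def respond(now):
        nonlocal last_reinforcer
        if now - last_reinforcer >= interval_s:
            last_reinforcer = now
            return True
        return False
    return respond

def variable_interval(mean_s):
    """VI: same rule, but the required interval varies around mean_s."""
    last_reinforcer = 0.0
    required = random.uniform(0, 2 * mean_s)
    def respond(now):
        nonlocal last_reinforcer, required
        if now - last_reinforcer >= required:
            last_reinforcer = now
            required = random.uniform(0, 2 * mean_s)
            return True
        return False
    return respond

# FI-30: responses at 5 s and 20 s go unreinforced; the first response after
# 30 s (here at 31 s) is reinforced, then the clock starts again.
fi30 = fixed_interval(30)
print([fi30(t) for t in (5, 20, 31, 32, 70)])  # [False, False, True, False, True]
```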

11
Q

Fixed duration schedule

A

Reinforcement is contingent on the continuous performance of the behavior for some constant period of time
Professional athletes have to perform a certain way in order to get their contract payday

12
Q

Variable duration schedule

A

Required period of sustained performance varies around some average amount of time

13
Q

Differential reinforcement of low rate DRL

A

Form of differential reinforcement in which a behavior is reinforced only if it occurs no more than a specified number of times in a given period
The clock resets if the behavior occurs before the time period has elapsed, so premature behavior only delays reinforcement (see the sketch below)
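The clock-reset rule on this card reads like a spaced-responding DRL, so here is a minimal sketch under that reading; the function name and the 10-second value are illustrative, not from the card.

```python
def drl(min_gap_s):
    """Spaced-responding DRL: a response is reinforced only if at least
    min_gap_s seconds have passed since the previous response; responding
    too soon restarts the clock and simply delays the next reinforcer."""
    last_response = None
    def respond(now):
        nonlocal last_response
        earned = last_response is not None and (now - last_response) >= min_gap_s
        last_response = now  # every response restarts the clock
        return earned
    return respond

# DRL-10: only responses spaced at least 10 s apart are reinforced.
schedule = drl(10)
print([schedule(t) for t in (0, 4, 15, 40)])  # [False, False, True, True]
```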

14
Q

Differential reinforcement of high rate DRH

A

Behavior is reinforced only if it occurs at least a specified number of times in a given period
DRH schedules help increase rate of behavior

15
Q

Stretching the ratio

A

Gradually modifying the schedule of reinforcement so as to progressively increase the amount of behavior required to earn reinforcement (see the sketch below)
An important feature of all kinds of training programs
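A toy sketch of stretching the ratio, assuming a simple linear step; the generator name and the parameter values are illustrative. In practice the requirement is typically raised only once responding is stable at the current level.

```python
def stretch_ratio(start=1, step=1, limit=20):
    """Yield a gradually increasing response requirement: FR-start, then
    FR-(start+step), and so on up to FR-limit."""
    requirement = start
    while requirement <= limit:
        yield requirement
        requirement += step

# Usage: train at each requirement until responding is stable, then move on.
# Raising the requirement too abruptly or too far risks ratio strain (next card).
for fr in stretch_ratio(start=1, step=2, limit=9):
    print(f"train on FR-{fr}")
```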

16
Q

Ratio strain

A

A breakdown in the pattern of responding due to stretching the ratio of reinforcement too abruptly or too far

17
Q

Partial reinforcement effect PRE

A

Schedule effect in which behavior that has been maintained on an intermittent schedule is more resistant to extinction than a behavior that has been on continuous reinforcement

18
Q

Discrimination hypothesis

A

After intermittent reinforcement, it is harder to distinguish or discriminate between extinction and intermittent reinforcement than between extinction and continuous reinforcement
VR schedules make it harder to discriminate than FR schedules due to variance

19
Q

Frustration hypothesis

A

Non-reinforcement of previously reinforced behavior is frustrating
Reduction of frustration is negatively reinforcing
Continuous schedules involve no frustration
Intermittent schedules involve frustration, but reinforcement eventually follows responding in the presence of frustration, so frustration becomes a cue for the behavior

20
Q

Sequential hypothesis

A

PRE occurs because of differences in the order of cues during training
During training, a behavior can be followed by either reinforcement or non-reinforcement
In continuous schedules, each behavior is followed by reinforcement, which becomes a cue for behavior (so extinction occurs quickly because an important cue for performance is missing)
In intermittent schedules, the sequence of reinforcement and non-reinforcement becomes the cue for behavior (so extinction is difficult because long strings of unreinforced behavior have previously and reliably preceded reinforcement)

21
Q

Complex schedules

A

Various combinations of simple schedules

22
Q

Multiple schedules

A

Two or more simple schedules in effect, each cued by a particular stimulus

23
Q

Cooperative schedule

A

Reinforcement depends on the behavior of two or more individuals

24
Q

Concurrent schedule

A

Two or more schedules are available to the individual at once

25
Q

Choice and matching law

A

Can we predict behavior based on the schedules of reinforcement that are in place?
After testing out different options, organisms will select the option offering the highest rate of reinforcement
Switching back and forth does not make sense in concurrent schedules when both are fixed ratio
Switching does make sense when the concurrent schedule involves two interval schedules
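The card gives only the qualitative claim. The quantitative form usually cited here is Herrnstein's matching law, added below for reference (the symbols are not defined on the card): relative response rates match relative reinforcement rates,

```latex
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
```

where B_1 and B_2 are the response rates on the two concurrent alternatives and R_1 and R_2 are the reinforcement rates obtained from each.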