Chapter 7: Schedules and Theories of Reinforcement Flashcards

1
Q

Schedule of Reinforcement

A

The response requirement that must be met to obtain reinforcement.

2
Q

Continuous Reinforcement Schedule (or CRF)

A

Each specified response is reinforced.

3
Q

Which reinforcement schedule is helpful when a behavior is first being shaped?

A

Continuous or CRF

4
Q

Intermittent (or Partial) Reinforcement Schedule

A

Only some responses are reinforced.

5
Q

How many types of Intermittent (or Partial) reinforcement schedules are there? What are the different types?

A
  1. Fixed Ratio
  2. Variable Ratio
  3. Fixed Interval
  4. Variable Interval
6
Q

Steady-State Behaviors

A

Stable response patterns that emerge once the organism has had considerable exposure to the schedule.

7
Q

Schedule Effects

A

The different effects on behavior produced by different response requirements.

8
Q

Fixed Ratio (FR) Schedule

A

Reinforcement is contingent upon a fixed, predictable number of responses.

9
Q

FR1 is the same as…

A

Continuous Reinforcement

10
Q

Fixed ratio schedules typically produce a (high/low) rate of response along with a (long/short) pause following the attainment of each reinforcer.

A

high

short

11
Q

Short pause is also known as…

A

Post-reinforcement Pause

12
Q

Why are FR Schedules sometimes referred to as “Break-and-Run”?

A

Because each time a reinforcer is obtained, the organism takes a short break (pause) before running off another steady burst of responses toward the next reinforcer.

13
Q

In Fixed Ratio schedules, _____ ratio requirements result in ____ breaks.

A

high

long

14
Q

An easy FR schedule can be defined as…

A

dense or rich

15
Q

A difficult FR schedule can be defined as…

A

lean

16
Q

Stretching the Ratio

A

Moving a low ratio requirement (dense) to a high ratio requirement (lean)

17
Q

Should “stretching the ratio” be done gradually or quickly? Why?

A

Gradually; if the change is made too quickly, the behavior may become erratic or die out.

18
Q

Ratio Strain

A

A disruption in responding due to an overly demanding response requirement.

19
Q

Ratio strain is more commonly known as…

A

burnout

20
Q

Variable Ratio (VR) Schedule

A

Reinforcement is contingent upon a varying, unpredictable number of responses.

21
Q

FR# means

A

The number (#) of responses needed to obtain reinforcement.

22
Q

VR# means

A

The average number (#) of responses needed to obtain reinforcement.

23
Q

Variable ratio schedules typically produce a (low/high) and (steady/disrupted) rate of response, often with (little or no/long) post-reinforcement pauses.

A

high

steady

little or no

24
Q

What makes gamblers a perfect example for variable ratio schedules?

A

Although gamblers lose significant amounts of money overall, they are reinforced by their intermittent, unpredictable winnings.

25
Q

What makes VR have a high rate of behavior?

A

Its unpredictability.

26
Q

As with an FR schedule, an extremely lean VR schedule can result in _______ ________.

A

ratio strain

27
Q

Fixed Interval (FI) Schedule

A

Reinforcement is contingent upon the first response after a fixed, predictable period of time.

28
Q

FI# means

A

The specific amount of time (#) needed to pass before behavior is reinforced.

29
Q

Fixed interval schedules often produce a _________ pattern of responding, consisting of a post-reinforcement pause followed by a gradually (increasing/decreasing) rate of response as the interval draws to a close.

A

“scalloped”

increasing

30
Q

Variable Interval (VI) Schedules

A

Reinforcement is contingent upon the first response after a varying, unpredictable period of time.

31
Q

VI# means

A

The average amount of time (#) needed to pass before behavior is reinforced.

32
Q

Variable interval schedules usually produce a (low/moderate/high), steady rate of response, often with little or no post-reinforcement pause.

A

moderate

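The four basic intermittent schedules defined in the cards above are simple decision rules, so they can be sketched in code. Below is a minimal, hypothetical simulation (all function names are illustrative, not standard terminology; it assumes the organism responds once per loop step and that one step equals one time unit):

```python
import random

random.seed(0)  # reproducible runs for this sketch

def fr_reinforcers(n, responses):
    """FR-n: every n-th response is reinforced."""
    return sum(1 for r in range(1, responses + 1) if r % n == 0)

def vr_reinforcers(mean_n, responses):
    """VR-n: the required count varies unpredictably, averaging n."""
    draw = lambda: random.randint(1, 2 * mean_n - 1)  # mean = mean_n
    total, count, required = 0, 0, draw()
    for _ in range(responses):
        count += 1
        if count >= required:          # requirement met: reinforce
            total += 1
            count, required = 0, draw()  # new, unpredictable requirement
    return total

def fi_reinforcers(interval, steps):
    """FI-interval: only the first response after the interval elapses is
    reinforced (here the organism responds on every step)."""
    return steps // interval

print(fr_reinforcers(5, 100))   # FR5: 20 reinforcers in 100 responses
print(fi_reinforcers(30, 120))  # FI30: 4 reinforcers in 120 time steps
```

On FR5 there is exactly one reinforcer per five responses; on VR5 the long-run rate is the same, but each individual requirement is unpredictable, which is what sustains the high, steady response rate the cards describe.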
33
Q

(Ratio/Interval) schedules produce higher rates of response because the schedule is entirely ______ contingent.

A

Ratio

response

34
Q

(Fixed/Variable) schedules produce little to no post-reinforcement pause because such schedules often provide the possibility of relatively _______ reinforcement.

A

Variable

immediate

35
Q

Duration Schedule

A

Reinforcement is contingent on performing a behavior continuously throughout a period of time.

36
Q

Fixed Duration (FD) Schedule

A

The behavior must be performed continuously for a fixed, predictable period of time.

37
Q

FD# means

A

The amount of time (#) a behavior must be performed before getting a reinforcement.

38
Q

Variable Duration (VD) Schedule

A

The behavior must be performed continuously for a varying, unpredictable period of time.

39
Q

VD# means

A

The average amount of time (#) a behavior must be performed before getting a reinforcement.

40
Q

How are duration schedules different than interval schedules?

A

Duration schedules require the behavior to be performed continuously for a certain amount of time, whereas interval schedules require only a single response after a certain amount of time has passed.

41
Q

On a pure FI schedule, any response that occurs (during/following) the interval is irrelevant.

A

during

42
Q

On _______ schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has (little/considerable) effect on how soon the reinforcer is obtained.

A

interval

little

43
Q

In general ______ schedules produce post-reinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite (distant/near).

A

fixed

distant

44
Q

What are the three types of response-rate schedules?

A

1) Differential reinforcement of high rates (DRH)
2) Differential reinforcement of low rates (DRL)
3) Differential reinforcement of paced responding (DRP)

45
Q

What is a response-rate schedule?

A

A schedule in which reinforcement is directly contingent upon the organism’s rate of response.

46
Q

What is a “con” of using a duration schedule?

A

Duration schedules can undermine the quality of the behavior, much like being rewarded for participation rather than for quality of performance.

47
Q

Differential Reinforcement of High Rates (DRH)

A

Reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time.
OR
Reinforcement is provided for responding at a fast rate.

48
Q

Differential Reinforcement of Low Rates (DRL)

A

A minimum amount of time must pass between each response before the reinforcer will be delivered.
OR
Reinforcement is provided for responding at a slow rate.

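The DRL rule above amounts to a simple check on the time since the last response. A hypothetical sketch (the function name and the response times are illustrative):

```python
def drl_reinforced(response_times, min_irt):
    """Indices of responses reinforced under DRL: a response earns the
    reinforcer only if at least `min_irt` time units have passed since
    the previous response (the first response has no prior, so it counts)."""
    reinforced, last = [], None
    for i, t in enumerate(response_times):
        if last is None or t - last >= min_irt:
            reinforced.append(i)
        last = t  # every response, reinforced or not, restarts the clock
    return reinforced

# DRL with a 10-unit minimum: the response at t=4 comes too soon after
# t=0, so it goes unreinforced and restarts the clock.
print(drl_reinforced([0, 4, 15, 26], 10))  # → [0, 2, 3]
```

Note how the premature response at t=4 not only misses reinforcement but also delays the next opportunity, which is why DRL selects for slow responding.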
49
Q

Differential Reinforcement

A

One type of response is reinforced while another is not.

50
Q

How do FI schedules differ from DRL schedules?

A

On an FI schedule, responses made during the interval have no effect on reinforcement; on a DRL schedule, responding before the minimum time has elapsed resets the clock and prevents reinforcement.

51
Q

What is an example of a DRL?

A

Teaching a child to brush their teeth slowly so they learn the correct technique and avoid sloppiness.

52
Q

What is an example of a DRH?

A

Racing to the finish line.

53
Q

Differential Reinforcement of Paced Responding (DRP)

A

Reinforcement is contingent upon emitting a series of responses at a set rate.
OR
Reinforcement is provided for responding neither too fast nor too slow.

54
Q

What is an example of DRP?

A

Non-competitive running.

55
Q

On a (VD/VI) schedule, reinforcement is contingent upon responding continuously for a varying period of time; on an (FI/FD) schedule, reinforcement is contingent upon the first response after a fixed period of time.

A

VD

FI

56
Q

As Tessa sits quietly in the doctor’s office, her mother occasionally gives her a hug as a reward. As a result, Tessa is more likely to sit quietly on future visits to the doctor. This is an example of a(n) ________ ________ schedule of reinforcement.

A

variable duration

57
Q

In practicing slow-motion exercise known as tai chi, Tung noticed that the more slowly he moved, the more thoroughly his muscles relaxed. This is an example of (DRL/DRH/DRP).

A

DRL or Differential Reinforcement of Low Rates

58
Q

Non-Contingent Schedule of Reinforcement

A

A response is not required for the reinforcer to be obtained.

59
Q

Non-Contingent Schedule of Reinforcement is also known as

A

Response-Independent Schedules

60
Q

What are the two types of non-contingent reinforcement schedules?

A

1) Fixed Time Schedule

2) Variable Time Schedule

61
Q

Fixed Time (FT) Schedule

A

The reinforcer is delivered following a fixed, predictable period of time, regardless of the organism’s behavior.

62
Q

FT schedules deliver ______ reinforcers.

A

“free”

63
Q

FT# means…

A

After # time passes, the reinforcer is delivered.

64
Q

Variable Time (VT) Schedule

A

The reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism’s behavior.

65
Q

VT# means…

A

After an average of # time passes, the reinforcer is delivered.

66
Q

How do FT/VT schedules differ from FI/VI schedules?

A

FT and VT schedules do not require the organism to emit a specific response as FI and VI schedules do.

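Because FT and VT schedules are response-independent, delivery can be computed from the clock alone. A minimal sketch, assuming discrete time steps (the function name is illustrative):

```python
def ft_deliveries(t, total_steps):
    """FT-t: 'free' reinforcers arrive every t steps, regardless of
    whatever the organism happens to be doing at the time."""
    return [step for step in range(1, total_steps + 1) if step % t == 0]

print(ft_deliveries(30, 100))  # FT30 over 100 steps → [30, 60, 90]
```

Whatever behavior happens to precede each delivery gets accidentally strengthened, which is how the superstition effect described in the following cards can arise.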
67
Q

Non-contingent schedules can accidentally reinforce a behavior performed (before/after) the reinforcer is provided.

A

before

68
Q

A superstition can arise when a behavior is reinforced on a _____________ schedule.

A

non-contingent

69
Q

What are two groups of people who are more prone to developing superstitions? Why?

A

Gamblers and professional athletes, because they associate a specific behavior with a big win and do not want to lose their status.