Schedules / Reinforcement Quiz Flashcards

1
Q

A _____ of reinforcement is the _____ requirement that must be met in order to obtain reinforcement

A

schedule; response

2
Q

On a _____ reinforcement schedule (abbreviated _____), each response is reinforced, whereas on an _____ reinforcement schedule, only some responses are reinforced. The latter is also called a(n) _____ reinforcement schedule

A

continuous; CRF; intermittent; partial

3
Q

Each time you flick the light switch, the light comes on. The behavior of flicking the light switch is a(n) _____ schedule of reinforcement

A

continuous

4
Q

When the weather is very cold, you are sometimes unable to start your car. The behavior of starting your car in very cold weather is a(n) _____ schedule of reinforcement

A

intermittent (partial)

5
Q

_____ are the different effects on behavior produced by different response requirements. These are the stable patterns of behavior that emerge once the organism has had sufficient exposure to the schedule. Such stable patterns are known as _____ behaviors

A

schedule effects; steady-state

6
Q

On a(n) _____ schedule, reinforcement is contingent upon a fixed number of responses

A

fixed ratio

7
Q

A schedule in which 15 responses are required for each reinforcer is abbreviated _____

A

FR 15
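
The ratio logic in the FR cards above can be sketched in a few lines of Python. This is an illustrative simulator of my own, not anything from the textbook: a reinforcer is delivered each time the response count reaches the ratio requirement, and the count then resets.

```python
def run_fixed_ratio(total_responses, ratio):
    """Simulate an FR schedule: a reinforcer is delivered
    after every `ratio`-th response, then the count resets."""
    reinforcers = 0
    count = 0
    for _ in range(total_responses):
        count += 1
        if count == ratio:
            reinforcers += 1
            count = 0  # counter resets after each reinforcer
    return reinforcers

# FR 15: 45 responses earn 3 reinforcers
print(run_fixed_ratio(45, 15))  # 3
# FR 1 reinforces every response, i.e. a continuous (CRF) schedule
print(run_fixed_ratio(10, 1))   # 10
```

Note how FR 1 falls out as continuous reinforcement, matching card 9.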

8
Q

A mother finds that she always has to make the same request three times before her child complies. The mother’s behavior of making requests is on an _____ schedule of reinforcement

A

FR 3

9
Q

An FR 1 schedule of reinforcement can also be called a _____ schedule

A

continuous (CRF)

10
Q

A fixed ratio schedule tends to produce a _____ rate of response, along with a _____

A

high; post-reinforcement pause

11
Q

An FR 200 schedule of reinforcement will result in a _____ pause than an FR 50 schedule

A

longer

12
Q

The typical FR pattern is sometimes called a _____ pattern, with a _____ pause that is followed immediately by a _____ rate of response

A

break-and-run; post-reinforcement pause; high

13
Q

An FR 12 schedule of reinforcement is _____ than an FR 75 schedule

A

denser

14
Q

A very dense schedule of reinforcement can also be referred to as a very _____ schedule

A

rich

15
Q

Over a period of months, Aaron changed from complying with each of his mother’s requests to complying with every other request, then with every third request, and so on. The mother’s behavior of making requests has been subjected to a procedure known as _____

A

stretching the ratio

16
Q

On a variable ratio schedule, reinforcement is contingent upon a _____ of responses

A

varying, unpredictable number

17
Q

A variable ratio schedule typically produces a _____ rate of behavior _____ a postreinforcement pause

A

high; without

18
Q

An average of 1 in 10 people approached by a panhandler actually gives him money. His behavior of panhandling is on a _____ schedule of reinforcement

A

VR 10

19
Q

As with an FR schedule, an extremely lean VR schedule can result in _____

A

ratio strain

20
Q

On a fixed interval schedule, reinforcement is contingent upon the _____ response following a _____ period of _____

A

first; fixed, predictable; time

21
Q

Responding on an FI schedule is often characterized by a _____ pattern of responding consisting of a _____ followed by a gradually _____ rate of behavior as the interval draws to a close

A

scalloped; post-reinforcement pause; increasing

22
Q

On a pure FI schedule, any response that occurs _____ the interval is irrelevant

A

during

23
Q

On a variable interval schedule, reinforcement is contingent upon the _____ response following a _____ period of _____

A

first; varying, unpredictable; time

24
Q

You find that by frequently switching stations on your radio, you are able to hear your favorite song an average of once every 20 minutes. Your behavior of switching stations is thus being reinforced on a _____ schedule

A

VI 20-min

25
Q

In general, variable interval schedules produce a _____ and _____ rate of response with little or no _____

A

moderate; steady; post-reinforcement pause
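
Why response speed matters little on interval schedules can be shown with a small simulation. This is a hypothetical sketch of my own: `intervals` lists the varying wait times, and only the first response after each interval elapses earns a reinforcer.

```python
def run_variable_interval(response_times, intervals):
    """Simulate a VI schedule: only the first response after each
    (varying, unpredictable) interval elapses is reinforced;
    responses emitted before that point earn nothing."""
    reinforcers = 0
    idx = 0
    available_at = intervals[0]  # reinforcer "sets up" at this time
    for t in sorted(response_times):
        if t >= available_at:
            reinforcers += 1          # first response after the interval
            idx += 1
            if idx >= len(intervals):
                break
            available_at = t + intervals[idx]  # next interval starts now
    return reinforcers

intervals = [20, 15]  # e.g., a VI schedule averaging ~17.5 min

slow = list(range(5, 61, 5))            # respond every 5 min
fast = [x / 2 for x in range(5, 121, 5)]  # respond twice as often
print(run_variable_interval(slow, intervals))  # 2
print(run_variable_interval(fast, intervals))  # 2 -- no gain from speed
```

Doubling the response rate yields the same two reinforcers: the reinforcer is largely time contingent, which is the point made in the contrast between ratio and interval schedules.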

26
Q

In general, _____ tend to produce a high rate of response. This is because the reinforcer in such schedules is entirely _____ contingent, meaning that the rapidity with which responses are emitted _____ greatly affect how soon the reinforcer is obtained

A

ratio schedules; response; does

27
Q

On _____ schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has _____ effect on how quickly the reinforcer is obtained

A

interval; little

28
Q

In general, _____ schedules produce little or no postreinforcement pausing because such schedules often provide the possibility of relatively _____ reinforcement, even if one has just obtained a reinforcer

A

variable; immediate

29
Q

In general, _____ schedules produce postreinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite _____

A

fixed; distant

30
Q

On a _____ schedule, reinforcement is contingent upon responding continuously for a varying period of time; on an _____ schedule, reinforcement is contingent upon the first response after a fixed period of time

A

variable duration; fixed interval

31
Q

As Tessa sits quietly, her mother occasionally gives her a hug as a reward. This is an example of a _____ schedule

A

variable duration

32
Q

In practicing the slow-motion form of exercise known as tai chi, Tung noticed that the more slowly he moved, the more thoroughly his muscles relaxed. This is an example of _____ reinforcement of _____ behavior (abbreviated _____)

A

differential; low rates of; DRL

33
Q

On a video game, the faster you destroy all the targets, the more bonus points you obtain. This is an example of _____ reinforcement of _____ behavior (abbreviated _____)

A

differential; high rates of; DRH

34
Q

Frank discovers that his golf shots are much more accurate when he swings the club with a nice, even rhythm that is neither too fast nor too slow. This is an example of _____ reinforcement of _____ behavior (abbreviated _____)

A

differential; paced; DRP

35
Q

On a _____ schedule of reinforcement, a response is not required to obtain a reinforcer. Such a schedule is also called a response _____ schedule of reinforcement

A

noncontingent; independent

36
Q

Every morning at 7:00 a.m. a robin perches outside Marilyn’s bedroom window and begins singing. Given that Marilyn very much enjoys the robin’s song, this is an example of a _____ 24-hour schedule of reinforcement (abbreviated _____)

A

fixed time; FT 24-hour

37
Q

For farmers, rainfall is an example of a noncontingent reinforcer that is typically delivered on a _____ schedule (abbreviated _____)

A

variable time; VT

38
Q

When noncontingent reinforcement happens to follow a particular behavior, that behavior may _____ in strength. Such behavior is referred to as _____ behavior

A

increase; superstitious

39
Q

Herrnstein (1966) noted that superstitious behaviors can sometimes develop as a by-product of _____ reinforcement for some other behavior

A

contingent

40
Q

As shown by the kinds of situations in which superstitious behaviors develop in humans, such behaviors seem most likely to develop on a(n) _____ schedule of reinforcement

A

VT

41
Q

During the time that a rat is responding on a VR 100 schedule, we begin delivering additional food on a VT 60-second schedule. As a result, the rate of response on the VR schedule is likely to _____

A

decrease

42
Q

A child who is often hugged during the course of the day, regardless of what he is doing, is in humanistic terms receiving unconditional positive regard. In behavioral terms, he is receiving a form of _____ social reinforcement.

A

noncontingent

43
Q

As a result, this child may be _____ likely to act out in order to receive attention

A

less

44
Q

A complex schedule is one that consists of _____

A

two or more simple schedules

45
Q

In a(n) _____ schedule, the response requirement changes as a function of the organism’s performance while responding for the previous reinforcer,

A

adjusting

46
Q

while in a(n) _____ schedule, the requirements of two or more simple schedules must be met before the reinforcer is delivered

A

conjunctive

47
Q

To the extent that a gymnast is trying to improve his performance, he is likely on a(n) _____ schedule of reinforcement; to the extent that his performance is judged according to both the form and quickness of his moves, he is on a(n) _____ schedule

A

adjusting; conjunctive

48
Q

A chained schedule consists of a sequence of two or more simple schedules, each of which has its own _____ and the last of which results in a _____

A

discriminative stimulus; terminal reinforcer

49
Q

Within a chain, completion of each of the early links ends in a(n) _____ reinforcer, which also functions as the _____ for the next link of the chain

A

secondary; discriminative stimulus

50
Q

Responding tends to be weaker in the _____ links of a chain. This is an example of the _____ effect in which the strength and/or efficiency of responding _____ as the organism approaches the goal

A

earlier; goal gradient; increases

51
Q

An efficient way to train a complex chain, especially in animals, is through _____ chaining, in which the _____ link of the chain is trained first.

A

backward; last

52
Q

However, this type of procedure usually is not required with verbally proficient humans, with whom behavior chains can be quickly established through _____

A

instructions

53
Q

One suggestion for enhancing our behavior in the early part of a long response chain is to make the completion of each link more _____, thereby enhancing its value as a _____ reinforcer

A

salient; secondary

54
Q

According to drive reduction theory, an event is reinforcing if it is associated with a reduction in some type of _____ drive

A

physiological

55
Q

According to this theory (drive reduction), a _____ reinforcer is one that has been associated with a _____ reinforcer

A

secondary; primary

56
Q

A major problem with drive reduction theory is that _____

A

some behaviors do not seem to be related to a physiological drive

57
Q

The motivation that is derived from some property of the reinforcer is called _____ motivation

A

incentive

58
Q

The Premack principle holds that reinforcers can often be viewed as _____ rather than stimuli. For example, rather than saying that the rat’s lever pressing was reinforced with food, we could say that it was reinforced with _____ food

A

behaviors; eating

59
Q

The Premack principle states that a _____ behavior can be used as a reinforcer for a _____ behavior

A

high probability; low probability

60
Q

According to the Premack principle, if you crack your knuckles 3 times per hour and burp 20 times per hour, the opportunity to _____ can probably be used as a reinforcer for _____

A

burp; cracking your knuckles

61
Q

If you drink five soda pops each day and only one glass of orange juice, then the opportunity to drink _____ can probably be used as a reinforcer for drinking _____

A

soda; orange juice
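
The two Premack examples above reduce to a simple comparison of baseline rates. A hypothetical helper, with names of my own choosing, makes that explicit:

```python
def premack_reinforcer(behavior_a, behavior_b):
    """Per the Premack principle, the higher-probability behavior
    (the one with the higher baseline rate) can serve as a reinforcer
    for the lower-probability one. Each argument is a (name, rate)
    pair; returns (reinforcer, behavior_to_be_reinforced)."""
    hpb, lpb = sorted([behavior_a, behavior_b],
                      key=lambda pair: pair[1], reverse=True)
    return hpb[0], lpb[0]

# Burping (20/hr) can reinforce knuckle cracking (3/hr)
print(premack_reinforcer(("burping", 20), ("knuckle cracking", 3)))
# ('burping', 'knuckle cracking')

# Soda (5/day) can reinforce drinking orange juice (1/day)
print(premack_reinforcer(("soda", 5), ("orange juice", 1)))
# ('soda', 'orange juice')
```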

62
Q

If “Chew bubble gum –> Play video games” is a diagram of a reinforcement procedure based on the Premack principle, then chewing bubble gum must be a _____ probability behavior than playing video games

A

lower

63
Q

What is Grandma’s rule, and how does it relate to the Premack principle?

A

First you work (LPB), then you play (HPB)

64
Q

According to the response deprivation hypothesis, a response can serve as a reinforcer if free access to the response is _____ and its frequency then falls _____ its baseline level of occurrence

A

restricted; below

65
Q

If a child normally watches 4 hours of television per night, we can make television watching a reinforcer if we restrict free access to the television to _____ than 4 hours per night

A

less

66
Q

The response deprivation hypothesis differs from the Premack principle in that we need only know the baseline frequency of the _____ behavior

A

reinforcing

67
Q

Kaily typically watches television for 4 hours per day and reads comic books for 1 hour per day. You then set up a contingency whereby Kaily must watch 4.5 hours of television each day in order to have access to her comic books.

A

(continued on the next card)

68
Q

According to the Premack principle, this will likely be an _____ contingency. According to the response deprivation hypothesis, this would be an _____ contingency

A

ineffective; effective
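
The contrast between the two predictions for Kaily can be written out explicitly. These predicate names, and the 0.5-hour figure for comic access under the contingency, are illustrative assumptions of my own, not from the source:

```python
def premack_predicts_effective(target_rate, reward_rate):
    """Premack principle: the contingency works only if the reward
    behavior's baseline probability exceeds the target behavior's."""
    return reward_rate > target_rate

def deprivation_predicts_effective(reward_baseline, reward_access):
    """Response deprivation hypothesis: the contingency works if access
    to the reward behavior falls below its baseline level."""
    return reward_access < reward_baseline

# Kaily: TV baseline 4 hr/day (target), comics baseline 1 hr/day (reward).
# Requiring 4.5 hr of TV before comics pushes comic reading below
# its 1 hr baseline (assume it drops to roughly 0.5 hr/day).
print(premack_predicts_effective(target_rate=4, reward_rate=1))
# False -- comics are the lower-probability behavior
print(deprivation_predicts_effective(reward_baseline=1, reward_access=0.5))
# True -- comic access is restricted below baseline
```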

69
Q

According to the behavioral _____ approach, an organism that _____ engage in alternative activities will distribute its behavior in such a way as to _____ the available reinforcement

A

bliss point; can freely; optimize

70
Q

Contingencies of reinforcement often _____ the distribution of behavior such that it is _____ to obtain the optimal amount of reinforcement

A

disrupt; impossible