Exam 3 Re-Do Flashcards

1
Q

E.L. Thorndike’s studies of learning started as an attempt to understand _______.

A

animal intelligence

2
Q

The law of effect says that _____.

A

behavior is a function of its consequences

3
Q

The training procedure Thorndike used in his famous experiments with cats is best described as _____.

A

discrete trial

4
Q

The free operant procedure is most associated with ______.

A

Skinner

5
Q

Studies of delayed reinforcement document the importance of _______.

A

contiguity

6
Q

The level of deprivation is less important when the reinforcer used is a/an ______ reinforcer.

A

secondary

7
Q

The one thing that all reinforcers have in common is that they ______.

A

strengthen behavior

8
Q

All of the following are useful tips for shaping behavior except _____.

A

never back up

9
Q

Shaping is the reinforcement of successive _________.

A

approximations of a desired behavior

10
Q

Schlinger and Blakely found that the reinforcing power of a delayed reinforcer could be increased by ______.

A

preceding the reinforcer with a stimulus

11
Q

The reappearance of previously effective behavior during extinction is called _____.

A

resurgence

12
Q

Negative reinforcement is also called ___________.

A

escape-avoidance training

13
Q

Thorndike plotted the results of his puzzle box experiments as graphs. The resulting curves show a/an ______ with succeeding trials.

A

decrease in time

14
Q

Operant learning is sometimes called ________ learning.

A

instrumental

15
Q

Clark Hull’s explanation of reinforcement assumes that reinforcers ______.

A

reduce a drive

16
Q

Money is a good example of a _____ reinforcer.

A

generalized

17
Q

Often the initial effect of an extinction procedure is an increase in the behavior, called a/an extinction _______.

A

burst

18
Q

According to ______ theory, schoolchildren are eager to go to recess because they have been deprived of the opportunity to exercise.

A

response deprivation

19
Q

Resurgence may help account for _____.

A

regression

20
Q

_____ is a neurotransmitter that seems to be important in reinforcement.

A

Dopamine

21
Q

John spent his summer picking cantaloupes for a farmer. The farmer paid John a certain amount for every basket of cantaloupes picked. John worked on a _______.

A

fixed ratio schedule

22
Q

The schedule to use if you want to produce the most rapid learning of new behavior is _____.

A

CRF

23
Q

Bill spends his summer in the city panhandling. Every day he takes a position on a busy corner and accosts passersby saying, “Can you spare some change?” Most people ignore him, but every now and then someone gives him money. Bill’s reinforcement schedule is best described as a ______.

A

variable ratio schedule

24
Q

Refer to George’s Pigeons. George is using a procedure called ______.

A

stretching the ratio

25
Q

Refer to George’s pigeons. Things are going pretty well for George until he jumps from reinforcing every tenth response to reinforcing every 50th response. At this point, the pigeon responds erratically and nearly stops responding entirely. George’s pigeon is suffering from ____.

A

ratio strain

26
Q

Stanley wants to determine which of two reinforcement schedules is more attractive to rats. He trains a rat to press a lever for food, and then puts the rat into an experimental chamber containing two levers. Pressing one lever produces reinforcement on an FR 10 schedule; pressing the other lever produces reinforcement on an FI 10” schedule. Lever pressing is on a ______.

A

concurrent schedule

27
Q

Refer to pigeon study. You predict that the bird will peck_____.

A

the red disk about twice as often as the green disk.

28
Q

Refer to pigeon study. The principle that allows you to predict the behavior of the pigeon is called the ______.

A

matching law

29
Q

A reduction in response rate following reinforcement is called a _____.

A

postreinforcement pause

30
Q

The schedule that is likely to produce a cumulative record with scallops is the _____.

A

FI schedule

31
Q

One explanation for the PRE implies that the effect is really an illusion. This is the _____.

A

response unit hypothesis

32
Q

CRF is synonymous with ____.

A

FR 1

33
Q

In schedules research, VD stands for _____.

A

variable duration

34
Q

Shirley trains a rat to press a lever and then reinforces lever presses on an FR 10 schedule when a red light is on, and an FI 10” schedule when a green light is on. In this case, lever pressing is on a ____.

A

multiple schedule

35
Q

Studies of choice involve _____.

A

concurrent schedules

36
Q

The study of reinforcement schedules suggests that the behavior we call stick-to-itiveness is largely the product of ____.

A

reinforcement history

37
Q

Your text reports the case of a man who apparently made hundreds of harassing phone calls. The man’s behavior was most likely on a/an _____.

A

VR schedule

38
Q

A schedule that does not require the performance of a particular behavior is the _____.

A

FT schedule

39
Q

In a _____ schedule, reinforcement is contingent on the continuous performance of a behavior for some period of time.

A

fixed duration

40
Q

_____ is an excellent schedule for producing a high rate of behavior.

A

DRH

41
Q

Of the following explanations, the one that is most satisfactory from the standpoint of science is ____.

A

Harry has stick-to-itiveness because his parents praised him for sticking with projects

42
Q

Operant learning may also be referred to as _____.

A

instrumental learning

43
Q

All of the following are recognized kinds of reinforcers except ____.

A

classical

44
Q

Thorndike emphasized that we learn mainly from ____.

A

success

45
Q

T/F One everyday example of a VR schedule is the lottery.

A

True

46
Q

T/F When a response is placed on extinction, there is often an increase in emotional behavior.

A

True

47
Q

T/F One effect of the extinction procedure is an increase in the variability of behavior.

A

True

48
Q

T/F Unexpected reinforcers produce more dopamine than expected reinforcers.

A

True

49
Q

T/F Negative reinforcement and punishment are synonyms.

A

False

50
Q

T/F Skinner generally used a free operant procedure in his research.

A

True