Principles of Operant Conditioning Flashcards

1
Q

The principles of operant conditioning were first described by __________________ and subsequently expanded upon by __________________.

A
  • Edward Thorndike
  • B.F. Skinner
2
Q

______________ is best known for his studies involving placing hungry cats in “puzzle boxes” that required them to make a particular response to escape the box and obtain food.

A

Thorndike.

3
Q

Thorndike’s observations led him to conclude that learning is due to the connections that develop between responses and stimuli as a result of ______________________. He referred to this phenomenon as _____________________.

A
  • Trial-and-error
  • Instrumental learning
4
Q

Thorndike developed several laws of learning; the most important of these - _____________________ - states that any response that is followed by a “satisfying state of affairs” is likely to be repeated.

A

Law of Effect.

Bonus: He originally postulated that responses followed by an “annoying state of affairs” would be less likely to occur, but ultimately found that evidence did not support this assumption.

5
Q

____________________ believed that most complex behaviors are voluntarily emitted or not emitted as the result of the way they “operate” on the environment (i.e., as the result of the consequences that follow them); he referred to this as __________________.

A
  • B. F. Skinner
  • Operant conditioning
6
Q

For Skinner, positive and negative do not refer to “good” or “bad”; positive refers to the _______________ of a stimulus, and negative refers to ________________ or ______________ a stimulus.

A
  • Application
  • Removing
  • Withholding
7
Q

Fill in the blanks:

_________________ = Stimulus Applied > Behavior Increases

_________________ = Stimulus Applied > Behavior Decreases

_________________ = Stimulus Removed > Behavior Increases

_________________ = Stimulus Removed > Behavior Decreases

A
  • Positive Reinforcement
  • Positive Punishment
  • Negative Reinforcement
  • Negative Punishment
8
Q

By definition, _______________ increases the behavior it follows.

A

Reinforcement.

9
Q

_______________ naturally decreases the behavior it follows.

A

Punishment.

10
Q

__________________ occurs when reinforcement is consistently withheld from a previously reinforced behavior to decrease or eliminate that behavior.

A

Operant extinction.

11
Q

If a rat has been reinforced for bar-pressing, sudden withdrawal of reinforcement will initially cause the rat to bar-press more than usual before bar-pressing begins to decline; this is referred to as an _____________________.

A

Extinction (response) burst.

12
Q

_____________________: When a subject has been reinforced for two different behaviors and reinforcement for one behavior is withdrawn in order to extinguish it, the other behavior is likely to increase.

A

Behavioral contrast.

13
Q

_________________ (unconditioned) reinforcers are inherently desirable and do not depend on experience to acquire their reinforcing value.

A

Primary.

14
Q

_____________________ (conditioned) reinforcers acquire their value only through repeated association with primary reinforcers.

A

Secondary.

15
Q

When a secondary reinforcer is paired with several different primary reinforcers, it’s called a _________________. Example: Money.

A

Generalized secondary reinforcer.

16
Q

In general, the rate of acquisition of a behavior is fastest when the behavior is reinforced on a __________________; that is, when reinforcement is presented after each response.

A

Continuous schedule.

17
Q

Because satiation and rate of extinction are also high on a continuous schedule, once an operant behavior has been acquired, the best way to maintain the behavior is to switch to an _________________ schedule.

A

Intermittent (partial).

18
Q

Skinner distinguished among ___ intermittent schedules.

A

4.

19
Q

_____________________: Reinforcement is delivered after a fixed period of time regardless of the number of responses made. This schedule tends to produce a low rate of responding since the number of responses is unrelated to the delivery of reinforcement.

A

Fixed-interval (FI).

20
Q

________________________: The interval of time between delivery of reinforcers varies in an unpredictable manner from interval to interval. This schedule produces a steady but relatively low rate of response.

A

Variable Interval (VI).

21
Q

___________________: A reinforcer is delivered each time the subject makes a specific number of responses. This schedule produces a relatively high, steady rate of responding, usually with a brief pause following delivery of the reinforcer.

A

Fixed ratio (FR).

22
Q

____________________: Reinforcers are provided after a variable number of responses. Because the relationship between responding and reinforcement is unpredictable, these schedules produce the highest rates of responding as well as responses that are most resistant to extinction. Example: Slot machines.

A

Variable ratio (VR).

23
Q

The correspondence between the rate of responding to two or more alternatives and the frequency of reinforcement for each alternative is predicted by the ___________________. If a rat receives reinforcement on a VI-30 schedule for pressing one lever and a VI-60 schedule for pressing another, it will press the VI-30 lever twice as often.

A

Matching law.
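
A quick worked version of the example in the question (a minimal sketch, assuming the standard two-alternative form of Herrnstein's matching law, under which relative response rates match relative reinforcement rates):

\[
\frac{B_{\text{VI-30}}}{B_{\text{VI-60}}}
= \frac{r_{\text{VI-30}}}{r_{\text{VI-60}}}
= \frac{1/30}{1/60}
= \frac{2}{1}
\]

Here B is the rate of lever-pressing on each lever and r is the rate of reinforcement each lever provides. Because a VI-30 schedule makes reinforcement available twice as often as a VI-60 schedule (once per 30 time units versus once per 60, on average), matching predicts twice as many presses on the VI-30 lever, whatever the time units are.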

24
Q

Skinner found that accidental, noncontingent reinforcement can lead to ____________________.

A

Superstitious behavior.

25
Q

A pigeon is reinforced for pecking a key when a green light is on, but not when a red light is on. The green light is the _________________ (SD) - it signals that reinforcement will occur as the consequence of a given response. The red light is a _________________, or S-delta stimulus (S-) - it signals that the response will not be reinforced.

A
  • Positive discriminative stimulus.
  • Negative discriminative stimulus.
26
Q

When the occurrence of a behavior is affected by the presence of discriminative stimuli, the behavior is said to be under ____________________, an example of two-factor learning. Performance of the target behavior is due to operant conditioning; performance of the behavior in the presence of the positive discriminative stimulus (but not the negative discriminative stimulus) is the result of discrimination training (classical conditioning).

A

Stimulus Control.

27
Q

In operant conditioning, the stimuli that evoke response generalization are the _____________________.

A

Positive discriminative stimuli.

28
Q

Escape and avoidance behaviors are behaviors that are maintained by __________________.

A

Negative reinforcement.

29
Q

_______________ Conditioning: A behavior increases because its performance allows the organism to __________ an undesirable (aversive) stimulus.

A

Escape.

30
Q

______________ Conditioning: The result of two-factor learning, ____________ conditioning; the onset of the negative reinforcer is preceded by a cue (positive discriminative stimulus) that signals that the negative reinforcer is about to be applied.

A

Avoidance.