OPERANT CONDITIONING Flashcards

1
Q

Thorndike

A

Classical Conditioning: useful for understanding how certain stimuli automatically evoke reflexes and other relatively simple responses.

**Operant Conditioning:** identifies the factors responsible for the acquisition and maintenance of complex voluntary behaviors, e.g., why we study for exams, how a child learns to ride a bike.

Studied first by Thorndike and then by B. F. Skinner.

2
Q

Thorndike, cont.

A

Operant Conditioned Learning

Learning: connectionism, i.e., the connections that develop between responses and stimuli as the result of trial-and-error.

  • because these behaviors were instrumental in helping the animals achieve a goal, Thorndike called this instrumental learning

Law of Effect:

any response that is followed by a satisfying state of affairs is likely to be repeated.

although responses followed by negative consequences often have little to no effect on future behavior.

3
Q

B. F. Skinner

A

Skinner defined the term operant as:

  • “complex behaviors are voluntarily emitted or not emitted as the result of the way the behaviors ‘operate’ on the environment,” i.e., behaviors are conditioned as the result of the consequences that follow the behaviors.
4
Q

Reinforcement and Punishment

A

The environment provides organisms with a variety of positive and negative consequences that cause them to either display or withhold the behaviors that preceded those consequences.

Consequences:

Reinforcement or Punishment…the behavior increases or decreases.

Positive or Negative…a stimulus is provided or withheld/removed.

  1. **Positive Reinforcement:**
    1. cat pulls string and gets food…cat pulls string more…bx increases with application of a reinforcer.
  2. **Negative Reinforcement:**
    1. pressing lever stops electric shock…bx of pressing lever increases to take away the shock.
    2. behavior increases as a result of the withdrawal of a stimulus/reinforcer.
  3. **Positive Punishment:**
    1. application of a stimulus following a response decreases the behavior.
    2. hit the dog for chewing shoes…the hit (applied stimulus) decreases future chewing.
  4. **Negative Punishment:**
    1. removal of a stimulus following a behavior decreases the behavior.
    2. take away recess due to aggression, in order to decrease aggression.

Operant Strength is measured by:

a. rate of responding during acquisition trials
b. total number of responses made during extinction trials

5
Q

**Operant Extinction**

Behavioral Contrast

A

Operant Extinction:

  • when reinforcement is consistently withheld from a previously reinforced behavior to decrease or eliminate the behavior (attention for crying).
  • withdrawal of a reinforcer does not usually cause an immediate cessation of the response
  • response disappears gradually after an initial phase in which responding is more variable and forceful
  • a rat presses a bar for food; take away the food and the rat shows an EXTINCTION BURST of increased bar pressing before the response stops/decreases.

Behavioral Contrast:

  1. when a subject is reinforced for two different behaviors and reinforcement for one behavior is withdrawn (extinction), the other behavior is likely to increase.
  2. rat with two buttons: stop giving food for one button, and it will press the other one more. duh!
6
Q

Primary vs. Secondary Reinforcement

A

Primary (Unconditioned) Reinforcers:

  • inherently desirable and do not depend on experience to acquire their reinforcing value
  • food/water

**Secondary (Conditioned) Reinforcers:**

  • acquire their value only through repeated association with primary reinforcers
  • tokens, applause, gold stars
  • when a secondary reinforcer is paired with several different primary reinforcers, it’s called a generalized secondary reinforcer…money!
7
Q

Schedules of Reinforcement

Fixed Interval

Variable Interval

Fixed Ratio

Variable Ratio

A

Continuous Schedule:

  • rate of acquisition of a behavior is fastest when reinforcement is presented after each response.
  • BUT satiation and extinction are also high for continuous schedules, so the best way to maintain the behavior is to switch to an intermittent (partial) schedule.

**Intermittent Schedules:**

FI (Fixed Interval)…TIME:

  1. reinforcement provided on a fixed time schedule (pay every two weeks) (produces minimal levels of work)
  2. tend to produce low rates of responding since the number of responses is UNRELATED to the delivery of reinforcement.
  3. subjects typically stop responding right after a reinforcer is given, but respond again toward the end of the interval…giving a ‘scalloped’ graph of responses.

VI (Variable Interval)…TIME:

  1. interval of time between delivery of reinforcers varies in an unpredictable manner from interval to interval.
  2. steady but relatively low rate of responses.
  3. like 5 pop quizzes over 10 weeks, but not evenly/routinely spaced in time.

FR (Fixed Ratio)…Frequency/#:

  • reinforcer is delivered each time the subject makes a specific number of responses.
  • food for every 3rd press…an explicit ratio of responses to reinforcement.
  • produces a relatively high, steady rate of responding.
  • piecework: clothing workers get paid after completing a specific number of units.

VR (Variable Ratio)…Frequency/#:

  • reinforcers are provided after a variable number of responses.
  • the relationship between responses and reinforcement is unpredictable!
  • VR schedules give the highest rates of responding as well as the responses most resistant to extinction. focus is on behavior!
  • gambling on slot machines!
8
Q

Matching Law

A

Matching Law:

  • concurrent schedules of reinforcement provide two or more simultaneous and independent schedules of reinforcement…each for a different response.
  • the correspondence between responding to two or more alternatives and the frequency of reinforcement for responding is predicted by the matching law.
  • the animal adapts (matches) its responding to the relative frequency of reinforcement on each alternative (see the formula sketch below).
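
A minimal formula sketch of the matching law in its standard (Herrnstein) form; the card does not give the equation, and the symbols (B for response rates, R for reinforcement rates) and the example numbers are assumptions added for illustration:

```latex
\documentclass{article}
\begin{document}
% Matching law: the proportion of responses allocated to alternative 1
% matches the proportion of reinforcement obtained from alternative 1.
% B_i = response rate on alternative i, R_i = reinforcement rate (assumed symbols).
\[
  \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
\]
% Hypothetical worked example: if alternative 1 delivers 60 reinforcers/hour
% and alternative 2 delivers 20 reinforcers/hour, the law predicts the animal
% allocates 60 / (60 + 20) = 0.75, i.e., about 75% of its responses, to alternative 1.
\end{document}
```

In words: responding is distributed across the alternatives in proportion to the reinforcement each one provides.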
9
Q

Superstitious Behavior

Stimulus Control

A

Superstitiousness:

  1. accidental, noncontingent reinforcement can lead to superstitious behavior.
  2. e.g., a food pellet is delivered every 15 seconds regardless of the behavior being performed
  3. the pigeons repeated whatever behavior they ‘thought’ got them the food!

Stimulus Control

  • whether a response will be reinforced may be signaled by cues in the environment.
  • green light, peck key, food. red light, peck key, no food.
  • the positive discriminative stimulus (SD) is the green light
  • the negative discriminative stimulus (S-, or S-delta) is the red light
  • when a behavior is affected by the presence of discriminative stimuli, it is said to be under stimulus control.
  • the result of discrimination training (as in classical conditioning).
  • baby whines with dad (SD), who picks him up, but does not whine with mom (S-delta), who leaves him alone in the crib when he whines.
10
Q

Stimulus & Response Generalization

A

As with classical conditioning, in operant conditioning:

Stimulus Generalization:

  • similar stimuli elicit the same response
  • the stimuli that evoke the response are positive discriminative stimuli
  • an animal pecks the green light for food, but also pecks when the light is blue.

Response Generalization:

  1. reinforcement of a response not only increases the occurrence of that specific response but also the frequency of similar responses.
  2. a child who says ‘dada’ with father gets lots of attention, so the child starts producing close approximations (‘baba’, ‘gaga’, ‘mama’) in the presence of father.
11
Q

Escape and Avoidance Conditioning

A

Escape/Avoidance behaviors are maintained by negative reinforcement.

Escape Conditioning:

  • behavior increases because its performance allows the animal to escape a negative reinforcer.
  • a rat may escape a shock applied to the floor by pressing a lever.
  • lever pressing increases because it stops the shock

Avoidance Conditioning:

  1. two-factor learning (like stimulus control)
  2. onset of the negative reinforcer is preceded by a cue (a positive discriminative stimulus) that signals that the negative reinforcer is about to come.
  3. a rat learns that a green light signals shock (classical conditioning), and it can avoid the shock by jumping over a hurdle (negative reinforcement).
  4. green light, jump hurdle, no shock!