Chap 8-10 Flashcards

1
Q

What is the principle of operant extinction?

A

If an individual emits a previously reinforced behavior in a situation where it is no longer reinforced, then that person is less likely to produce the same behavior when encountering a similar situation

2
Q

How does the environment play a role in extinction?

A
  1. We must control the availability of reinforcers in the environment and choose a setting that maximizes our chances for success
3
Q

How can extinction be used to generate desired behaviors?

A

The procedure of extinction can be paired with the reinforcement of an alternative desirable behavior

4
Q

What is continuous reinforcement? What is intermittent reinforcement?

A

Continuous: Reinforcement is given after each instance of the target behavior
Intermittent: Reinforcement is given only occasionally following the target behavior

5
Q

What is an extinction burst? Can it be avoided?

A

There may be a temporary increase in the undesirable behavior before the behavior ceases
Pairing extinction with the positive reinforcement of an alternative behavior decreases the likelihood of an extinction burst

6
Q

What is spontaneous recovery?

A

Spontaneous recovery occurs when the extinguished behavior reappears following a break; additional extinction sessions can resolve this problem

7
Q

What are some concerns when using extinction?

A
  1. Sometimes desirable behaviors are unintentionally extinguished
  2. If we forget to reinforce a desirable behavior, that behavior ceases to be maintained

8
Q

What is shaping?

A

Using reinforcement for successive approximations of a behavior while also introducing extinction of earlier approximations as progression occurs.
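
Below is a minimal Python sketch of this idea, purely illustrative and not from the text; the mileage numbers and the criterion-raising rule are assumptions. Reinforcement is delivered when the current attempt meets the criterion, and the criterion is then raised so that earlier, easier approximations go on extinction.

```python
def shaping_session(attempts, criterion=1, step=2, target=26):
    """Reinforce successive approximations toward a target (e.g. miles run)."""
    for miles in attempts:
        if miles >= criterion:
            print(f"Ran {miles} mi (criterion {criterion} mi): reinforce")
            criterion = min(criterion + step, target)  # raise the bar
        else:
            # earlier, easier approximations are no longer reinforced
            print(f"Ran {miles} mi (criterion {criterion} mi): no reinforcer")

shaping_session([1, 2, 4, 5, 8, 10])
```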

9
Q

Give an example of when shaping might be used

A
  1. A person training for a marathon might use shaping to reinforce each running milestone: first 1 mile, then gradually increasing to 10 miles, 15 miles, and so on
  2. Shaping is often used by parents when teaching their babies to say certain words: first they reinforce “ba”, then “baba”, then “balloon”
10
Q

What are factors that influence shaping?

A
  1. Specify the target behavior
  2. Choose an appropriate starting behavior
  3. Choose appropriate successive steps to reach the target behavior
11
Q

What are the benefits of intermittent over continuous reinforcement?

A
  1. The reinforcer remains effective longer since it is delivered less often and over a longer period of time
  2. Behaviors take longer to extinguish since reinforcement does not follow every response
  3. The target behavior is more likely to persist in the natural environment
12
Q

What is a fixed-ratio schedule? Give an example

A

Reinforcement is given after a fixed number of responses is emitted
For example: earning $2 for every 50 flyers handed out, or earning a sticker for every 2 math problems completed
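
A minimal Python sketch, illustrative only; the FR-50 requirement and the $2 amount simply mirror the flyer example above. Every 50th response produces the reinforcer.

```python
def fixed_ratio(ratio):
    """Return a respond() function that reinforces every `ratio`-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        return count % ratio == 0  # True means reinforcement is delivered
    return respond

hand_out_flyer = fixed_ratio(50)
earnings = sum(2 for _ in range(200) if hand_out_flyer())  # $2 per 50 flyers
print(earnings)  # 200 flyers -> 4 reinforcements -> 8
```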

13
Q

What is a variable-ratio schedule? Give an example

A

Reinforcement is given after a number of responses that changes from one reinforcement to the next
For example: using a slot machine at a casino
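
In the same hypothetical style, a VR-10 rule: the response requirement is redrawn after every reinforcement, so payouts are unpredictable but average one per 10 responses. The mean of 10 and the uniform redraw are assumptions for illustration.

```python
import random

def variable_ratio(mean_ratio, seed=0):
    """Reinforce after a response count that changes from one reinforcement to the next."""
    rng = random.Random(seed)
    required = rng.randint(1, 2 * mean_ratio - 1)
    count = 0
    def respond():
        nonlocal required, count
        count += 1
        if count >= required:
            count = 0
            required = rng.randint(1, 2 * mean_ratio - 1)  # redraw the requirement
            return True
        return False
    return respond

pull_lever = variable_ratio(10)
print(sum(pull_lever() for _ in range(1000)))  # roughly 100 payouts, unpredictably spaced
```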

14
Q

What are the main differences between fixed and variable ratio schedules?

A
  1. A fixed-ratio requirement cannot be increased as smoothly as a variable-ratio requirement without producing ratio strain
  2. Responding can be maintained at a higher ratio value on a variable-ratio schedule than on a fixed-ratio schedule
  3. A variable-ratio schedule is more resistant to extinction
15
Q

What is a fixed-interval schedule? Give an example

A

Reinforcement is given for the first response after a fixed amount of time that stays constant from one reinforcement to the next
For example: getting the mail at 10 am every day, or watching a new episode of your favorite show that only airs at 6 pm on Thursdays
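
A small illustrative sketch (the 60-minute interval is an assumption): only the first response after the interval elapses is reinforced, and responses made too early earn nothing.

```python
def fixed_interval(interval):
    """Reinforce the first response after each fixed interval of time (in minutes)."""
    available_at = interval
    def respond(t):
        nonlocal available_at
        if t >= available_at:
            available_at = t + interval  # start timing the next interval
            return True                  # e.g. the mail was there
        return False                     # checked too early
    return respond

check_mail = fixed_interval(60)
print([check_mail(t) for t in (10, 40, 65, 70, 130)])  # [False, False, True, False, True]
```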

16
Q

What is a variable-interval schedule? Give an example

A

Reinforcement is given for the first response after an interval of time, where the length of the interval changes from one reinforcement to the next
For example: checking your email, or how long you must wait for a fish to bite when you go fishing
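
A matching sketch for the variable-interval case (the mean wait of 30 minutes is assumed): the wait before reinforcement becomes available is redrawn each time, so checking pays off at unpredictable moments.

```python
import random

def variable_interval(mean_interval, seed=0):
    """Reinforce the first response after a wait whose length changes each time."""
    rng = random.Random(seed)
    available_at = rng.uniform(0, 2 * mean_interval)
    def respond(t):
        nonlocal available_at
        if t >= available_at:
            available_at = t + rng.uniform(0, 2 * mean_interval)  # redraw the wait
            return True   # e.g. a new email was waiting
        return False      # nothing yet
    return respond

check_email = variable_interval(30)
print([check_email(t) for t in range(0, 180, 15)])  # reinforced at irregular checks
```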

17
Q

Give an example of a limited hold on a fixed-ratio schedule

A

A child may be reinforced for every 2 math questions they complete, provided they are completed within a 5-minute time span

18
Q

Give an example of a limited hold on fixed- and variable-interval schedules

A

Fixed-interval: Catching the bus every day at 11 am, where the bus leaves at 11:05; there is a deadline to get on the bus in order to receive reinforcement.
Variable-interval: Waiting to eat a fruit when it is ripe; the time until it ripens changes each time, and there is only a certain window before it becomes overripe
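
The bus example can be sketched as a fixed-interval schedule with a limited hold (the 60-minute interval and 5-minute hold are assumptions): reinforcement becomes available on schedule but stays available only briefly.

```python
def fixed_interval_limited_hold(interval, hold):
    """Reinforce a response only within `hold` minutes after each interval elapses."""
    arrival = interval
    def respond(t):
        nonlocal arrival
        while t > arrival + hold:
            arrival += interval          # missed that bus entirely
        if arrival <= t <= arrival + hold:
            arrival += interval          # boarded; the next chance is the next bus
            return True
        return False                     # showed up before the bus arrived
    return respond

catch_bus = fixed_interval_limited_hold(interval=60, hold=5)
print([catch_bus(t) for t in (30, 62, 64, 121, 130)])  # [False, True, False, True, False]
```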

19
Q

What is a fixed-duration schedule? Give an example

A

Reinforcement is given if the behavior is maintained during the entire fixed period of time
For example: working for an hourly wage; you must work the entire hour to get paid

20
Q

What is a variable-duration schedule? Give an example

A

Reinforcement is given if the behavior is maintained throughout a period of time that can change in length
For example: a wildlife photographer capturing a picture of a bird must wait the entire time until the bird shows up, but the length of the wait is unpredictable
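
A final illustrative sketch of the fixed-duration case (the hourly period and wage are assumptions); a variable-duration schedule would simply let the required period change from one reinforcement to the next.

```python
def fixed_duration_pay(minutes_worked_each_hour, period=60, wage=15.0):
    """Pay the hourly wage only for hours in which work was maintained the full period."""
    return sum(wage for minutes in minutes_worked_each_hour if minutes >= period)

print(fixed_duration_pay([60, 60, 45, 60]))  # the 45-minute hour goes unpaid -> 45.0
```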