Schedules of Reinforcement Flashcards

1
Q

What is the difference between continuous and partial reinforcement?

A

Continuous reinforcement rewards every response; partial (intermittent) reinforcement rewards only some responses.
With continuous reinforcement:
1. learning is more rapid
2. the link between response and consequence is easier to perceive
3. extinction is more rapid

2
Q

Define the ratio (partial) reinforcement schedule

A

reinforcement is given after a certain number of responses

3
Q

Define the interval (partial) reinforcement schedule

A

some amount of time must elapse between reinforcements

4
Q

Define the fixed (partial) reinforcement schedule

A

reinforcement occurs after a fixed number of responses or fixed time interval

5
Q

Define the variable (partial) reinforcement schedule

A

reinforcement occurs after an average number of responses or amount of elapsed time
*individual reinforcers arrive unpredictably, so the schedule is described by its average

6
Q

define a fixed ratio schedule (FR)

A

reinforcement given after a fixed number of responses
*every X responses produces an outcome

7
Q

define a variable ratio schedule (VR)

A

reinforcement given after a variable number of responses, centered around an average
*every X responses produces an outcome, BUT X changes with each reinforcer

8
Q

define post-reinforcement pause

A

in an FR schedule, it is the pause in responding that follows each reward.
*the higher the ratio, the longer the pause after each reward

9
Q

provide examples of a fixed ratio (FR) schedule

A
  1. a rat must press a lever five times to obtain one food pellet
  2. factory workers who get paid a flat fee for every 100 pieces they turn out
  3. migrant farm workers who get paid a fixed amount for every bushel of apples picked
10
Q

provide examples of a variable ratio (VR) schedule

A
  1. gambling / slot machines
  2. video games
  3. sports
11
Q

define a fixed interval schedule (FI)

A

first correct response after a fixed time interval is reinforced
*after Y seconds, 1 response produces 1 outcome
*behavior before the interval expires has no consequence

12
Q

define a variable interval schedule (VI)

A

reinforcement given for first correct response after a variable time interval, centered around an average
*after Y seconds, 1 response produces 1 outcome, BUT Y changes after each outcome
*behavior before the interval expires has no consequence

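The four schedule definitions above (FR, VR, FI, VI) can be sketched as a small simulation. This is an illustrative sketch, not part of the original deck; the function names and the particular random distributions are my own choices:

```python
import random

def simulate_fixed_ratio(n_responses, ratio):
    """FR: every `ratio`-th response produces a reinforcer."""
    return sum(1 for i in range(1, n_responses + 1) if i % ratio == 0)

def simulate_variable_ratio(n_responses, mean_ratio, rng):
    """VR: each reinforcer requires a random number of responses,
    drawn uniformly from 1..(2*mean_ratio - 1), which averages mean_ratio."""
    rewards = 0
    needed = rng.randint(1, 2 * mean_ratio - 1)
    for _ in range(n_responses):
        needed -= 1
        if needed == 0:
            rewards += 1
            needed = rng.randint(1, 2 * mean_ratio - 1)
    return rewards

def simulate_fixed_interval(response_times, interval):
    """FI: the first response after `interval` has elapsed is reinforced;
    the timer then restarts at that reinforced response."""
    rewards = 0
    next_available = interval
    for t in sorted(response_times):
        if t >= next_available:
            rewards += 1
            next_available = t + interval
    return rewards

def simulate_variable_interval(response_times, mean_interval, rng):
    """VI: like FI, but each interval is drawn uniformly from
    0..(2*mean_interval), which averages mean_interval."""
    rewards = 0
    next_available = rng.uniform(0, 2 * mean_interval)
    for t in sorted(response_times):
        if t >= next_available:
            rewards += 1
            next_available = t + rng.uniform(0, 2 * mean_interval)
    return rewards
```

For example, `simulate_fixed_ratio(100, 5)` returns 20 reinforcers, and responding faster earns more; the interval simulators return roughly one reinforcer per elapsed interval no matter how fast responding is, which is the contrast card 20 explains.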
13
Q

in an FR schedule, what is the behavior?

A

steady responding until reinforcement

14
Q

in a VR schedule, what is the behavior?

A

constant and high rate of responding

15
Q

in an FI schedule, what is the behavior?

A
  1. at beginning of interval, little/no responding
  2. increases to rapid rate of responding before interval expiration
16
Q

in a VI schedule, what is the behavior?

A

steady but low rate of responding

17
Q

provide an example of a fixed interval (FI) schedule

A
  1. a student in detention from 3:00 to 4:00: after arriving, there is little point in checking the watch for the first 15-20 minutes; it becomes worth checking once an estimated 30-40 minutes have passed, in case time is flying by faster than it seems; as the hour mark approaches, the student checks the watch more and more frequently, not wanting to stay a moment longer than necessary
  2. watching the clock for an appointment
18
Q

provide an example of a variable interval (VI) schedule

A

checking for email

19
Q

define the matching law of choice behavior

A

when two schedules run concurrently, the rate of responding on each tends to match the rate of reinforcement that schedule provides
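Card 19's definition is usually written formally as Herrnstein's matching law (standard notation; the symbols below are not from this deck):

```latex
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
```

where B1 and B2 are the response rates on the two concurrent schedules and R1 and R2 are their reinforcement rates. For example, if schedule 1 delivers 40 reinforcers per hour and schedule 2 delivers 20, matching predicts that 40/(40+20) = 2/3 of responses go to schedule 1.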

20
Q

Why are VR and VI so different?

A

VR: more responses = more reinforcers (got to play to win!)
VI: more responses does not equal more reinforcers (only need to check in)

21
Q

Rank the 4 schedules from hardest to extinguish to easiest

A

Variable ratio
Variable interval
Fixed interval
Fixed ratio