Reinforcement and Schedules of Reinforcement Flashcards

1
Q

What is operant conditioning?

A

A type of learning in which the future probability of a behaviour is affected by its consequences

2
Q

What are the three main components of operant conditioning?

A

Antecedent cues (A), behaviour (B), and consequences (C).

3
Q

In what order do A, B, and C occur?

A

Antecedent cues (A) come before a behaviour (B); the consequences (C) occur after the behaviour.

4
Q

What does the arrow between B and C indicate?

A

The arrow between B and C indicates that the behaviour causes the consequences.

5
Q

What does the colon between A and B demonstrate?

A

The colon between A and B indicates that antecedent cues do not cause behaviour but merely set the occasion for it.

6
Q

During operant conditioning, what do the consequences of behaviour influence?

A

(1) Frequency of the behaviour in the future
(2) Ability of future antecedent cues to set the occasion for the behaviour.
This means that C influences both A and B.

7
Q

What are the prime movers of operant conditioning?

A

The consequences following an operant behaviour

8
Q

What are reinforcers?

A

Consequences used to strengthen a behaviour

9
Q

How do we make a behaviour recur in the future?

A

To cause a behaviour (B) to be repeated in the future, a reinforcer must be immediately contingent (dependent) on the execution of the behaviour.

10
Q

What are the four types of contingency?

A

Positive punishment
Positive reinforcement
Negative punishment
Negative reinforcement

11
Q

What is reinforcement?

A

Reinforcement is defined as the procedure by which certain consequences strengthen a behaviour.

12
Q

What is the difference between a reinforcer and a reinforcement?

A

The term reinforcer refers to the actual consequence of a behaviour; the term reinforcement refers to the process or procedure of strengthening a behaviour by instituting this consequence.

13
Q

What is positive reinforcement?

A

A behaviour-strengthening procedure in which the occurrence of a behaviour is followed by the presentation of a stimulus that is usually considered pleasant or rewarding.
E.g. the use of food to increase the strength of lever pressing in rats.

14
Q

What is Negative reinforcement?

A

A behaviour-strengthening procedure in which a stimulus that is usually considered unpleasant or aversive is removed or omitted if the behaviour occurs.
E.g. if fastening one’s seatbelt terminates the loud buzz that occurs in some cars when the ignition is turned on, this behaviour will be strengthened.

15
Q

What is a reinforcement schedule?

A

A rule that states under what conditions a reinforcer will be delivered.

16
Q

What is continuous reinforcement?

A

A schedule in which every occurrence of the operant response is followed by a reinforcer. This schedule is called continuous reinforcement (CRF).

17
Q

What is partial (or intermittent) reinforcement?

A

A schedule in which only some occurrences of the response are reinforced.

18
Q

Under continuous reinforcement, what should happen every time a rat hits the bar?

A

A food pellet should be delivered

19
Q

Under intermittent (or partial) reinforcement schedules, the rat:

A

(1) May be required to hit the bar 50 times to get the pellet
(2) Is reinforced only once every five minutes
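The contrast between continuous and ratio-based intermittent reinforcement can be written as a simple rule deciding whether a given bar press is reinforced. This is a minimal sketch; the function names are hypothetical:

```python
# Minimal sketch (hypothetical names): is a given bar press reinforced?

def crf(press_number):
    """Continuous reinforcement (CRF): every response is reinforced."""
    return True

def fr50(press_number):
    """Fixed ratio 50: only every 50th response is reinforced."""
    return press_number % 50 == 0

# Over 100 bar presses, CRF delivers a pellet for every press,
# while FR-50 delivers pellets only at presses 50 and 100.
pellets_crf = sum(crf(i) for i in range(1, 101))
pellets_fr = sum(fr50(i) for i in range(1, 101))
```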

20
Q

What is the cumulative recorder?

A

A classic device that provides an easily read, graphic depiction of changes in the organism’s rate of response over time.

21
Q

What are the four simple reinforcement schedules?

A

Fixed ratio, variable ratio, fixed interval, variable interval

22
Q

What is a fixed ratio (FR) schedule?

A

A reinforcer is delivered after every n responses, where n is the size of the ratio. Responding exhibits a ‘stop-and-go’ pattern: a high rate of responding along with a short pause following the attainment of each reinforcer (the post-reinforcement pause). Eventually this pause gives way to an abrupt continuation of responding.
E.g. the “piecework” method used to pay factory workers in some companies.

23
Q

What is a variable ratio (VR) schedule?

A

The number of required responses is not constant from reinforcer to reinforcer. On average, a subject will receive one reinforcer for every n responses, but the exact number of responses required at any moment may vary widely. The pattern of responding might be described as rapid and fairly steady.

With VR schedules of modest length, regular post-reinforcement pauses are found. Yet pauses on VR schedules are several times smaller than those found on FR schedules.

24
Q

Why are many forms of gambling examples of VR schedules?

A

Many forms of gambling are examples of VR schedules:

(1) A person’s chances of winning are directly proportional to the number of times the person plays
(2) the number of responses required for the next reinforcer is uncertain

25
Q

What is a fixed interval (FI) schedule?

A

In all interval schedules, the presentation of a reinforcer depends both on the subject’s behaviour and on the passage of time. The first response after a fixed amount of time has elapsed is reinforced: after a fixed period of time has elapsed a reinforcer is “stored”, and the next response will produce the reinforcer.

As on FR schedules, there is a post-reinforcement pause, but after this pause the subject starts by responding quite slowly. As the interval progresses, the animal responds more and more rapidly, and just before reinforcement the response rate is quite rapid.

26
Q

What is an example of a fixed interval schedule?

A

Example: imagine that as you are walking to the bus stop you see a bus leave. You sit and wait for the next bus, which you know will arrive in another 20 minutes, but you don’t have a watch.
The operant response is looking down the street for the next bus, and the reinforcer for this response is simply the sight of the next bus.

27
Q

What is a variable interval (VI) schedule?

A

The first response to occur after a reinforcer is stored collects that reinforcer, and the clock does not start again until the reinforcer is collected. The amount of time that must pass before a reinforcer is stored varies unpredictably from reinforcer to reinforcer. VI schedules typically produce a steady, moderate response rate, often with little or no post-reinforcement pause.

28
Q

What is an example of a variable interval?

A

If you need to contact your lecturer with a last-minute assignment question and you know that she always arrives in her office between 12:00 pm and 12:30 pm (I know!), a good strategy would be to phone every few minutes.
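The four simple schedules above can each be written as a rule that decides whether a given response is reinforced. The sketch below uses hypothetical helper names; the ratio rules depend only on the response count, while the interval rules also take the current time, matching the point that interval schedules depend both on behaviour and on the passage of time:

```python
import random

def fixed_ratio(n):
    """FR-n: a reinforcer is delivered after every n responses."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count >= n:
            count = 0
            return True  # reinforcer delivered
        return False
    return respond

def variable_ratio(mean_n, rng=random):
    """VR: the required number of responses varies from reinforcer to
    reinforcer, averaging mean_n."""
    count, required = 0, rng.randint(1, 2 * mean_n - 1)
    def respond():
        nonlocal count, required
        count += 1
        if count >= required:
            count, required = 0, rng.randint(1, 2 * mean_n - 1)
            return True
        return False
    return respond

def fixed_interval(interval):
    """FI: the first response after `interval` time units is reinforced."""
    last = 0.0
    def respond(t):
        nonlocal last
        if t - last >= interval:  # a reinforcer has been 'stored'
            last = t              # this response collects it
            return True
        return False
    return respond

def variable_interval(mean_wait, rng=random):
    """VI: like FI, but the wait before a reinforcer is stored varies
    unpredictably; the clock restarts only when it is collected."""
    last, wait = 0.0, rng.uniform(0, 2 * mean_wait)
    def respond(t):
        nonlocal last, wait
        if t - last >= wait:
            last, wait = t, rng.uniform(0, 2 * mean_wait)
            return True
        return False
    return respond

# FR-5: only every fifth bar press produces a pellet.
fr5 = fixed_ratio(5)
presses = [fr5() for _ in range(10)]

# FI-20: the 'bus' is available only once 20 minutes have elapsed, so
# looking at minutes 5, 15, 21, 22 and 45 is reinforced only at 21 and 45.
fi20 = fixed_interval(20)
looks = [fi20(t) for t in (5, 15, 21, 22, 45)]
```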

29
Q

What is the most important factor influencing resistance to extinction?

A

The schedule of reinforcement

30
Q

When is extinction more rapid?

A

After continuous reinforcement than after a schedule of intermittent or partial reinforcement.

31
Q

What is the partial reinforcement effect?

A

The finding that partial reinforcement during training increases responding during extinction.

32
Q

What was the study by Lewis and Duncan?

A

College students were given the opportunity to play a slot machine. They were told they could play as long as they wished and that each time they won they would earn five cents.
The percentage of reinforcement was varied across groups in the first phase; in the second phase reinforcement was discontinued and the experimenters monitored how long subjects continued to play.

33
Q

What were the results of the study by Lewis and Duncan?

A

The lower the percentage of reinforcement college students received for playing a slot machine during training, the longer they persisted in playing during extinction

34
Q

Whether a subject performs a response to obtain a reinforcer depends on…?

A
  1. The belief that the response would produce the reinforcer (learning)
  2. Whether the subject actually wants the reinforcer (motivation)