Instrumental Conditioning Flashcards

1
Q

Instrumental Conditioning:

A

the learning of a contingency between behaviour and consequence

2
Q

instrumental conditioning involves explicit training between

A

voluntary behaviours and their consequences

3
Q

a specific behaviour leads to

A

a specific consequence

4
Q

“stamping in” and “stamping out” determine

A

whether a behaviour is maintained or eliminated, respectively

5
Q

In Edward L. Thorndike's puzzle box, what behaviours were stamped in?

A

Behaviours like rope pulling were stamped in because they were followed by the favourable consequence of access to food

6
Q

In Edward L. Thorndike's puzzle box, what behaviours were stamped out?

A

random behaviours, like turning in a circle, were stamped out

7
Q

In Thorndike's puzzle box, what do “stamping in” and “stamping out” lead to?

A

this process leads to refinement, and the cat learns the contingency between the specific behaviour of rope pulling and the specific consequence of the food reward

8
Q

Law of Effect:

A

Behaviours with positive consequences are stamped in and produced more frequently

Behaviours with negative consequences are stamped out and produced less frequently

9
Q

4 different types of instrumental conditioning

A
  1. Presenting a positive reinforcer
  2. Removing a positive reinforcer
  3. Presenting a negative reinforcer
  4. Removing a negative reinforcer
10
Q

Reward Training

A

the presentation of a positive reinforcer following a response, which increases the frequency of the behaviour

11
Q

An example of reward training

A

if you present your puppy with a treat every time he sits on command, the behaviour is likely to increase

12
Q

Punishment Training:

A

the presentation of a negative reinforcer following a response, which decreases the frequency of the behaviour

13
Q

An example of punishment training is:

A

if little Billy teases his sister, and his mother tugs his ear and scolds him, he will likely decrease the behaviour

14
Q

Omission Training

A

removing a positive reinforcer following a response, which decreases the frequency of the behaviour

15
Q

An Example of omission training is:

A

little Billy is doing his 2 favourite things: watching his favourite TV show and teasing his sister

  • Billy’s mom wants to eliminate the teasing behaviour
  • she decides to turn off the TV for 30 seconds every time Billy teases Sally
  • access to the TV show is a positive reinforcer, and removing it will likely cause Billy to stop his teasing behaviour
16
Q

Escape Training:

A

removing a negative reinforcer following a response, which increases the frequency of the behaviour

a constant negative reinforcer is presented that the learner is motivated to have removed

17
Q

An example of escape training is:

A

ex. the floor of one side of a rat's cage delivers a constant mild electric shock, which the rat can escape by moving to the opposite side of the cage

18
Q

Instrumental conditioning proceeds best when (timing)

A

the consequence immediately follows the response

19
Q

Acquisition

A

when an organism learns the contingency between a response and its consequence

20
Q

autoshaping is:

A

for simple behaviours, which can be learned without the careful guidance of the researcher

21
Q

shaping by successive approximation:

A

the complex behaviour can be organized into smaller steps which gradually build up to the full response that we hope to condition

22
Q

Discriminative Stimulus (SD/S+):

A

signals when a contingency between a particular response and reinforcement is in effect

23
Q

SΔ (S-):

A

a cue which indicates when the contingent relationship is not valid

24
Q

An example of Discriminative Stimulus (SD/S+):

A

ex. the environment of the child's parents' home becomes an SD for the response of vegetable-eating behaviour, which is reinforced with access to a dessert reward

25
Q

An example of SΔ (S-):

A

ex. the environment of the grandparents' house becomes an SΔ for the response of vegetable eating, because the behaviour is not reinforced with dessert there

26
Q

SD (S+) and SΔ (S-)

A

are cues that predict whether or not the contingent relationship is valid

27
Q

In contrast to classical conditioning, where the CS is paired with a US to elicit a response, the SD itself does not

A

elicit the response… the SD sets the occasion for a response by signalling when the response-reinforcer relationship is valid

28
Q

Continuous Reinforcement:

A

a response leads to a reinforcer on every trial (as in all the above examples)

29
Q

Partial Reinforcement:

A

reinforcement is not delivered after every response; delivery can be determined by either the total number of responses made or the time elapsed

30
Q

4 basic schedules of reinforcement

A
Fixed Ratio (FR-#)
Variable Ratio (VR-#)
Fixed Interval (FI-#)
Variable Interval (VI-#)
31
Q

Ratio Schedules

- Fixed Ratio or Variable Ratio

A

based on the # of responses made by the subject, which determines when reinforcement is given

32
Q

Interval Schedules

Fixed Interval or Variable Interval

A

based on the time since the last response that was reinforced

33
Q

FI-10 Min Schedule

A

ex. a pigeon on an FI-10 min schedule is rewarded with food for the first pecking response after a 10-minute period
- over an hour, the pigeon has the potential to earn only 6 food pellets (60 min / 10 min = 6)

34
Q

Fixed Ratio

A

A fixed-ratio schedule of reinforcement means that reinforcement should be delivered after a constant or “fixed” number of correct responses

subjects on a fixed-ratio schedule display a “pause and run” pattern

35
Q

Variable-Ratio Schedule (VR)

A

When using a variable-ratio (VR) schedule of reinforcement, the delivery of reinforcement will “vary” but must average out at a specific number.

  • ex. the reinforcement you may receive by playing a slot machine in a casino
36
Q

Fixed-Interval Schedule (FI)

A

A fixed-interval schedule means that reinforcement becomes available after a specific period of time.

37
Q

Variable-Interval Schedule (VI)

A

The variable-interval (VI) schedule of reinforcement means the time periods that must pass before reinforcement becomes available will “vary” but must average out at a specific time interval.
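
Note: to make the four schedules above concrete, here is a minimal sketch in Python of how an experiment program might decide when to deliver reinforcement under each schedule. The class and method names (FixedRatio, VariableRatio, FixedInterval, VariableInterval, record) are hypothetical, invented purely for illustration; this is not from the flashcards or any standard library.

import random

class FixedRatio:
    """FR-N: reinforce after every N responses."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def record(self, responded, seconds=0):
        if responded:
            self.count += 1
            if self.count >= self.n:
                self.count = 0
                return True   # deliver the reinforcer
        return False

class VariableRatio:
    """VR-N: reinforce after a varying number of responses that averages N."""
    def __init__(self, n):
        self.n, self.count = n, 0
        self.needed = random.randint(1, 2 * n - 1)
    def record(self, responded, seconds=0):
        if responded:
            self.count += 1
            if self.count >= self.needed:
                self.count = 0
                self.needed = random.randint(1, 2 * self.n - 1)
                return True
        return False

class FixedInterval:
    """FI-T: reinforce the first response made after T seconds have elapsed."""
    def __init__(self, t):
        self.t, self.elapsed = t, 0.0
    def record(self, responded, seconds=0):
        self.elapsed += seconds
        if responded and self.elapsed >= self.t:
            self.elapsed = 0.0
            return True
        return False

class VariableInterval:
    """VI-T: like FI, but the required wait varies around an average of T seconds."""
    def __init__(self, t):
        self.t, self.elapsed = t, 0.0
        self.needed = random.uniform(0, 2 * t)
    def record(self, responded, seconds=0):
        self.elapsed += seconds
        if responded and self.elapsed >= self.needed:
            self.elapsed = 0.0
            self.needed = random.uniform(0, 2 * self.t)
            return True
        return False

# Usage sketch: a pigeon pecking once per second on an FI-600 s (FI-10 min)
# schedule can earn at most 6 reinforcers per hour, since 3600 s / 600 s = 6.
fi = FixedInterval(600)
earned = sum(fi.record(responded=True, seconds=1) for _ in range(3600))
print(earned)  # -> 6

The ratio classes count responses and ignore elapsed time, while the interval classes track elapsed time and ignore the response count, mirroring the distinction between ratio and interval schedules in cards 31 and 32.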

38
Q

the slope of a variable-ratio schedule's cumulative record will look like…

A

may look like a diagonal line with no pauses in between

- reflects the average # of responses required before reinforcement is delivered

39
Q

An example of Fixed-Interval Schedule (FI)

A

a course with weekly quizzes
- this will mean that study behaviour responses will start ramping up just before the quiz

40
Q

Fixed-Interval Schedule (FI) will look like…

A

following reinforcement, there is a lull period in which responding drops, then slowly starts picking up again, peaking just before the next reinforcement is scheduled to be delivered following a response

41
Q

Variable-Interval Schedule (VI) will look like…

A

is shown as a relatively straight line on the cumulative record, reflecting a steady rate of responding