Lecture 5 Flashcards

1
Q

The response requirement that must be met to obtain reinforcement.

What is this?

A

Schedule of reinforcement

2
Q

What type of reinforcement is this?

Each individual response is reinforced; each lever press is rewarded.

A

Continuous reinforcement schedule

3
Q

What type of reinforcement schedule lies between continuous reinforcement and extinction?

E.g. A slot machine where you only win some of the time

A

Intermittent

4
Q

What type of reinforcement schedule doesn’t offer reinforcements?

What is the consequence of it?

A

Extinction; it decreases the likelihood of the behavior.

5
Q

What are the two kinds of intermittent reinforcement schedules?

Give examples with a slot machine.

A

Interval schedules = When (time)

E.g. a slot machine that pays every 10 mins

Ratio schedules = How often (number)

E.g. A slot machine that pays every 100 pulls

6
Q

What is a fixed ratio reinforcement schedule? What is the formula?
What does this type of schedule produce in terms of responding?

What do higher ratios produce?

Give an example.

5pts

A
  • A fixed number of responses is required for each reinforcement
  • FRn
    n = number of responses required
  • Produces rapid rates of responding
  • Higher ratios lead to longer post-reinforcement pauses

Ex- A kid gets paid $1 for every 5 newspapers delivered (FR5)
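The FR contingency on this card reduces to integer division over the response count; a minimal sketch (the `fixed_ratio` helper is my own name, not from the lecture):

```python
# FRn: a fixed number of responses (n) is required for each reinforcement.
def fixed_ratio(n, responses):
    """Reinforcers earned after `responses` responses on an FRn schedule."""
    return responses // n

# The newspaper example: $1 for every 5 papers delivered (FR5).
print(fixed_ratio(5, 23))  # 4 dollars earned; 3 papers count toward the next $1
```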

7
Q

What is a variable ratio schedule?

What does this type of schedule produce in terms of responding?

Is there a post-reinforcement pause?

Give an example.

5pts

A
  • A varying, unpredictable number of responses is required for each reinforcement
  • Reinforcement depends on an average number of responses
  • Rate of response is high and steady
  • Little to no post-reinforcement pause
  • Ex- Dating apps and gambling
  • On a variable ratio 10 (VR10) schedule, a rat has to make an average of 10 lever presses for each food pellet
8
Q

Reinforcement is contingent upon the
first response after a fixed, predictable
period of time.

What type of reinforcement schedule is this?

Give an example.

A

Fixed interval reinforcement schedule.

FI-60 mins: a patient on a morphine drip will only receive morphine when pressing the dispenser after 60 mins have elapsed.

9
Q

Reinforcement is contingent upon
the first response after a varying,
unpredictable (average) period of
time.

Produces a moderate, steady rate of response with little or no post-reinforcement pause.

What type of reinforcement schedule is this?

Give an example.

A

Variable interval schedules

Ex:
Trial 1: Lever press, 5 sec pause, reward

Trial 2: Lever press, 12 sec pause

Trial 3: Lever press, 7 sec pause, reward etc.

10
Q

Reinforcement is contingent on performing a
behavior continuously throughout a period of time.

What type of schedule is this?

A

Duration schedule

11
Q

What is the difference between duration and interval schedules?

A

Duration: Requires continuous responding

Interval schedules: Requires a certain amount of time to pass before the response is rewarded

12
Q

What is differential reinforcement of high and low rates (otherwise known as response rate schedules)?

What does it depend on?

Give examples.

A
  • Depends on the organism's rate of response
  • High rate: a high rate of response is reinforced
    Ex- winning a race
  • Low rate: a low rate of response is reinforced
    Ex- being praised for eating your food slowly
13
Q
  • The reinforcer is delivered independently of any response.
  • A response is not required for the reinforcer to be
    obtained.
  • ‘Free’ reinforcer

What type of schedule is this? What are the 2 types?

Give an example.

A

Non-contingent schedule → Fixed time and variable time

  • FT (fixed time): a reinforcer is delivered after a fixed period of time regardless of behavior
    Ex: Receiving a gift on your birthday every year
  • VT (variable time): a reinforcer is delivered after a varying, unpredictable period of time regardless of behavior
14
Q

Non-contingent reinforcement may account for
some forms of…

  • Behaviors may be accidentally reinforced by the
    coincidental presentation of reinforcement

What type of behavior is this?

A

Superstitious behavior

15
Q

A fox runs after a rabbit in the hopes of securing
his dinner. However, he doesn’t always succeed.

What type of reinforcement schedule is involved?

A

Response rate with variable ratio

16
Q

You have to sit in the waiting room for approximately
60 minutes before your doctor’s appointment.

a. Intermittent VI60
b. Intermittent VR60
c. Continuous
d. Response rate

A

a. intermittent VI60

17
Q

Forever 21 has its annual sale at the same time every year:

(A) Fixed ratio
(B) Fixed interval
(C) Variable ratio
(D) Variable interval

A

(B) Fixed interval

18
Q

Overview:

What are the 4 intermittent schedules?

What are the 3 simple schedules?

What are the 3 complex schedules?

A

Fixed interval, variable interval, fixed ratio, variable ratio

Duration, response rate, non-contingent

Conjunctive, adjusting, chained
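The interval schedules in this overview can be sketched in code: reinforcement is contingent on the first response after some time has elapsed, fixed for FI and varying around an average for VI (class names are my own, assumed for illustration):

```python
import random

class FixedInterval:
    """FI: reinforce the first response after a fixed period of time."""
    def __init__(self, seconds):
        self.seconds = seconds
        self.last_reinforcement = 0.0

    def respond(self, now):
        # Only the first response after the interval elapses is reinforced.
        if now - self.last_reinforcement >= self.seconds:
            self.last_reinforcement = now
            return True
        return False

class VariableInterval:
    """VI: reinforce the first response after an unpredictable period
    of time that averages out to `mean_seconds`."""
    def __init__(self, mean_seconds, rng=None):
        self.mean = mean_seconds
        self.rng = rng or random.Random(0)
        self.last_reinforcement = 0.0
        self.wait = self.rng.uniform(0, 2 * self.mean)  # averages mean_seconds

    def respond(self, now):
        if now - self.last_reinforcement >= self.wait:
            self.last_reinforcement = now
            self.wait = self.rng.uniform(0, 2 * self.mean)
            return True
        return False

# The morphine-drip card: FI 60 min. Presses before 60 min go unreinforced.
drip = FixedInterval(60)
print([drip.respond(t) for t in (10, 30, 59, 60, 61)])  # [False, False, False, True, False]
```

Ratio schedules would instead count responses rather than elapsed time.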

19
Q

A type of complex schedule in which
the requirements of two or more
simple schedules must be met before
a reinforcer is delivered.

What type of schedule is this?

What types of schedules does it require?

Give an example.

3pts

A

Conjunctive schedule

  • Requires both a ratio and an interval schedule

In order to get paid: you have to work a certain number of hours a week (FD-40hrs) and you have to deliver a certain number of papers (FR 50).
20
Q
  • A sequence of two or more simple schedules, each of which has its own SD and the last of which results in a terminal reinforcer.
  • Must be completed in a particular order

What type of complex schedule is this?

Give an example.

A

Chained schedule

Paying for items at the grocery store:

  1. Line up at the cash
     → variable interval
  2. Wait for the prompt on the debit machine
     → fixed interval
21
Q

An increase in the strength and/or efficiency of responding as one draws near to the goal.

What concept is this?

A

Goal gradient effect

22
Q
  • Training the final link first and the initial link last,
    in order to make the chain more effective.
  • The sight of each stimulus is both a secondary
    reinforcer for the previous behavior and a
    discriminative stimulus for the next behavior.

What type of chaining is this?

A

Backward chaining

23
Q

An event is reinforcing to the extent that it is
associated with a reduction in some type of
physiological drive.

What theory is this?

What is a limitation?

A

Drive reduction theory

Not all behaviors appear to be associated with a reduction in a physiological drive.

24
Q

What is the Premack Principle?

What does it focus on?

Give an example.

3pts

A

A high-probability behavior (the reinforcer) can be used to reinforce a low-probability behavior.

  • Focus is on how a reinforcer can increase the future likelihood of a behavior

Example: Lever pressing (low probability behavior) reinforced by eating food (high probability behavior)

25
Q

What is the response deprivation hypothesis?

Explain what is happening in this example:

  • If given free access, Noah might play on his iPhone for 3 hrs per night.
  • If his access to his phone is restricted to only 15 minutes per day, he will be unable to reach his preferred level.

4pts

A

A behavior can serve as a reinforcer when:

  1. Access to the behavior is restricted, and
  2. Its frequency falls below the preferred level of occurrence

  • Noah will be in a state of deprivation with regard to playing on his phone
  • He will now be willing to work to obtain additional time on his phone
26
Q

For variable intervals/ratios, what does the word variable imply?

A
  • An approximation; an average
27
Q

What does an adjusting schedule depend on?

A

The organism's performance; the response requirement is adjusted based on how the organism performed previously.