Exam 2 Flashcards

1
Q

Direct effect of schedule

A

The contingency that the schedule specifies, e.g., one lever press produces food: behavior x must occur for a reinforcer to be produced.

2
Q

Side effect of schedule

A

A characteristic of behavior that is not necessary for the reinforcer to be produced.

3
Q

Response based schedule

A

Only x responses are required for reinforcement, regardless of how much time passes (fixed- and variable-ratio schedules).

4
Q

Higher ‘n’ in FR leads to…

A

A higher response rate and a longer post-reinforcement pause (break-and-run pattern).

5
Q

VR 30

A

A variable-ratio schedule with a mean n of 30 (on average, 30 responses per reinforcer).

6
Q

Direct and side effects of VR

A

Direct: on average, n responses must occur to produce the reinforcer. Side: a constant, high rate of responding.

7
Q

Fixed interval

A

Once a certain time has passed, the first response that occurs is reinforced.

8
Q

FI 3’

A

A fixed-interval schedule in which the first response after three minutes is reinforced.

9
Q

Direct and side effects of FI

A

Direct: the first response after the interval has elapsed is reinforced. Side effect: peak-based (scalloped) pattern -

  • Little responding at the beginning of the interval
  • Lots of responding at the end of the interval

Example: checking your watch while waiting for a bus

10
Q

Side effect of an FI

A

Scallop

11
Q

What happens with larger VR values?

A

The response rate increases.

12
Q

Variable Interval

A

Direct effect: the first response after a varied amount of time is reinforced.

Side effect: low, constant response rate.

Example: pop quizzes
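
(Not from the lecture: a minimal Python sketch of the direct effect of each basic schedule, i.e., the rule that decides whether a given response is reinforced. It treats each time step as one response; all function names and parameter values are made up for illustration.)

    import random

    def simulate(schedule, n_responses=1000):
        """Count reinforcers earned for a steady stream of responses, one per time step."""
        return sum(schedule(t) for t in range(n_responses))

    def make_fr(n):
        state = {"count": 0}
        def fr(t):
            state["count"] += 1
            if state["count"] >= n:                  # every nth response is reinforced
                state["count"] = 0
                return True
            return False
        return fr

    def make_vr(mean_n):
        state = {"count": 0, "target": random.randint(1, 2 * mean_n - 1)}
        def vr(t):
            state["count"] += 1
            if state["count"] >= state["target"]:    # on average, every mean_n-th response
                state["count"] = 0
                state["target"] = random.randint(1, 2 * mean_n - 1)
                return True
            return False
        return vr

    def make_fi(interval):
        state = {"available_at": interval}
        def fi(t):
            if t >= state["available_at"]:           # first response after the fixed interval
                state["available_at"] = t + interval
                return True
            return False
        return fi

    def make_vi(mean_interval):
        state = {"available_at": random.randint(1, 2 * mean_interval - 1)}
        def vi(t):
            if t >= state["available_at"]:           # first response after a varied interval
                state["available_at"] = t + random.randint(1, 2 * mean_interval - 1)
                return True
            return False
        return vi

    print(simulate(make_fr(10)), simulate(make_vr(10)),
          simulate(make_fi(10)), simulate(make_vi(10)))

With these parameters, all four earn roughly one reinforcer per ten responses; what differs is which response pays off, and that difference is what produces the side-effect patterns described on the cards above.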

13
Q

Order of schedules from steepest to shallowest (response rate)

A

Most to least steep:

  1. VR
  2. FR
  3. FI
  4. VI
14
Q

What does a longer VI do?

A

Lowers the response rate

15
Q

Fixed time

A

Not to be confused with FI: FT is time-based only - the reinforcer is delivered regardless of responding.

Direct effect - none (no response is required).
Side effects - adjunctive behavior and a decrease in the behavior that was previously reinforced.

16
Q

Non-contingent reinforcement

A

NCR - the same as fixed time: reinforcement is delivered independent of any response.

17
Q

Differential reinforcement of high rate (DRH)

A

x responses must occur within time t. Used to increase a behavior and make it more rapid. Examples: encouraging a kid to eat more quickly, or needing to talk quickly to get your thought across.
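
(Illustrative only, not from the lecture: a small Python sketch of the DRH contingency - reinforce when x responses have occurred within the last t seconds. Names and numbers are made up.)

    from collections import deque

    def make_drh(x, t):
        times = deque()
        def respond(now):
            times.append(now)
            while now - times[0] > t:      # keep only responses from the last t seconds
                times.popleft()
            if len(times) >= x:            # x responses within t seconds -> reinforcer
                times.clear()
                return True
            return False
        return respond

    drh = make_drh(x=5, t=10)
    print([drh(now) for now in (0, 1, 2, 3, 4)])   # the fifth rapid response is reinforced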

18
Q

Differential reinforcement of low rates (DRL)

A

IRT > some time t: responding before t has elapsed resets the clock. Used to decrease behaviors but not completely eliminate them (e.g., rapid eating).
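
(Again illustrative, not from the lecture: the DRL rule in the same sketch style - a response is reinforced only if the IRT since the previous response exceeds t, and every response restarts that clock.)

    def make_drl(t):
        last = None
        def respond(now):
            nonlocal last
            reinforced = last is not None and (now - last) > t
            last = now                     # every response resets the IRT timer
            return reinforced
        return respond

    drl = make_drl(t=5)
    print([drl(now) for now in (0, 2, 9, 11, 20)])  # only the responses at 9 and 20 pay off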

19
Q

Inter-response time (IRT)

A

Time between responses

20
Q

Differential reinforcement of other behavior (DRO)

A

A behavior incompatible with x (e.g., lifting weights instead of hitting - something that cannot be done at the same time as x) must occur for the reinforcer to be delivered. Used to decrease a behavior completely (e.g., aggression).
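
(A toy sketch of the arrangement as described on this card, not from the lecture: the reinforcer follows the incompatible behavior and never the target behavior, so the target behavior is left to extinguish. Behavior names are made up.)

    def dro(events, target="hitting", incompatible="lifting_weights"):
        """Return which events earn the reinforcer under the rule above."""
        return [e == incompatible for e in events]

    print(dro(["hitting", "lifting_weights", "hitting", "lifting_weights"]))
    # [False, True, False, True] - aggression goes unreinforced, the incompatible behavior pays off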

21
Q

Multiple vs concurrent vs conjoint schedules

A

Multiple - a series of schedules presented one after another (in series rather than in parallel).

Concurrent - schedules that run in parallel; the organism can choose among them for reinforcement, i.e., the schedules compete with each other.

Conjoint - an overall schedule that is composed of other schedules.

22
Q

Extinction burst

A

The temporary burst of responding that happens right after reinforcement stops.

23
Q

Extinction effect

A
  1. Extinction burst
  2. Variability in behavior and topography (qualitative features, such as pounding vs. touching lightly) increases
  3. Force of the previously reinforced operant increases

Together these can look like simple aggression, but each is actually a separate aspect of behavior.

  4. Behavior under extinction eventually decreases
24
Q

Is it true that the more consistently you reinforce behavior, the stronger it becomes?

A

No; varied (intermittent) reinforcement creates behavior that is harder to extinguish.

25
Q

Partial reinforcement effect

A

Inconsistently reinforced behavior is harder to extinguish.

26
Q

Aversive

A

Subjective: uncomfortable, painful. Determined by self-report.

Objective: something that is avoided (behaviorally determined).

27
Q

Natural aversive

A

The first time you come into contact with it, you have an aversive response, e.g., touching a hot stove. (A good litmus test: would a baby respond to it?)

  • Harmful
  • “Annoying”
  • Thresholds are individual (they vary from person to person)

Not:

  • Spiders
  • Learned aversives
28
Q

What do thresholds of aversives depend on?

A
  • The individual
  • Experience (e.g., prenatal experience)

29
Q

Habituation

A

A decrease in responding due to repeated presentation of a stimulus

30
Q

Sensitization

A

An increase in responding due to repeated presentation of a stimulus

31
Q

What can change responses to aversives?

A

Habituation, sensitization, and reinforced self-reports (e.g., parents giving a child attention when they fall)

32
Q

Learned aversives

A

A neutral stimulus is paired with a naturally aversive stimulus and becomes a conditioned aversive stimulus.

  • The sight of an annoying neighbor
  • Little kids at the movies (just the sight of them)
33
Q

Punishment

A

A consequence that decreases behavior

34
Q

Requirements for punishment

A

A response has to occur (so there must have been earlier reinforcement). Therefore, every punishment schedule is competing against a reinforcement schedule.

35
Q

When is a punisher most effective

A
  • Immediate - delayed punishers are less effective
  • Intense - if introduced gradually in intensity, behavior becomes “tolerant”; punishment is more effective at full intensity.
  • Continuous schedule of punishment is more effective than intermittent schedule
  • The effect of the competing reinforcer is reduced
  • Verbally specified (for humans)
  • An alternative, appropriate way to obtain the reinforcer that is maintaining the behavior is available
36
Q

When is punishment less effective

A
  • When there are competing reinforcing contingencies
37
Q

Two types of negative reinforcement

A
  1. Escape: behavior results in the removal of an aversive stimulus (S-) that is already present (e.g., hitting snooze, taking pain-relieving meds, feeding a whining dog, a roommate doing the dishes to stop the complaining)
  2. Avoidance: behavior prevents the aversive stimulus from occurring (there is no occasioning stimulus and no removal of that stimulus); the response postpones an aversive event. Examples:
    • Taking a daily preventative medication right before the dose wears off
    • Brushing teeth to avoid getting cavities
38
Q

Sidman avoidance

A

The dog can avoid every shock even though there is no CS (no warning signal).

39
Q

What does a short time between response and shock (a short response-shock interval) cause?

A

Higher rate of jumping
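
(Not from the lecture - a rough Python sketch of a Sidman/free-operant avoidance procedure, with made-up parameter values: there is no warning stimulus, each response postpones the next shock to response time + R-S interval, and after a shock the next one is due S-S interval later.)

    def count_shocks(response_times, rs_interval, ss_interval, session_end):
        next_shock, shocks = ss_interval, 0
        for r in sorted(response_times):
            while next_shock < r and next_shock <= session_end:   # shocks delivered before this response
                shocks += 1
                next_shock += ss_interval
            if r <= session_end:
                next_shock = r + rs_interval                      # the response postpones the shock
        while next_shock <= session_end:                          # shocks after the final response
            shocks += 1
            next_shock += ss_interval
        return shocks

    presses = list(range(0, 60, 4))   # one response every 4 s
    print(count_shocks(presses, rs_interval=5, ss_interval=10, session_end=60))  # 0: every shock avoided
    print(count_shocks(presses, rs_interval=3, ss_interval=10, session_end=60))  # > 0: same rate no longer enough

Responding every 4 s avoids every shock when the R-S interval is 5 s but not when it is 3 s, which is this card's point: a shorter response-shock interval demands a higher rate of responding.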

40
Q

One-factor theory

A

Behavior is operant. The one factor is simply the reduction of aversive events. Time is the “stimulus” that occasions an avoidance response.

41
Q

Two-factor theory

A

Operant and classical: (1) fear is classically conditioned to the passage of time, and (2) the organism responds to escape that fear/anxiety/discomfort (escape).

Avoiding the shock is part of it, but it isn't sufficient on its own.

42
Q

Immediate effects of punishment

A
  1. The punisher (the person delivering the punishment) is reinforced
  2. The punisher becomes an aversive stimulus
  3. Generalization to other physically similar stimuli
  4. Aggression (displaced aggression, counter-control)
  5. Withdrawal
  6. Suicide (extreme withdrawal)
43
Q

Displaced aggression

A

When one has no control over the aversive, they might channel aggression towards another object or organism. Examples:

  • Bad grade leads to yelling at roommate
  • Verbal abuse because of low SES
44
Q

Counter-control

A

When there is no control over the aversive, aggression can be directed at the source of the punishing consequences in order to stop it. Examples:

  • Rebellion against controlling parents
  • Fighting back against bully
45
Q

Non-standard aversives

A

Inequality (unequal outcomes for equal effort). Leads to less productivity, more violence, and greater risk of depression, anxiety, and suicide.