Exam 2 Flashcards

1
Q

Positive Reinforcement

A

An instrumental response increases likelihood as a result of a stimulus being presented following the response and not presented in the absence of the response

2
Q

Negative Reinforcement

A

If the instrumental response is performed, an aversive stimulus is terminated or prevented from occurring.

3
Q

Omission

A

The instrumental response prevents delivery of a pleasant or appetitive stimulus

4
Q

Punishment

A

Occurrence of the instrumental response results in delivery of an aversive stimulus

5
Q

Magazine Training

A

Classical Conditioning (sign tracking): Subject’s familiarization with the mechanism that delivers the reinforcer.

6
Q

Shaping

A

Development of a new response through positive reinforcement of successive approximations.

7
Q

Discrete Trial Procedures

A

The operant response is performed once per trial, after which the animal is removed from the apparatus. Conditioning is measured by running speed and latency to leave the start box. ex) Mazes - runway, T-maze, radial arm maze, Morris water maze.

8
Q

Free-Operant Procedures

A

The animal can repeat the instrumental behavior over and over again. Conditioning is measured by the frequency of the operant response. ex) Skinner box.

9
Q

Fixed Ratio Schedule

A

A set number of responses (X) is required to get the consequence. Easier to extinguish. ex) Rewards card

10
Q

Variable Ratio Schedule

A

A varying number of responses is required to get the consequence. Harder to extinguish. ex) Gambling

11
Q

Fixed Interval Schedule

A

A set amount of time must pass before a response is reinforced. Responding increases as the end of the interval approaches. Easier to extinguish. ex) College students studying as an exam nears

12
Q

Variable Interval Schedule

A

A varying amount of time must pass before a response is reinforced. Responding is steadier than on a fixed interval schedule. Harder to extinguish. ex) Waiting for an elevator

13
Q

Best conditions for Operant Conditioning

A

The consequence is swift, certain, intense (a large reward or a severe punisher), and has belongingness with the response.

14
Q

Learned Helplessness

A

A sense of powerlessness gained from a persistent failure to succeed or a traumatic event.

15
Q

Depression

A

Associated with learned helplessness

16
Q

Procrastination

A

Putting things off, which leads to progressively more anxiety building up

17
Q

Internal Locus of Control

A

Outcomes occur due to your own efforts

18
Q

External Locus of Control

A

Outcomes are outside of the individual’s control. More likely to use emotion-focused coping strategies.

19
Q

Control

A

When people don’t feel like they have control, they stop trying. ex) poverty, depression, combat, and procrastination.

20
Q

Concurrent Schedule

A

The subject chooses to respond on one of two keys, each with a different schedule. ex) Variable Interval vs. Fixed Ratio 10, then measure how the animal distributes its pecks.

21
Q

Matching Law

A

You will do things in proportion to the amount of reinforcement that you receive from it.
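The matching law can be stated as B1/(B1+B2) = r1/(r1+r2), where B is the rate of behavior on an option and r is the rate of reinforcement it earns. A minimal sketch of the predicted proportion (the function name and the rates are illustrative, not from the deck):

```python
def matching_prediction(r1: float, r2: float) -> float:
    """Matching law: the predicted share of behavior allocated to option 1
    equals option 1's share of the total reinforcement obtained."""
    return r1 / (r1 + r2)

# If key 1 earns 30 reinforcers/hour and key 2 earns 10/hour,
# the matching law predicts 75% of pecks go to key 1.
print(matching_prediction(30, 10))  # 0.75
```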

22
Q

Vollmer & Bourret

A

An application of the matching law to evaluate the allocation of two- and three-point shots by college basketball players.

23
Q

Behavior Therapy

A

A range of treatments and techniques used to change an individual’s maladaptive responses to specific situations.

24
Q

Choice With Commitment

A

Once a choice is made, the subject can’t alternate back and forth between options.

25
Q

Self Control

A

Preference for a large, delayed reward over a small, immediate reward. Deficits are seen in drug addiction and ADHD.

26
Q

Stimulus Generalization

A

How well does a stimulus similar to the CS elicit a CR? ex) Marketing, Stereotypes, Sirens/Horns, Phone dings

27
Q

Stereotypes/Prejudice

A

An example of stimulus generalization - applying one experience to all similar experiences.

28
Q

Second-order (Higher order) conditioning

A

A stimulus that predicts the CS comes to elicit a CR.
ex) CS2 -> CS1 -> US

29
Q

What makes an effective Conditioned Stimulus and Unconditioned Stimulus?

A

Novelty, Intensity and Salience, Belongingness, and Biological strength.

30
Q

Stimulus Substitution (Jenkins & Moore)

A

The CS-US association turns the CS into a substitute for the US, so the US determines the CR. ex) Pigeons showed different CRs depending on whether the US was food or water, but the CR is not always the same as the UR.

31
Q

CS Determines CR (Timberlake & Grant)

A

Stimulus substitution is not always correct. ex) A rat was used as a CS to predict food for another hungry rat; the CR was social behavior directed at the CS rat rather than eating, showing that the CS shapes the form of the CR.

32
Q

Extinction

A

Decreasing the strength of the CR. Takes longer than acquisition and is vulnerable to spontaneous recovery and disinhibition. Not unlearning, but learning something new

33
Q

Anxiety Disorders

A

Persistent distressing anxiety or maladaptive behaviors that reduce anxiety. ex) Phobias and OCD

34
Q

Systematic Desensitization (Treatment)

A

Presents the CS (feared object) multiple times without the US occurring. CR begins to diminish. Done gradually in attempt to extinguish the old CR of fear and replace it with a less dysfunctional CR.

35
Q

Flooding (Treatment)

A

Intense exposure to anxiety-triggering stimuli. ex) Fear Factor

36
Q

Aversive Conditioning (Treatment)

A

Replacing a positive response with a negative response. ex) Antabuse and alcohol (Antabuse makes you sick if you use alcohol after taking it)