Ch.6 (7) Conditioning & Learning Flashcards

1
Q

Define Reinforcement

A

Any procedure that increases the likelihood of a behavior occurring

2
Q

Name Two Ways of Learning

A

Classical Conditioning and Operant Conditioning

3
Q

Define Classical Conditioning

A

Interested in what happens BEFORE a response is made (the associations formed between stimuli)

4
Q

Who developed Classical Conditioning?

A

Pavlov

5
Q

Define an Unconditioned Stimulus

A

A stimulus that naturally leads to an unconditioned response (a reflex reaction)

6
Q

Define an Unconditioned Response

A

An unlearned response (e.g., in Pavlov’s experiment, the salivation due to the meat)

7
Q

Define a Conditioned Stimulus

A

A stimulus that comes to elicit a conditioned response (e.g., in Pavlov’s experiment, THE BELL, which was presented along with the meat/food until it produced the response on its own)

8
Q

Define a Conditioned Response by example (Pavlov’s experiment)

A

The salivating after hearing the bell

9
Q

State a necessity/requirement for a Conditioned Stimulus

A

The stimulus must be presented first, or right before, the unconditioned stimulus (no more than 45 seconds before it)

10
Q

Define Contiguity, both in general and as it applies to Classical Conditioning

A

Nearness in time and space; in classical conditioning, this means the CS (Conditioned Stimulus) has to occur right before the US (Unconditioned Stimulus)

11
Q

Define Higher Order Learning

A

When another stimulus is introduced and paired with an existing CS, so that it too comes to elicit the conditioned response

12
Q

Define Extinction

A

When the conditioned behavior becomes extinct, or stops occurring.

13
Q

What is done to create Extinction?

A

The US has to be withheld
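
Taken together, the classical conditioning cards describe a simple procedure: pair the CS with the US repeatedly (acquisition), then keep presenting the CS while withholding the US (extinction). Below is a minimal numerical sketch of that procedure; the "associative strength" variable and the alpha/beta rates are illustrative assumptions for this sketch, not values or a model taken from the chapter.

```python
# Toy sketch of acquisition followed by extinction (illustrative only).
# "strength" stands in for how reliably the CS (bell) elicits the CR (salivation);
# alpha and beta are made-up learning/decay rates, not values from the chapter.

def run_trials(strength, trials, us_present, alpha=0.3, beta=0.2):
    """Update associative strength over a block of trials."""
    history = []
    for _ in range(trials):
        if us_present:              # acquisition: the CS is followed by the US
            strength += alpha * (1.0 - strength)
        else:                       # extinction: the US is withheld after the CS
            strength -= beta * strength
        history.append(round(strength, 3))
    return strength, history

strength = 0.0
strength, acquisition = run_trials(strength, trials=10, us_present=True)
strength, extinction = run_trials(strength, trials=10, us_present=False)
print("acquisition:", acquisition)  # climbs toward 1.0 while CS and US are paired
print("extinction: ", extinction)   # decays back toward 0 once the US is withheld
```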

14
Q

Define Spontaneous Recovery

A

When, after a rest period following extinction, reintroducing the CS again leads to a CR (Conditioned Response)

15
Q

Define Stimulus Generalization

A

If a stimulus similar to the CS is presented, it will produce the same response

16
Q

Define Stimulus Discrimination

A

Refers to the ability to tell the difference between similar stimuli and respond only to the relevant one

17
Q

Define Vicarious Learning

A

Learning through the actions of others

18
Q

Define Operant Conditioning

A

Interested in what happens AFTER a response is made

19
Q

Explain the difference between Classical Conditioning and Operant Conditioning

A

Classical conditioning is interested in what happens BEFORE a response, and Operant Conditioning is interested in what happens AFTER a response is made

20
Q

Define Positive Reinforcement

A

When you provide this, you are providing something appealing (a reward) after the behavior, which increases the likelihood of that behavior occurring

21
Q

Define Negative Reinforcement

A

When this is provided, the likelihood of a behavior occurring increases because the behavior lets you avoid or escape something unpleasant

22
Q

Provide an example of Positive Reinforcement

A

Giving candy or money if/when you participate in class

23
Q

Provide an example of Negative Reinforcement

A

Participating in class in order to avoid losing points or having to stay longer (the unpleasant consequence is avoided, so participation increases)

24
Q

Define Punishment

A

When you provide this, you are decreasing the likelihood of a behavior occurring

25
Q

Provide an example of Punishment

A

Timeout for kids

26
Q

What is the difference between Negative Reinforcement and Punishment

A

Unlike Punishment, in Negative Reinforcement you have the ability to turn things around: you avoid the unpleasant consequence, and the desired behavior increases

Unlike Negative Reinforcement, in Punishment there is no turning things around or changing the outcome, and no reward; the behavior simply decreases

27
Q

Provide an example of Shaping

A

Potty training kids (with praise)

28
Q

What are the types of Schedules of Reinforcement

A
1. Continuous Reinforcement Schedule
2. Partial Reinforcement Schedule

29
Q

Define Continuous Reinforcement Schedule

A

The reward follows every correct response

30
Q

Define Partial Reinforcement Schedule

A

When the reward does not follow every correct response

31
Q

What is a characteristic/difference in Partial Reinforcement Schedule in comparison to Continuous Reinforcement Schedule?

A

A Partial Reinforcement Schedule is more resistant to extinction (harder to stop) than a Continuous Reinforcement Schedule

32
Q

Name the different types of Partial Reinforcement Schedule

A
  1. Fixed Ratio
  2. Variable Ratio
  3. Fixed Interval
  4. Variable Interval
33
Q

Define a Fixed Ratio

A

The ratio of rewards to responses is set/fixed

For example, you get a reward on every 3rd, 5th, or 8th response

34
Q

Define Shaping

A

When you shape a behavior, you reward the behavior in small, incremental steps

35
Q

Define a Variable Ratio

A

The ratio of rewards to responses is NOT set/fixed
Variable means it varies: rewards come after approximately some average number of responses

36
Q

Define Fixed Interval

A

The ratio of rewards to TIME is set/fixed
Example - payday
Interval means passage of time

37
Q

Define Variable Interval

A

The ratio of rewards to TIME is NOT set/fixed

Interval means passage of time
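
The four partial schedules above differ only in what triggers the reward (a count of responses vs. the passage of time) and in whether that requirement is fixed or varies around an average. The sketch below restates those four rules; the function names and the example numbers (every 5th response, 60 seconds) are illustrative assumptions, not values from the chapter.

```python
import random

# Minimal sketch of the four partial reinforcement schedules (illustrative only).
# The thresholds (5 responses, 60 seconds) are made-up example values.

def fixed_ratio(response_count, n=5):
    """Reward every nth response (here, every 5th)."""
    return response_count % n == 0

def variable_ratio(n_avg=5):
    """Reward each response with probability 1/n_avg, so rewards arrive
    after roughly n_avg responses on average."""
    return random.random() < 1.0 / n_avg

def fixed_interval(seconds_since_reward, interval=60):
    """Reward the next response once a fixed amount of time has passed."""
    return seconds_since_reward >= interval

def variable_interval(seconds_since_reward, interval_avg=60):
    """Reward once a randomly drawn amount of time has passed
    (the requirement averages interval_avg seconds)."""
    required = random.uniform(0, 2 * interval_avg)
    return seconds_since_reward >= required

# Example: would the 10th response, made 45 seconds after the last reward,
# be rewarded under each schedule?
print(fixed_ratio(10), variable_ratio(), fixed_interval(45), variable_interval(45))
```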

38
Q

Define Observational/Vicarious Learning

A

When you learn through the actions of others

39
Q

Define Superstitious Behavior

A

A behavior repeated because it seems to produce reinforcement, even though it is actually unnecessary

40
Q

Define Schedule of Reinforcement

A

A rule or plan for determining which responses will be reinforced