Learning Flashcards

Chapter 8

1
Q

What is the definition of learning?

A

Learning is a change in the mechanisms of behavior, not a change in behavior directly.

Mechanisms of behavior: the underlying machinery (the neural system) that makes behavior happen.

2
Q

Briefly define the terms “learning” and “behavior.”

A

Learning: Relatively permanent change in an organism’s
behavior as a result of some type of experience.

Behavior: Any activity of the organism that can be either
directly or indirectly measured.

3
Q

Why is the learning‐performance distinction important?

A

Behavior is determined by many factors in addition to learning (e.g., motivation to respond).
Do not equate learning with a change in behavior: it is easy to think that if there is no change in behavior, then there is no learning.

4
Q

What is the difference between classical and operant conditioning?

A

Classification of basic mechanisms of learning:

Pavlovian or classical conditioning: How we learn about the relationships between events that occur independently of our behavior.

Operant or instrumental conditioning: How we learn the
relationships between our actions (behavior) and their
consequences.

These relationships are the bases for all complex learning processes in an organism’s life.
These processes are thought to be common across many species and they do not require any sort of organized instruction.

5
Q

What are “elicited behaviors”?

A

Behavior that occurs in response to some environmental event:
Stimulus --(brings forth)--> Response
• Reflexes and fixed/modal action patterns (innate)
• Responses to a single stimulus (habituation and sensitization)
• Learning about paired events (classical and operant conditioning)

6
Q

What is the simplest form of elicited behavior?

A

Elicited behavior occurs in response to a stimulus.
The simplest form of elicited behavior is reflexive behavior: simple, unlearned, and automatic.
Reflexes are hard-wired: we are born with the neural circuits that produce them.

7
Q

Define “reflex.” Briefly define the startle reflex, the orienting response, and the flexion reflex.

A

A reflex is a relatively simple, involuntary response to a stimulus, closely tied to survival.
Examples: the startle response, the orienting response, and the flexion response (each defined on the following cards).

8
Q

Startle response

A
  • Defensive reaction to a sudden, unexpected stimulus.
  • Involves the automatic tightening of skeletal muscles as well as various hormonal and visceral (internal organ) changes.
  • Reaction designed to ready us for fight or flight if the unexpected stimulus should prove dangerous.
9
Q

Orienting response

A

Relatively major body movement, such as when we automatically turn in response to an unfamiliar (or familiar) stimulus

10
Q

Flexion response

A

We automatically jerk our hand or foot away from a hot or sharp object that we have inadvertently contacted.

11
Q

About the “reflex arc:”

a. Provide a brief definition.
b. What is the minimum number of neurons involved in a reflex? List their names.

A

The reflex arc is the neural structure that underlies many reflexes. At minimum it involves three neurons: a sensory neuron, an interneuron, and a motor neuron, connected through the spinal cord. In its simplest form, a reflex may not involve the brain at all (e.g., the flexion response).

12
Q

What are the three main characteristics of reflexes discussed in class?

A

Characteristics of reflexes:
• Highly stereotypic in form, frequency, developmental appearance, and strength.
• They may vary across individuals.
• The same reflex might be observed across species.

13
Q

Define “Fixed Action Patterns (FAP)” (a.k.a. “Modal Action Patterns”).

A

Characteristics:
• An instinctive, 'fixed' sequence of behaviors elicited by a stimulus.
• Tend to be particular to one species.
• Involve most or all of the organism, rather than a single part.
• Relatively variable across individuals.

14
Q

What is the specific name for the stimulus that elicits a given FAP?

A

The eliciting stimulus is called a sign stimulus or releaser.

Examples: gulls' egg-retrieval behavior, and aggressive behavior against other males in the three‐spined stickleback fish.

Examples of reflexes and modal action patterns in humans: rooting or sucking reflex, Moro reflex, grasping reflex, head turning, orienting response, startle response, flexion response, eyebrow flash, gag reflex, vomiting, yawning, smiling, crying ...

15
Q

Define “habituation” and “sensitization.” Explain why these terms represent opposite types of behavior.

A

Habituation: A decrease in the strength of an elicited behavior due to repeated presentations of its
eliciting stimulus.

Sensitization: An increase in the strength of an elicited behavior due to repeated presentations of its
eliciting stimulus.

16
Q

Define classical conditioning. Describe Pavlov’s basic experiment (using the metronome and meat powder).

A

Origins: Ivan P. Pavlov
• Russian physiologist; Nobel Prize in Physiology or Medicine.
• Initially interested in the process of digestion (salivary and stomach secretions).
• Procedure: restrained dogs received meat powder, and their salivary and stomach secretions were collected through a tube.
• Problem: the dogs salivated in the mere presence of the researchers. Why? Acquired (learned) behavior?
• Basic experiment: a metronome (a neutral stimulus) was repeatedly sounded just before meat powder was delivered; after several pairings, the metronome alone elicited salivation.
17
Q

Define US, UR, CS, and CR.

A

Pavlov explained how the new learning is acquired by means of the classical conditioning procedure:

Prior to conditioning: NS → no response; US → UR
During conditioning: NS + US → UR (the NS becomes associated with the US and thereby becomes a CS)
After conditioning: if the CS is presented by itself, it elicits responding (the CR)

Definitions:
•US: Stimulus that naturally elicits a response (i.e., without
learning).
•UR: Response naturally elicited by the US (i.e., without
learning).
•CS: Initially neutral stimulus that comes to elicit a response (i.e., with learning).
•CR: Response that comes to be elicited by the CS (i.e., with learning).

18
Q

What is the difference between “excitatory conditioning” and “inhibitory conditioning”?

A

• Contingencies can be:
- Positive: The CS and US occur together more often than
apart (Excitatory conditioning: responses being
produced).
- Negative: The CS and US occur together less often than
apart (Inhibitory conditioning: responses being
prevented).

19
Q

What are the four excitatory conditioning procedures that result from different temporal arrangements of the CS and US?

What are the factors that determine the effectiveness of excitatory conditioning?

A

Delay conditioning, Trace conditioning, Simultaneous conditioning, and Backward conditioning

Factors that determine the effectiveness of excitatory conditioning
• Number of CS‐US pairings
More pairings (experiences) result in stronger CR.
But, first trials are more effective than later ones.
Response strength increases until it reaches an asymptote (no more increases in responding)
• Temporal arrangement of the CS and US
Defined in terms of WHEN the US is presented with respect to the CS.

20
Q

Delay conditioning:

A

The CS is presented and remains on until the US is presented.
This is the procedure that produces the strongest conditioned response.

21
Q

Trace conditioning

A

The CS begins and ends some time before the US is
presented.
It does not produce a strong conditioned response. The
longer the interval between CS termination and US
presentation, the weaker the responding.

22
Q

Simultaneous conditioning

A

The CS and US are presented and terminated at the same
time.
Very weak conditioned responding: No time for the
organism to anticipate the US and change its behavior
accordingly.

23
Q

Backward conditioning

A
  • The CS immediately follows the US.
  • Weak responding after the first pairings, inhibitory responding afterwards: with repeated pairings the CS becomes a safety signal for the non‐presentation of the US in the near future.
  • CR strength depends on anticipation of an impending US.
24
Q

What is “contiguity”? How can contiguity favor the development of conditioning?

A

• CS‐US contiguity
Defined in terms of HOW CLOSE the presentation of the US is to the presentation of the CS.

The closer together two events are presented, the more strongly you associate them (the better you learn them).

In general, conditioning (responding) is better if:
•The CS is presented a few moments before the US is delivered.
•There is a very short gap (better yet, no gap) between CS
termination and US onset.

25
Q

Define “contingency” and explain the difference between “positive” and “negative” contingencies.

A

CS‐US contingency
Defined in terms of HOW OFTEN the CS and US occur
together vs. apart.
Contingencies can be:
•Positive: The CS and US occur together more often than
apart (Excitatory conditioning: responses being produced).
•Negative: The CS and US occur together less often than apart (Inhibitory conditioning: responses being prevented).
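A common way to make contingency quantitative (a standard formalization, not stated on the card) is the ΔP statistic: ΔP = P(US | CS) − P(US | no CS). A positive ΔP corresponds to excitatory conditioning, a negative ΔP to inhibitory conditioning. A minimal sketch, using arbitrary example trial counts:

```python
def delta_p(us_with_cs, no_us_with_cs, us_without_cs, no_us_without_cs):
    """Contingency as Delta-P: P(US | CS) - P(US | no CS),
    computed from counts of the four trial types."""
    p_us_given_cs = us_with_cs / (us_with_cs + no_us_with_cs)
    p_us_given_no_cs = us_without_cs / (us_without_cs + no_us_without_cs)
    return p_us_given_cs - p_us_given_no_cs

# Positive contingency (CS and US mostly occur together): excitatory
positive = delta_p(9, 1, 2, 8)   # 0.9 - 0.2, greater than zero

# Negative contingency (CS and US mostly occur apart): inhibitory
negative = delta_p(2, 8, 9, 1)   # 0.2 - 0.9, less than zero
```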

26
Q

What is the relationship between number of trials and development of conditioning? Are all trials equal?

A

Factors that determine the effectiveness of excitatory
conditioning
• Number of CS‐US pairings
-More pairings (experiences) result in stronger CR.
-But, first trials are more effective than later ones.
-Response strength increases until it reaches an asymptote
(no more increases in responding)

27
Q

Define acquisition and extinction. What is the difference between them? Can you draw the shape of an acquisition curve and an extinction curve?

A

Acquisition of conditioned responses

Acquisition = Process of developing or strengthening a conditioned response.
First pairings elicit more responding than later ones.
Changes in responding gradually decrease until an
asymptotic level of responding has been reached.

Extinction of conditioned responses
Extinction = Process of weakening or eliminating a conditioned response.
First extinction pairings weaken responding more than later ones.
Changes in responding gradually decrease until no responding is observed.
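The shape of both curves can be sketched with a simple delta-rule update, in the spirit of the Rescorla–Wagner model (the learning rate and asymptote values below are arbitrary assumptions): each trial changes response strength by a fixed fraction of the remaining distance to the asymptote, so the earliest trials produce the largest changes.

```python
def learning_curve(trials, asymptote, rate, start=0.0):
    """Delta-rule learning curve: each trial closes a fixed fraction
    of the gap between current response strength and the asymptote."""
    strength, curve = start, []
    for _ in range(trials):
        strength += rate * (asymptote - strength)  # biggest change on early trials
        curve.append(strength)
    return curve

acquisition = learning_curve(10, asymptote=1.0, rate=0.3)            # rises toward 1.0
extinction = learning_curve(10, asymptote=0.0, rate=0.3, start=1.0)  # decays toward 0.0
```

The acquisition curve is negatively accelerated (rising toward its asymptote), and the extinction curve mirrors it (decaying toward zero), matching the card's description.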

28
Q

What is an asymptote?

A

The asymptote is the maximum level of responding: the point at which further pairings produce no more increases in responding.

29
Q

Why do we say that extinction is not the same as unlearning (weakening and breaking the connection between neurons)?

A

Extinction ≠ unlearning.
Extinction is not the weakening (gradual erasing) of an existing association: memories of the learning experience remain after extinction.

30
Q

Explain “rapid reacquisition” and “spontaneous recovery”. What do they tell us about extinction?

A

Evidence that learning remains after extinction:

• Rapid reacquisition:
Acquisition after extinction is much faster than if the CS had not been trained before.
Not new learning, but bringing back what was learned.

• Spontaneous recovery:
Return of the CR after extinction due simply to the passage of time.
If the CR returns, it cannot have been erased (unlearned).
In extinction, a new memory (CS–no US) competes with (interferes with) the old one (CS–US).

31
Q

Why do we talk about specificity of classical conditioning?
Define “overshadowing” and “blocking”.
Make sure you can tell them apart!

A

In some situations, conditioning is not evident despite repeated pairings of the CS and US.
E.g., overshadowing: a compound of stimuli is experienced, and the more intense element of the compound overshadows (prevents conditioning of) the less intense element.
E.g., blocking: one element of a compound stimulus is a better predictor of the US than the other, and the better predictor blocks (prevents responding to) the other element.

What you learn about paired events depends on what else you know about them.

32
Q

Define the phenomena of higher‐order conditioning and sensory preconditioning.
Make sure you can differentiate them!

A

Extensions of classical conditioning:
• Higher‐order conditioning (e.g., second‐order conditioning): a stimulus that is associated with a CS can also become a CS.
• Sensory preconditioning: when one stimulus is conditioned as a CS, other stimuli previously associated with it can also become CSs.

33
Q

Describe Thorndike’s “law of effect”.

A

E. L. Thorndike’s “law of effect”

Learning progresses gradually from previous experiences,
by means of the “law of effect”.

The law of effect states that behaviors leading to a
satisfactory state of affairs are strengthened or
“stamped in,” while behaviors leading to an unsatisfactory or annoying state of affairs are weakened or “stamped out.”

The probability of a behavior occurring is a function of the
consequences that such behavior had in the past.

34
Q

What are the three components of operant conditioning? Define each one of them (you must recognize the abbreviations S^D, R, and S^R).

A
B. F. Skinner's three components of operant conditioning (S^D → R → S^R):
-S^D = discriminative stimulus
-R = response
-S^R = reinforcing stimulus
35
Q

S^D= discriminative stimulus

A

• The antecedent, or discriminative stimulus (S^D), precedes the response and signals whether a consequence is available.

Discriminative stimuli can signal:
• Reinforcement: appetitive S^R (pleasant)
• Punishment: aversive S^R (unpleasant)
• Extinction: no S^R
36
Q

R= response

A

The behavior or response (R) that produces a certain consequence.
Operant response = a class of emitted responses that result in a consequence; the response is goal oriented.

Operants are quantitative units of behavior defined in terms of their effect on the environment.

37
Q

S^R=reinforcing stimulus

A

The consequence, or reinforcing stimulus (S^R), that increases or decreases the probability of future occurrence of the response.
Reinforcing stimuli are defined by their effects on behavior, not by whether the subject likes them (nor by anyone's intention).
Consequences can be broadly classified as reinforcers or punishers:
Reinforcers increase the probability that the behavior will occur in the future.
Punishers decrease the probability that the behavior will occur in the future.
Operant behavior can also be extinguished (R → no S^R).

38
Q

What is an “operant”?

A

An operant is a response that occurs spontaneously (is emitted rather than elicited) and is identified by its reinforcing or inhibiting effects on the environment.

39
Q

When it comes to S^Ds, what can they signal?

A

Discriminative stimuli can signal:
• Reinforcement: appetitive S^R (pleasant)
• Punishment: aversive S^R (unpleasant)
• Extinction: no S^R

40
Q

What is the difference between reinforcers and punishers?

A

Reinforcers increase the probability that the behavior
will occur in the future.

Punishers decrease the probability that the behavior
will occur in the future.

41
Q

How does behavior differ when it is reinforced immediately vs. with a delay?

A

Immediate vs. delayed reinforcement
How contiguous are the response and consequence?
In general terms, the more immediate the reinforcer, the
greater its effect on behavior.

Why? Behavior is a continuous stream of responses: long delays may result in reinforcement of the wrong response.

42
Q

What is the difference between primary and secondary reinforcers?

A

Primary vs. secondary reinforcers
•Primary (unconditioned) reinforcers: Things that are innately reinforcing (without training).
E.g., getting food, water, attention, social interaction…

•Secondary (conditioned) reinforcers: Things that acquire
reinforcing value through learning.
E.g., getting money, fine clothing, good grades at school…

43
Q

When it comes to secondary reinforcers: How are they established?

A

Secondary reinforcers are established through Pavlovian and operant conditioning: an initially neutral stimulus acquires reinforcing value by being paired with existing reinforcers.

44
Q

What is shaping? What are the two principles on which shaping is based? Why is behavioral variability important for shaping to occur?

A

Shaping: the gradual creation of new operant behavior through reinforcement of successive approximations to that behavior (directing the subject toward the response we require).

For shaping to be effective, the subject must know that reinforcement is available in the situation, and the subject must exhibit behavioral variability: without variability, there are no new response forms to reinforce.

Shaping is based on two principles:
• Reinforcement of successive approximations
• Nonreinforcement of early response forms

45
Q

What is chaining? How is chaining implemented? What is the difference between forward and backward chaining?

A

Chaining: training a subject to perform a response chain (a connected sequence of behaviors).
In a response chain, the result of each response (except the last) is:
• a conditioned reinforcer for that response, and
• an S^D for the next response.

The reinforcer for the last response maintains the reinforcing effectiveness of the conditioned reinforcers in the chain.
Task analysis: breaking the chain into its component response elements.
In forward chaining, the links are trained starting with the first response in the sequence; in backward chaining, training starts with the last response, so the terminal reinforcer is always close at hand.

46
Q

What are behavioral contrast effects and why are they important?

A

The impact of reinforcers on behavior is not absolute but relative.
Behavioral contrast effects: behavior changes when the amount of reinforcement changes over time.

Behavioral contrast includes negative and positive contrast effects.

47
Q

Describe and give some example of negative and positive contrast effects.

A

If current reinforcement is less than past reinforcement, the individual's responding decreases (negative contrast effect). E.g., a rat shifted from a large to a small food reward responds less than a rat that has always received the small reward.

If current reinforcement is greater than past reinforcement, the individual's responding increases (positive contrast effect). E.g., a rat shifted from a small to a large food reward responds more than a rat that has always received the large reward.

48
Q

What is the anticipatory contrast effect?

A

Anticipatory contrast effect: responding changes in advance when a cue signals that the rate of reinforcement will soon change.

49
Q

How can self‐control be defined operationally?

A

Self-control: being able to delay gratification; that is, choosing the large but delayed reward over the small but immediate reward.

50
Q

What is temporal discounting? How does it influence behavior? Give some examples.

A

Temporal discounting: the tendency to discount the value of rewards as they become more distant in time.

We discount rewards that are far away in time and give greater value to rewards that are close in time.
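Temporal discounting is often modeled with a hyperbolic function, V = A / (1 + kD), where A is the reward amount, D its delay, and k a discount rate (the value of k below is an arbitrary assumption). The same formula also captures the self-control problem from the previous card: a larger-later reward wins while both rewards are distant, but loses once the smaller reward becomes immediate (preference reversal).

```python
def hyperbolic_value(amount, delay, k=0.5):
    """Hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1 + k * delay)

# Both rewards far in the future: the larger-later reward is preferred...
far_small = hyperbolic_value(5, delay=10)    # small reward, 10 periods away
far_large = hyperbolic_value(10, delay=13)   # large reward, 13 periods away

# ...but when the small reward becomes immediate, preference reverses.
now_small = hyperbolic_value(5, delay=0)     # small reward, available now
soon_large = hyperbolic_value(10, delay=3)   # large reward, 3 periods away
```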

51
Q

Describe the Stanford marshmallow experiment by Walter Mischel. Why is it important?

A

Stanford marshmallow experiment (Walter Mischel):
• A series of studies on delayed gratification in the late 1960s and early 1970s.
• A child was offered a choice between one small immediate reward or two small rewards delayed by about 15 minutes.
• In follow-up studies, the researchers found that children who were able to wait longer for the preferred rewards tended to have better life outcomes (SAT scores, jobs, income, ...).

52
Q

Appetitive conditioning and Aversive conditioning

A

The US motivates the organism to respond and reveals whether and how learning occurred. The US can be appetitive or aversive.

-Appetitive conditioning: the US is an event that organisms usually seek out.

-Aversive conditioning: the US is an event that organisms usually avoid.

53
Q

Appetitive procedures and Aversive procedures

A

Appetitive procedures: Sign tracking (animals approach and make contact with stimuli that are related to food), head entries, and lever presses.

Aversive procedures: Conditioned emotional response
(CER) procedure, eyeblink conditioning