Lectures 10-11 Learning and conditioning Flashcards

1
Q

What is learning? (definition)

A

Change in behavior due to experience

2
Q

List the different types of learning. Which are simple?

A
  • Habituation
  • Sensitization
  • Conditioning
    –> Classical
    –> Operant
  • Observational
  • Latent
  • Insight

Simplest: habituation, sensitization

3
Q

Habituation definition

A
  • reduction in response to a stimulus when it is presented repeatedly/frequently
  • ex. your roommate plays video games at night. The first week you can’t sleep; after that you get used to it, it isn’t as much of a problem, and you can sleep
4
Q

Sensitization definition? What is it associated with?

A
  • Increase in response to a repeated stimulus
  • A small thing that, when repeated, results in a huge response
  • Ex. someone clearing their throat is annoying; the more you hear it, the more it pisses you off
  • Associated with PTSD and depression
    –> PTSD: an explosion came with a loud sound. Now the person has an excessive/inappropriate response to any loud sound, even when there is no danger associated with it
    –> Depression: the person is much more sensitive to stress, which raises cortisol and creates a bigger reaction
5
Q

Based on the Aplysia studies, what is the biological basis of habituation?

A

ex.
1. if you touch the siphon of the Aplysia, the gill withdraws/contracts
2. repeat the same touch every minute for 10-15 minutes
3. the withdrawal/contraction durations get progressively shorter
Biology:
- the sensory neurons release less glutamate
- neurons stop interacting (synapses are pruned)

6
Q

Based on the Aplysia studies, what is the biological basis of sensitization?

A
  • interneurons release serotonin onto the sensory neurons, which causes the sensory neurons to release more glutamate onto the motor neurons
  • more glutamate = stronger response (see the toy sketch below)
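A minimal toy sketch of the mechanism in these two cards (not from the lecture; the function and all numbers are illustrative assumptions):

```python
# Toy sketch of the Aplysia gill-withdrawal circuit described in these cards.
# Numbers are illustrative assumptions, not measured values.

def gill_withdrawal(glutamate_release: float) -> float:
    """Withdrawal strength scales with glutamate released onto the motor neuron."""
    return glutamate_release

# Habituation: the same siphon touch repeated every minute ->
# the sensory neuron releases less glutamate each time, so withdrawal shrinks.
release = 1.0
print("Habituation:")
for touch in range(1, 6):
    print(f"  touch {touch}: glutamate={release:.2f} -> withdrawal={gill_withdrawal(release):.2f}")
    release *= 0.7  # assumed reduction per repetition

# Sensitization: serotonin from interneurons facilitates the sensory neuron,
# which then releases MORE glutamate onto the motor neuron -> stronger withdrawal.
baseline = 1.0
serotonin_facilitation = 1.5  # assumed facilitation factor
print("Sensitization:")
print(f"  before noxious stimulus: withdrawal={gill_withdrawal(baseline):.2f}")
print(f"  after noxious stimulus:  withdrawal={gill_withdrawal(baseline * serotonin_facilitation):.2f}")
```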
7
Q

Define classical conditioning

A

Creating an association between two stimuli such that one stimulus comes to elicit the same reaction as the other

8
Q

Pavlov?

A
  • developed a method for measuring salivation; won the Nobel Prize in Physiology or Medicine for his research on digestion
  • used a metronome as a neutral stimulus paired with food to produce salivation
9
Q

What are the 4 stages (basic principles) of classical conditioning?

A
  1. acquisition
  2. generalization
  3. higher order conditioning
  4. extinction
10
Q

Acquisition?

A
  • the process of pairing the neutral stimulus with the unconditioned stimulus (which produces the unconditioned response)
  • the neutral stimulus becomes the conditioned (learned) stimulus, which now elicits the conditioned response
11
Q

Generalization?

A
  • The organism responds not only to the original CS, but also to similar stimuli
  • The response gets smaller the less similar the stimulus is to the original CS
  • Generalization vs. discrimination: discrimination is the point where the animal responds to the CS but not to other, similar stimuli
12
Q

Higher order conditioning?

A
  • association of multiple stimuli, progressively further away from the UCS (and UCR)
    ex.
    1st: sound and food
    2nd: shape and sound and food
    3rd: light and shape and sound and food

Phases:
1: association between sound and food
2: association between shape and sound (without food)
Test: association between shape and food

  • extremely common in commercials
13
Q

Extinction

A
  • continued presentation of the CS without the UCS
  • the conditioned response fades and eventually disappears
  • spontaneous recovery: the organism suddenly ‘remembers’ and shows the CR again
  • Extinction is NOT forgetting
  • rapid reacquisition: the association is relearned much faster than it was learned originally
14
Q

In the Little Albert experiment, identify the UCS, CS, NS, UCR, and CR.

A

NS - white rat
UCS - loud noise
UCR - fear (of loud noise)
CS - white rat
CR - fear (of animal)

15
Q

How was the principle of generalization shown in the Little Albert experiment?

A

He became afraid not just of the white rat, but also of other animals (cats, bunnies, dogs) and even a mask

16
Q

What are the learning sequences?

A
  1. forward learning (normal)
    a. the NS/CS is still present when the UCS appears
    b. easiest to learn
  2. forward trace pairing
    a. CS comes first, but is no longer present when the UCS appears
    b. the longer the gap, the harder it is to learn
    c. example: quack/nerf video from class
  3. simultaneous pairing
    a. CS and UCS appear at the same time
    b. still learned, but it takes many more trials
    c. if the dog gets the sound and the food at the same time, it is salivating to the food, not the sound
  4. backward pairing
    a. UCS is presented before the CS
    b. most difficult to learn
    c. hardest to measure (the dog is already salivating)
17
Q

Is it easier to learn an association with forward learning or backward learning? Explain your answer.

A

Forward learning!
The CS is present when the UCS appears, so the animal anticipates the UCS when exposed to the CS. With backward pairing, the CS is not as important, because the animal already has the response to the UCS first.

18
Q

Extinction of a behavior is different from forgetting a behavior. Give an experimental example showing that extinction is different from forgetting.

A
  • Pavlov’s dogs had an association between the sound and food (and therefore salivation)
  • if the sound were repeatedly presented without the food, they would eventually stop reacting to the sound
  • later, if the sound is followed by the food again, the dog will rapidly reacquire the association and the CR, showing the learning was suppressed rather than forgotten
19
Q

What is second order conditioning? Give an example of a practical use of second order conditioning.

A
  • an association involving more than 2 stimuli
  • requires an existing association between the original NS and the UCS
  • once that NS becomes CS1, build a new connection between CS1 and a second stimulus, CS2
  • CS2 then becomes linked (indirectly) to the UCS and elicits the CR
  • ex. commercials
    –> associate being with friends & music with being happy
    –> associate being with friends & music with drinking White Claws
    –> associate drinking White Claws with being happy
20
Q

Explain how we are conditioned to respond to the notifications from our phones. What type of conditioning is it?

A

It is a type of classical conditioning. We’ve been trained to associate the sound/tone from our phone with receiving a message. Because the tone comes before we see the message, it’s forward learning.

21
Q

How can phobias be explained through classical conditioning? Use fear of cats to explain your answer

A

We learn to associate a neutral stimulus (a cat) with an unconditioned stimulus. For example, if you’ve been bitten or scratched by a cat, you associate the cat (NS) with the scratch (UCS) and the fear (UCR); the cat then becomes a CS that triggers fear (now the CR) on its own.

22
Q

Describe the cellular mechanism of habituation in the Aplysia

A
  • the sensory neurons release less of the neurotransmitter glutamate, which makes the response weaker
  • neurons stop interacting (synapses are pruned), so there is no transmission and much less glutamate is released
23
Q

Imagine a toxin that blocks glutamate receptors. Which form of learning would be more affected, sensitization or habituation? Explain your answer.

A
  • Sensitization would be more affected.
  • With habituation, glutamate transmission is already being reduced, so a toxin blocking the receptors would not have much effect.
  • Sensitization is an increased response due to increased release of glutamate.
  • If the glutamate were not being received because of the toxin, it would be very difficult to learn by sensitization.
24
Q

Operant conditioning definition? Who started it?

A

We learn due to the consequences of our actions
(Originally Thorndike, continued by Skinner)

25
Q

What was Thorndike’s experiment? With what animal?

A

He designed puzzle boxes. He placed cats inside his puzzle boxes, and they had to figure out how to escape to get their reward (food)

26
Q

Based on Thorndike’s experiments, do animals have insight? How did he show it (or not show it)? (hint –> Pay attention to the graph)

A

No, animals do not have insight.
- Insight means learning by an ‘aha’ moment, NOT by trial and error.
- The cats learned only by trial and error. The graph shows a single cat and the amount of time it took to get out of the puzzle box on each trial. If the cat had learned by insight, the time would drop dramatically after the ‘aha’ moment and stay low. While it did decrease somewhat, the cat still had to figure it out each time.

27
Q

Thorndike’s law of effect? Importance? How did Skinner modify it?

A

Thorndike:
- A response followed by satisfying consequences will become more likely to occur
- A response followed by an annoying consequence will become less likely to occur

Skinner:
- stopped using ‘satisfying’ and ‘annoying’ because there was no good operational definition (OD) for them
- instead used ‘reinforcement’ and ‘punishment’
- said reinforcement always works better than punishment to modify behavior

28
Q

What are the schedules of reinforcement?

A
  • Fixed ratio (FR)
  • Variable ratio (VR)
  • Fixed interval (FI)
  • Variable interval (VI)
29
Q

fixed ratio? Example?

A

Reinforcement after every X number of responses
- most reward/loyalty programs

30
Q

variable ratio? Example?

A

Reinforcement after an average number of responses (the exact number varies unpredictably)
- lottery/slot machines

31
Q

fixed interval? Example?

A

Reinforcement after X amount of time
- salary every 2 weeks
- wait until the last minute to study (exams are every 4 weeks)

32
Q

variable interval? Example?

A

Reinforcement after an average amount of time (the exact interval varies unpredictably)
- pop quiz or attendance sheet
- approx. every 2 weeks or approx. every 3 classes
(see the sketch below comparing all four schedules)
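A hedged Python sketch of the four schedules defined in the preceding cards (illustrative only; the function names, ratios, and intervals are my own assumptions, not part of the lecture):

```python
import random

# Illustrative sketch of the four reinforcement schedules.
# Ratios/intervals are assumptions chosen to mirror the flashcard examples.

def fixed_ratio(n_responses: int, ratio: int = 5) -> bool:
    """Reinforce every `ratio`-th response (e.g., a loyalty card)."""
    return n_responses % ratio == 0

def variable_ratio(avg_ratio: int = 5) -> bool:
    """Reinforce after an unpredictable number of responses averaging `avg_ratio`
    (e.g., a slot machine): each response has a 1/avg_ratio chance of paying off."""
    return random.random() < 1 / avg_ratio

def fixed_interval(seconds_since_last_reward: float,
                   interval: float = 14 * 24 * 3600) -> bool:
    """Reinforce the first response after a fixed amount of time (e.g., biweekly salary)."""
    return seconds_since_last_reward >= interval

def variable_interval(seconds_since_last_reward: float,
                      avg_interval: float = 300) -> bool:
    """Reinforce the first response after an unpredictable delay averaging `avg_interval`
    (e.g., a pop quiz, or a new text arriving roughly every 5 minutes).
    Simplification: a fresh random delay is drawn on every check."""
    return seconds_since_last_reward >= random.expovariate(1 / avg_interval)

# Example: 20 responses on a fixed-ratio-5 schedule are reinforced on responses 5, 10, 15, 20.
reinforced = [r for r in range(1, 21) if fixed_ratio(r)]
print("FR-5 reinforced responses:", reinforced)
```

Note that the two ratio schedules depend on how many responses are made, while the two interval schedules depend only on elapsed time; that distinction is what the later cards (phone checking, fishing, Uber) rely on.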

33
Q

Which schedule is the most effective?

A

Variable ratio

34
Q

What type of schedule of reinforcement is used when you check your phone to see if you have a message? Explain your answer.

A
  • variable interval
  • you can assume that you’ll get a message approximately every 5 minutes, but not exactly
  • a ratio schedule would apply only if messages depended on the number of times you check your phone (e.g., roughly every 5th check), which isn’t the case
35
Q

What do “positive / negative / reinforcement / punishment” mean in operant conditioning?

A

Positive:
- something is added after response
Negative:
- something is removed after response
Reinforcement:
- increase likelihood of repeating behavior
Punishment:
- decrease likelihood of repeating behavior

36
Q

Explain the difference between negative reinforcement and positive punishment. Give an example of each, explaining why it is positive or negative, and reinforcement or punishment.

A

Negative reinforcement:
–> something is taken away to encourage behavior
–> ex. a hard working employee gets a day off of work (work is removed) to encourage him to keep working hard

Positive punishment:
–> something is added to discourage behavior
–> ex. an athlete is late to practice, so the coach makes him run 3 laps (laps are added to discourage being late)

37
Q

How can the use of an umbrella when it is raining be described as a positive or as a negative reinforcer?

A

Positive reinforcement: protection from the rain is added
Negative reinforcement: wetness (wet clothes, etc.) is removed

38
Q

What type of schedule of reinforcement is used when fishing?

A

Variable interval
You get the reward (a fish) after an approximate amount of time, but the time changes. It has nothing to do with how many times you repeat the behavior (casting the line or net)

39
Q

What type of schedule of reinforcement is used when driving an UBER car?

A

Fixed ratio
FR-1
You get the reward (payment) every time you do a response (complete a trip)

40
Q

Define observational learning

A
  • occurs when a person observes and imitates another’s behavior
  • extremely common in the real world
  • ex. teaching a 15-year-old to drive cannot be done with conditioning or trial and error
  • most complex behaviors are learned by exposure
41
Q

Processes of observational learning?

A
  1. attention
  2. retention
  3. motor reproduction
  4. reward
    –> vicarious reinforcement
    –> vicarious punishment
    –> seeing the consequences for the model
42
Q

What type of learning is shown in Bandura’s experiments? What are the effects of watching/playing violent video games? Explain your answer.

A

Observational learning

  • watching/playing violent video games leads children to be violent
  • Bandura’s experiment involved children watching adults be violent towards a Bobo doll. Once the children were left alone, they repeated the violent behavior they had witnessed
  • this suggests that watching/playing violent games will lead children to be violent/aggressive
43
Q

Define latent learning and give an example

A
  • Learning that is not reinforced and is not demonstrated until there is motivation to do so
  • demonstrated by Tolman’s maze

3 groups of rats:
1. got no food
2. got food
3. didn’t get food until day 11
Conclusion:
- the rats did learn; they just didn’t have a reason to show it
- once rewarded, the day-11 group ultimately ran faster than those who got the food from the beginning

44
Q

Define insight learning

A
  • a form of problem solving achieved through cognitive processes
  • not trial and error
  • Köhler experimented with chimps: they stacked boxes to reach a banana placed out of reach
45
Q

What did the experiments of Köhler and Tolman add to the different theories of learning?

A
  • contributed a lot to the idea of cognitive learning
  • learning that involves intentional thought on the part of the animal
  • Köhler –> the chimps had to problem-solve, using cognitive processes
  • Tolman –> the rats learned and retained the knowledge; they were selective about when they applied it