Learning Flashcards

(68 cards)

1
Instinctual behaviours
Examples: imprinting, homing, migratory behaviours, etc.
2
Pavlov’s experiment 1890s
NS → no response; US → UR; repeatedly pair NS and US → UR; result: CS → CR
3
Reflexive behaviours
Examples: eye-blinking, ‘sucking’ and ‘gripping’ in babies; some reflexive behaviours may disappear as you grow older
4
Habituation
Decline in the tendency to respond to stimuli that have become familiar due to repeated exposure; ensures that benign stimuli do not interrupt our activity or cause us to expend unnecessary energy
5
Classical conditioning
A neutral stimulus is repeatedly paired with a stimulus that automatically elicits a particular response, so that the previously neutral stimulus becomes a conditioned stimulus that also elicits a similar response; classical conditioning is not so much the replacement of the US by the CS as a learning mechanism in which the CS (and the CR) prepare the animal for the onset of the US and UR
6
Watson & Rayner 1920: Little Albert experiment
Conditioned fear (fear associated with certain stimuli)
7
Fetishes
A person has heightened sexual arousal in the presence of certain inanimate objects, with the object becoming a conditioned stimulus that can elicit arousal on its own; there is evidence that this is due to classical conditioning
8
Is classical conditioning the replacement of the US by the CS?
No, classical conditioning is a learning mechanism where the CS (and the CR) prepare the animal for the onset of the US and UR
9
Edwards & Acker 1972
Found that WWII veterans reacted to sounds of battle even 15 years after the war
10
Compensatory reaction hypothesis
Sometimes the UR and the CR can be opposites: insulin injections (US) deplete blood sugar levels (UR), and after a number of such injections the body reacts to the CS in the opposite way to the US (blood sugar levels go up as the body ‘prepares’ itself for the injection); this can lead to drug overdose if the usual CS is not present when the drug is taken/ administered
11
Siegel 1989
Tested the tolerance of rats for ‘overdoses’ of heroin in novel versus usual environments; rats were more likely to overdose if given the drug in a new environment rather than the usual environment in which they had received it (evidence for the compensatory reaction hypothesis)
12
Acquisition
Process by which a conditioned stimulus comes to produce a conditioned response
13
Trace/ forward conditioning
CS comes before US, but there is a gap between them; not as effective as delayed forward conditioning
14
Simultaneous conditioning
CS and US start and end together; often fails to produce a CR
15
Backwards conditioning
CS begins after US; least effective form of classical conditioning
16
Delayed forward conditioning
Conditioned stimulus comes just before/ overlaps with the unconditioned stimulus; most effective form of classical conditioning
17
Contingency
How well the conditioned stimulus predicts the unconditioned stimulus
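Note (not part of the original cards): contingency is often formalised as ΔP = P(US | CS) − P(US | no CS). The short Python sketch below, using made-up trial counts, shows how a CS that is a good predictor of the US gives a ΔP near 1, while an uninformative CS gives a ΔP near 0.

```python
# Contingency expressed as delta-P = P(US | CS present) - P(US | CS absent).
# The trial counts below are hypothetical, purely for illustration.

def delta_p(us_with_cs, no_us_with_cs, us_without_cs, no_us_without_cs):
    """Return P(US | CS) - P(US | no CS) computed from raw trial counts."""
    p_us_given_cs = us_with_cs / (us_with_cs + no_us_with_cs)
    p_us_given_no_cs = us_without_cs / (us_without_cs + no_us_without_cs)
    return p_us_given_cs - p_us_given_no_cs

# CS is a good predictor: the US follows the CS on 18 of 20 CS trials,
# but occurs on only 2 of 20 trials without the CS.
print(delta_p(18, 2, 2, 18))    # 0.8 -> strong contingency
# CS is uninformative: the US is equally likely with or without the CS.
print(delta_p(10, 10, 10, 10))  # 0.0 -> no contingency, little conditioning expected
```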
18
Contiguity
Learning occurs due to temporal proximity of CS and US
19
Extinction
If the conditioned stimulus is presented repeatedly without the unconditioned stimulus, the CR gradually decreases; the rate of decrease depends on factors such as initial response strength
20
Spontaneous recovery
A CS-CR relation is extinguished; however, after a period with no CS presentations, the CS may elicit the CR again
21
Flooding therapy
Fear elicited by a CS (certain phobias) is eliminated by the process of extinction
22
Stimulus generalisation
A conditioned response formed to one conditioned stimulus will occur to other, similar stimuli
23
Generalisation gradients
Stimuli closer to the CS produce greater CRs
24
Stimulus discrimination
Occurs when an organism does not respond to stimuli that are similar to the stimulus used in training
25
Discrimination training
Organism is reinforced for responses to one stimulus and not the other; if the organism learns to discriminate, it will respond more to the reinforced stimulus
26
Systematic desensitisation
The subject is repeatedly exposed to a similar stimulus that produces a lesser reaction, so that extinction generalises to the original stimulus → generalisation of extinction rather than the acquisition of a new response
27
Blocking
Conditioning does not occur if a good predictor of the US already exists
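Note (not from the cards): acquisition, extinction, and blocking can all be illustrated with a simple error-correction learning rule in the spirit of the Rescorla-Wagner model. The sketch below is a minimal simulation only; the learning rate, trial counts, and stimulus names are arbitrary assumptions.

```python
# Error-correction (Rescorla-Wagner-style) update: on each trial, every CS
# that is present changes its associative strength V in proportion to the
# prediction error (lambda - total V of the CSs present), where lambda is
# 1.0 when the US occurs and 0.0 when it does not.

LEARNING_RATE = 0.2  # arbitrary assumption

def trial(V, present, us_occurs):
    """Update the associative strengths in V for the stimuli in `present`."""
    prediction = sum(V[cs] for cs in present)
    error = (1.0 if us_occurs else 0.0) - prediction
    for cs in present:
        V[cs] += LEARNING_RATE * error

V = {"A": 0.0, "B": 0.0}

# Acquisition: CS A is repeatedly paired with the US, so V["A"] rises towards 1.
for _ in range(30):
    trial(V, ["A"], us_occurs=True)
print("after acquisition, V[A] =", round(V["A"], 2))

# Blocking: A and B are presented together with the US. Because A already
# predicts the US, the prediction error is tiny and B acquires almost nothing.
for _ in range(30):
    trial(V, ["A", "B"], us_occurs=True)
print("blocked CS B, V[B] =", round(V["B"], 3))

# Extinction: A is presented alone without the US, so V["A"] declines again.
for _ in range(30):
    trial(V, ["A"], us_occurs=False)
print("after extinction, V[A] =", round(V["A"], 2))
```

This toy rule also shows why conditioning tracks contingency rather than mere pairing, although it does not capture phenomena such as spontaneous recovery.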
28
Higher-order conditioning
Once a stimulus has become an effective CS for a certain CR, then that stimulus can be used to condition other stimuli; effect begins to diminish after a number of trials
29
Sensory preconditioning
Two stimuli are associated with each other before conditioning, so learning occurs in the absence of a UR; subsequent classical conditioning reveals the association already learned between the two events
30
Taste-aversion learning
When an individual avoids a certain food or drink due to prior illness/ bad experience with it (substance makes them nauseous)
31
Thorndike 1874-1935: Law of Effect
Law of Effect: responses followed by positive consequences are ‘stamped in’ (become more likely), while responses followed by negative consequences are ‘stamped out’
32
Instrumental conditioning
Concerns the probability/ likelihood of a response changing as a result of its consequences; the subject emits the response in order to produce a reward
33
Skinner 1904-1990
Skinner’s version of the Law of Effect: When a response is followed by a reinforcer, the strength of the response increases, and when a response is followed by a punisher, the strength of the response decreases
34
Positive reinforcement
Adding a stimulus or event contingent upon a response increases that behaviour
35
Negative reinforcement
Removing a stimulus or event contingent upon a response increases that behaviour
36
Positive punishment
Adding a stimulus or event contingent upon a response decreases that behaviour
37
Negative punishment
Removing a stimulus or event contingent upon a response decreases that behaviour
38
Cumulative record
A graph of the cumulative number of responses over time in a conditioning experiment
39
Continuous reinforcement
Every instance of a response is reinforced; useful for shaping behaviour
40
Partial or intermittent reinforcement
A designated response is reinforced only some of the time (useful for maintaining behaviours); less likely to undergo extinction because individuals are used to the unreliable nature of the reinforcement
41
Ratio schedule
Reinforcement depends on the number of responses made; can be fixed or variable
42
Interval schedule
A response is still required, but whether or not it is reinforced depends on the passage of time; can be fixed or variable
43
Fixed-ratio schedule
Reinforcer is given after a fixed number of non-reinforced responses
44
Variable-ratio schedule
Reinforcer is given after a variable number of non-reinforced responses; the number of non-reinforced responses varies around a predetermined average
45
Fixed-interval schedule
Reinforcer is given for the first response after a fixed period of time has elapsed
46
Variable-interval schedule
Reinforcer is given for the first response after a variable time interval has elapsed; interval lengths vary around a predetermined average
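Note (not from the cards): a minimal sketch of how the four schedules decide whether a given response is reinforced. The response stream (one response per second for 300 seconds) and the schedule parameters below are hypothetical.

```python
import random

random.seed(0)  # reproducible example; all parameter values are arbitrary

def simulate(is_reinforced):
    """Run 300 responses, one per second; is_reinforced(responses_since, seconds_since) -> bool."""
    reinforcers = 0
    responses_since = 0  # responses emitted since the last reinforcer
    seconds_since = 0    # seconds elapsed since the last reinforcer
    for _ in range(300):
        responses_since += 1
        seconds_since += 1
        if is_reinforced(responses_since, seconds_since):
            reinforcers += 1
            responses_since = 0
            seconds_since = 0
    return reinforcers

# Fixed ratio 10: every 10th response is reinforced.
print("FR-10:", simulate(lambda r, s: r >= 10))

# Variable ratio, mean 10: each response is reinforced with probability 1/10,
# so the number of responses per reinforcer varies around 10.
print("VR-10:", simulate(lambda r, s: random.random() < 0.1))

# Fixed interval 20 s: the first response after 20 s is reinforced.
print("FI-20:", simulate(lambda r, s: s >= 20))

# Variable interval, mean 20 s: the required wait varies around 20 s.
wait = {"required": random.randint(1, 39)}
def variable_interval(r, s):
    if s >= wait["required"]:
        wait["required"] = random.randint(1, 39)
        return True
    return False
print("VI-20:", simulate(variable_interval))
```

Holding the response stream fixed makes the ratio/interval distinction visible in the reinforcer counts alone; real response patterns, of course, differ across schedules.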
47
Gaetani et al. 1986: Engineering compensation systems (effects of commissioned vs. wage payment)
Found that in this particular case, ratio reinforcement (commission-based pay) was more effective than interval reinforcement (hourly pay)
48
Partial-reinforcement extinction effect
Partial reinforcement schedules provide greater resistance to extinction
49
Side-effects of extinction:
1. An initial increase in response rate before it declines; 2. increased variability in response topography (the participant takes a different approach in the hope of being reinforced); example: extinction-induced aggression
50
What led to the development of Premack’s Principle?
The traditional view of reinforcement didn't take into account how the significance of stimuli can change depending on context (Premack took issue with the idea of trans-situational reinforcers - reinforcers that have the same impact regardless of the situation)
51
Premack's Principle
Behaviours are either high probability or low probability; a behaviour is reinforced when it is followed by a higher-probability behaviour
52
Mitchell & Stoffelmayr 1973
Demonstrated Premack's Principle with schizophrenic individuals
53
Honig & Slivka 1964
Found that the effects of punishment are easily generalised
54
Reynolds 1969
Found that one stimulus dimension had overshadowed learning of the other stimulus dimension
55
Herrnstein & DeVilliers 1980
Found that non-human animals could form categories or concepts from complex stimuli
56
The Kelloggs 1933
Raised the chimpanzee Gua alongside their son; Gua learned to understand some commands, but never spoke any English words
57
The Hayeses 1951
Raised the chimpanzee Vicki; she learned to produce three ‘recognisable’ words (papa, mama, and cup)
58
1960s attempts at teaching sign language to chimpanzees
Communication possible, but little evidence of syntax
59
Communication between humans and non-human primates (symbols)
Rumbaugh developed the 'Yerkish’ symbol language for non-human primates; a pygmy chimpanzee (bonobo) was tested and its results were compared with those of a human child - it could understand sentences at roughly the level of a two- to 2.5-year-old child
60
Breland & Breland 1961
Demonstrated biological constraints on instrumental conditioning (pig and raccoon)
61
Tolman & Honzik 1930
Found that rats actively process information rather than operating on a stimulus-response relationship; latent learning
62
Observational learning
Occurs when an organism's response is influenced by the observation of others’ behaviour (models)
63
Palameta & Lefebvre 1985
Found that the group exposed to observational learning was quicker to display the same behaviour (in this case, eating the seed)
64
Cook & Mineka 1987
Fear of snakes was learnt by observation, but biological constraints were also evident - no fear of flowers was learned
65
Bandura et al. (1963, 1965)
Aggression learned through modelling (observational learning)
66
Bandura et al. 1967
Showing a boy playing fearlessly with a dog helped to reduce fear (observational learning)
67
Bandura (four key processes for observational learning):
1. Attention: extent to which we focus on others’ behaviour 2. Retention: retaining a representation of others’ behaviour 3. Production: ability to actually perform actions we observe 4. Motivation: need to perform actions we witness (usefulness)
68
Poche et al. 1988
Observational learning was effective in teaching abduction-prevention skills to children, but not as effective as observational learning paired with rehearsal