Learning and Motivation Flashcards

1
Q

What is classical conditioning?

A

A type of learning in which a neutral stimulus comes to elicit a reflexive response after being associated with a stimulus that already produces that response.

2
Q

Describe Pavlov’s experiments on the physiology of dogs, and how he came to discover classical conditioning through this research.

A

During his physiology experiments, Pavlov noticed that the dogs began salivating when they saw the lab assistants, even before the food arrived. The same response occurred to a white lab coat: an irrelevant stimulus that had become associated with food and could now elicit salivation.

Other irrelevant stimuli tested: bell ringing, metronome, lights, other sounds.

This led to his theory of conditioned reflexes: certain behaviours can be modified so that they are elicited by certain stimuli.

Aim: get the subject to respond to a neutral/irrelevant stimulus in the same way it would respond to a natural stimulus, i.e. make a neutral stimulus produce an innate response.

3
Q

Describe a typical Pavlovian experiment.

A

A neutral stimulus (e.g. a ringing bell) is presented together with a stimulus that reliably produces a reflex (e.g. food). The conditioning trial is repeated many times. Then, on a critical test trial, the neutral stimulus is presented alone and the conditioned response is measured.

4
Q

What is an unconditioned response (UR)?

A

Innate response, elicited by naturally occurring stimuli.

E.g. salivation (elicited by the US), blinking, jumping/fear, stress, arousal, nausea.

5
Q

What is an unconditioned stimulus (US)?

A

A naturally occurring stimulus that elicits a response without any prior learning.

E.g. food, puff of air, loud noise, studying, sexual image, chemotherapy.

6
Q

What is a conditioned response (CR)?

A

A response that is acquired through learning.

E.g. salivation – elicited by CS.

7
Q

What is a conditioned stimulus (CS)?

A

A stimulus that was originally neutral and only elicits a response after learning has taken place.

E.g. lab coat (or bell)

8
Q

What is second order conditioning?

A

Where a new, second-order conditioned stimulus is created by pairing a new stimulus with a previously established conditioned stimulus. Once a CS has acquired a conditioned response, it can act as if it were a US itself.

Example: the bell (CS1) is used to form CS2 – a light – which then comes to elicit the CR. Or celebrity endorsement – the transfer of positive attitudes towards the celebrity to the product.

9
Q

Define acquisition.

A

The gradual formation of an association between the conditioned and unconditioned stimuli as a result of repeated presentations of the CS and the US together.

(Note: the CS comes before the US – by the principle of temporal contiguity, the strongest conditioning occurs when the CS comes ~500 milliseconds before the US.)

10
Q

Define extinction.

A

Extinction is the removal of a conditioned response by consistently and repeatedly presenting the CS alone. The CR weakens because the CS is no longer paired with the US.

i.e. just ringing the bell alone –> apparent 'unlearning'.

11
Q

Define spontaneous recovery. What does this suggest about extinction?

A

A process in which a previously extinguished response reemerges following presentation of the CS.

This suggests that extinction inhibits but does not break the associative bond. We know that it is not simply ‘unlearning’ – something more complex. People can relapse.

12
Q

How does classical conditioning relate to human psychology more generally?

A

Classical conditioning is an experimental model for studying learning processes.

It is a type of ‘associative learning’.

13
Q

What is an example that shows that classical conditioning is more complex than just transferring a reflex from one stimulus to another?

A

The CR is not always the same as the UR. (CR = UR = salivation for Pavlov.)

Conditioning of fear and anxiety (emotional states) – done in rats by warning signal + electric shock through floor. The CR and UR are very different.
US – painful/unpleasant event: electric shock
UR – escape behaviour; defensive/aggression
CS – warning signal
CR – fear – freezes

14
Q

Examples of appetitive conditioning?

A

Conditioning with good outcomes, e.g.:
  • Food preferences
  • Place preferences

15
Q

Examples of aversive conditioning?

A
  • Conditioned fear
  • Anticipatory nausea
  • Conditioned taste aversions
  • Place avoidance
16
Q

What is anticipatory nausea?

A

Chemotherapy patients – nausea transfers to stimuli associated with the chemo, such as talking to the doctor on the phone, walking into the hospital, the hospital room, or the nurse.

  • US – chemotherapy
  • UR – nausea
  • CS – nurse / hospital room
  • CR – (anticipatory) nausea
17
Q

How does conditioning work in advertising?

A

Develop preference (UR/CR) for brand/product by pairing the product (CS) with desirable qualities (US).

We develop these preferences but fail to realise that the emotional responses come from the conditioning.

18
Q

What is the relationship between exposure therapy and extinction?

A

Exposure therapy assumes that fear is a learned/conditioned response. Therefore, it can be extinguished by exposing people to the feared stimulus without any negative consequences. Over time, fear diminishes (hopefully).

Two ways:
• Systematic desensitisation – e.g. showing images
• Flooding – very real, intense exposure

19
Q

What is instrumental conditioning?

A

Instrumental conditioning (aka operant conditioning) is a learning process in which the consequences of an action determine the likelihood that the behaviour will be repeated in the future.

20
Q

What is the most important difference between classical conditioning and instrumental conditioning?

A

Only in instrumental conditioning can the subject’s actions control how events in the experiment occur.

21
Q

What is Thorndike’s Law of Effect?

A

Basic assumption: what an organism does is influenced by the immediate consequences that such behaviour has produced in the past.

Thorndike’s (1911) Law of Effect:
Given a particular situation, if an action is met with satisfaction, the organism will be more likely to make the same action the next time it finds itself in that situation.

22
Q

What is radical behaviourism? Who were the best known advocates?

A

Started with J.B. Watson, continued by B.F. Skinner.

  • Rejected anything unobservable (Introspection is shit!)
  • i.e. not interested in things you cannot see (cognition, thoughts, any internal processes that can only be inferred from behaviour, etc.)
  • Believed that all human psychology could be reduced to relationships between stimuli and responses.
24
Q

What are reinforcers?

A

Events (a stimulus following a response) that result in an increase in the likelihood of a particular behaviour. E.g. food for a cat pressing a pedal – 'satisfaction'.

25
Q

What is the difference between primary reinforcers vs. secondary reinforcers and social reinforcement?

A

Primary reinforcers = intrinsically valued; innately reinforcing; they satisfy biological needs; e.g. giving a dog food.

Secondary reinforcers = do not directly satisfy biological needs; acquired their reinforcing properties through experience; e.g. clicker with dog, money.

Social reinforcement = e.g. praise

26
Q

What is shaping?

A

A process of operant conditioning that involves reinforcing behaviours that are increasingly similar to the desired behaviour. A gradual process of reinforcing successive approximations, e.g. a pigeon turning around, rats pressing a bar, a dog opening a door.

Progressively worse, undesirable behaviours can also be shaped, e.g. by giving in to a child throwing a tantrum.

27
Q

What is the difference between reinforcement and punishment?

A
  • Punishment – decreases the likelihood that the response/behaviour will be repeated.
  • Reinforcement – increases the likelihood that the response/behaviour will be repeated.
28
Q

What is positive reinforcement? Give an example.

A
  • Positive reinforcement – providing a pleasurable stimulus (reward) to increase the probability of a behaviour being repeated.
  • E.g. giving a dog treats when it sits.
29
Q

What is negative reinforcement? Give an example.

A
  • Negative reinforcement – aka escape/avoid – removing an aversive stimulus to increase the probability of a behaviour being repeated.
  • E.g. nagging until they finally do the dishes – doing the dishes removes the aversive nagging.
30
Q

What is positive punishment? Give an example.

A
  • Positive punishment – giving a stimulus that decreases the probability of a behaviour recurring.
  • E.g. reprimand dog for doing bad thing.
31
Q

What is negative punishment? Give an example.

A
  • Negative punishment – aka omission – removal of a stimulus that decreases the probability of a behaviour recurring.
  • E.g. no pocket money because you did this bad thing.
32
Q

Define escape and avoid using the rat example.

A

Shuttle box, divided by a barrier; one half has an electrified grid floor, the other side is safe.
• Escape: Warning signal —> electric shock —> rat ESCAPES by jumping over to the safe side.
• Avoidance: Warning signal —> rat jumps over to the safe side before the shock —> AVOIDS the electric shock (and the response is sustained).
• Escape – turning off currently occurring aversive event.
• Avoidance – prevent aversive event from occurring.

33
Q

What are the different reinforcement schedules?

A
  • Fixed ratio
  • Variable ratio
  • Fixed interval
  • Variable interval
34
Q

What is a fixed ratio schedule? Provide an example.

A

Reinforcement occurs every N responses.
• i.e. the subject knows they will be ‘rewarded’ after a certain number of responses.

(e.g. piecework – knowing they will get $10 after making 5 garments)

35
Q

What is a variable ratio schedule? Provide an example.

A

Reinforcer occurs on average every N responses.
• i.e. don’t know when or after how many responses exactly they will be ‘rewarded’.

(e.g. door-to-door sales – with each knock, they don’t know whether or not they will succeed in making a sale)
(e.g. gambling – operators pay out small amounts often, but at an unpredictable rate, to keep responding/playing constant)

36
Q

What is a fixed interval schedule? Provide an example.

A

Reinforcer available for the first response after N seconds/minutes.
• i.e. the subject knows they will be ‘rewarded’ for responding after a certain amount of time has elapsed.

(e.g. “clockwatching” at work – checking the time every once in a while, then more and more often as the end of the interval approaches)

37
Q

What is a variable interval schedule? Provide an example.

A

Reinforcer available on average after N sec/min.
• i.e. don’t know how long to wait before they will be ‘rewarded’

(e.g. nagging)

38
Q

What is the difference between ratio and interval schedules? Between fixed and variable schedules? Between continuous and partial reinforcement?

A
  • Ratio – reinforcement is based on the number of times the behaviour occurs
  • Interval – reinforcement is based on a specific unit of time
  • Fixed – the response or time requirement is the same for every reinforcer
  • Variable – the requirement varies around an average, so reinforcement comes at different rates or times
  • Continuous – reinforce desired behaviour each time it occurs
  • Partial – reinforce desired behaviour intermittently
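
Below is a minimal sketch (illustrative only, not from the cards; the function name and the parameters n and t are hypothetical) of how each schedule sets the requirement that must be met before the next reinforcer is delivered.

```python
import random

def next_requirement(schedule, n=5, t=30):
    """Draw the requirement to be met before the next reinforcer.

    Ratio schedules ('FR', 'VR') count responses; interval schedules
    ('FI', 'VI') count elapsed seconds, after which the next response
    is reinforced. n and t are illustrative values.
    """
    if schedule == 'FR':   # fixed ratio: exactly every n responses
        return ('responses', n)
    if schedule == 'VR':   # variable ratio: on average every n responses
        return ('responses', random.randint(1, 2 * n - 1))
    if schedule == 'FI':   # fixed interval: first response after t seconds
        return ('seconds', t)
    if schedule == 'VI':   # variable interval: first response after ~t seconds on average
        return ('seconds', random.uniform(1, 2 * t - 1))
    raise ValueError(schedule)
```

The 'variable' schedules redraw the requirement after every reinforcer, which is what makes the next reward unpredictable from the subject's point of view.
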
39
Q

What is stimulus control?

A

The stimuli in the environment control our behaviour.
i.e. instrumental behaviours are “controlled” by stimuli with which they are associated.

E.g. PECK or TURN – a pigeon changes what it is doing depending on which stimulus is presented. It learns which stimulus signals ‘peck’ and which signals ‘turn’.

40
Q

What is generalisation?

A

The extent to which behaviour transfers to a new stimulus.

i.e. when stimuli that are similar but not identical to the conditioned stimulus produce the conditioned response.

41
Q

What is a discriminative stimulus?

A

Any environmental cue that informs you about how you should act.

42
Q

What is discriminative learning?

A

Learning to differentiate between two similar stimuli when one is consistently associated with the unconditioned stimulus (or reinforcement) and the other is not. It can be done through training with different schedules of reinforcement. At first the subject is unable to differentiate, but over time it makes the connection and discriminates.

43
Q

How does stimulus control link to Thorndike’s Law of Effect? S-R learning?

A

According to Thorndike, satisfying outcomes strengthen connections between the stimuli/context and the response. This leads to formation of habits.

S = stimuli; R = response.

44
Q

What is Skinner’s Tripartite Contingency?

A

Essentially, the relationship between stimuli, responses and reinforcers.

A = Antecedent = stimulus controlling behaviour (Sd – discriminative stimulus)
B = Behaviour = the response (R)
C = Consequence = reinforcing stimulus (Sr or Rft)
45
Q

What is discrimination?

A

The extent to which behaviour DOES NOT transfer to a new stimulus.

46
Q

What did Watson & Rayner (1920) do to Little Albert? What did it show?

A

Wanted to test the generalisation of learned fear in an infant. Watson gave Little Albert a white rat (CS) to play with. Behind him, Watson made a loud clanging noise (US) every time he touched the rat. This elicited fear and shock – it upset Albert and made him cry (UR). He became conditioned to fear the white rat (CR). But this fear generalised to other animals and similar objects – a fear of furry things, even the cotton wool on a Santa mask.

47
Q

What did Rachman (1966) do to test acquisition of a sexual fetish by classical conditioning?

A

Male colleagues were shown black leather boots (CS) paired with pornography (US), and physiological sexual arousal was measured (UR; CR).

Results: the fetish could be acquired and extinguished, and showed spontaneous recovery. It also generalised to similar black low/high-heeled shoes.

48
Q

What does the generalisation gradient show?

A

That generalisation is highest for physically similar stimuli. As test and conditioned stimuli share less in common, generalisation decreases.
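
One illustrative way to draw such a gradient (an assumption for illustration, not a formula from the card) is a bell-shaped fall-off of responding with distance from the trained CS value:

```latex
R(s) = R_{\mathrm{CS}} \, e^{-(s - s_{\mathrm{CS}})^2 / 2\sigma^2}
```

Here s is the test stimulus on some physical dimension (e.g. tone pitch), s_CS is the trained value, and σ sets how quickly responding drops as the test stimulus becomes less similar to the CS.
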

49
Q

How did Marks & Gelder (1967) remove sexual fetish by aversive conditioning?

A

Fetish objects (CS) were paired with electric shock (US); imagined fetish objects (CS) were also paired with real electric shock (US). Shocks were given 75% of the time.

Results: the counter-conditioning did work to remove the fetish, but after 5 weeks there was a strong relapse.

50
Q

What is social learning?

A
  • Changing own behaviour after observing behaviour of others
  • Acquiring new/altered behaviours through observation of others’ actions and their consequences

As opposed to instrumental and classical conditioning, where behaviour changes through direct experience.

51
Q

What other social processes affect learning (but are not considered social learning)?

A

Social facilitation ≠ social learning.

Goal enhancement = getting access to a wanted goal might facilitate later trial-and-error learning.

Stimulus enhancement = observing others makes one more likely to approach the places where they are.

Increased motivation to act = trying more new things in the company of friends/parents.

52
Q

How can classical conditioning occur by observation?

A

Behaviour of others acts as US that supports classical conditioning.

E.g. a lab-raised monkey (observer – UR/CR) with no innate fear of snakes (CS) sees a wild monkey (performer – its fear display acts as the US) reacting fearfully to a snake; the observer acquires a fear of snakes.

53
Q

What did Cook & Mineka (1991) discover from studying rhesus monkeys?

A

Tested the behaviour of the observer monkeys in the presence of the CSs and similar toys. At follow-up, the fear was still specific to snake-like stimuli. This suggests a biological preparedness to learn some associations but not others: fear of flowers could not be conditioned, but fear of snakes could.

54
Q

What is mimicry?

A

Social (instrumental) learning – least cognitively sophisticated.

  • Copying WITHOUT reference to a goal
  • E.g. young babies don’t really understand reward but will still copy – e.g. babies copying a sneeze
55
Q

What is emulation?

A

Social (instrumental) learning.

  • Understanding that there is a goal but not using the same method to gain access to the goal
  • E.g. chimpanzees obtaining food with a rake – they grasp the goal but not the connection between the method and the reward, so they couldn’t use the rake properly
56
Q

What is imitation? How can it be measured?

A

Social (instrumental) learning – most cognitively sophisticated.

  • Copying WITH reference to a goal — understanding that this action leads to this reward;
  • Two-action test – there are two equally effective methods of performing one action; if observers copy the specific method they watched, this suggests imitation.
57
Q

What is modeling?

A

Similar to imitation, except it is not only copying a specific behaviour but adopting the model’s general style of behaviour (e.g. aggressive vs. gentle play).

Coined by Bandura.

58
Q

What did Bandura (1965) test with the bobo doll and what were the results?

A

Looked at the influence of reinforcement on modeling. Children observed a model on TV who was either rewarded, punished, or received no consequences.

Results: modeling is reinforcement dependent. Modeling can occur through TV, not just in person.

59
Q

What is Bandura’s social cognitive theory?

A
  1. Attention to model
  2. Remember model’s action
  3. Ability to reproduce action
  4. Motivation to reproduce action
60
Q

What affects conditioning (4)?

A
  • Frequency
  • Salience
  • Contiguity
  • Contingency
62
Q

Define frequency. How does it affect conditioning?

A

Frequency: number of event pairings.

More event pairings —> more learning (until it hits a maximum limit and the curve becomes asymptotic).
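
A simple way to picture “more pairings, up to a maximum” (an illustrative formula, not one given in the card): if each pairing closes a fixed fraction α of the remaining gap to the maximum λ, then after n pairings

```latex
V_n = \lambda \left(1 - (1 - \alpha)^n\right), \qquad 0 < \alpha < 1
```

so early pairings add a lot of learning and later pairings add less and less as V_n approaches the asymptote λ.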

63
Q

Define salience. How does it affect conditioning?

A

Salience (aka intensity): how salient/noticeable the CS/Sd and the US/reinforcer are.

More intense CS/Sd —> faster learning.

More intense US/Rft —> greater amount of learning.

64
Q

Define contiguity. How does it affect conditioning?

A

Refers to timing – time between onset of CS and US.

Closer together —> better learning.

Need inter-stimulus interval (ISI) = time between ‘CS on’ and ‘US on’. Simultaneous conditioning – not effective due to competing attention.

65
Q

Define contingency. How does it affect conditioning?

A

Whether or not the signal/cue/action (CS) changes the probability of the US: how likely is the US to follow the CS (e.g. how likely is pain after touching the flower?), compared with the probability that the US occurs anyway (pain all the time?).

Higher contingency —> better learning.
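
The two questions in the card (how likely is the US after the CS vs. how likely is it anyway?) are commonly summarised as a contingency difference; as an illustration (the card itself does not give a formula):

```latex
\Delta P = P(\text{US} \mid \text{CS}) - P(\text{US} \mid \text{no CS})
```

A ΔP well above zero means the CS genuinely signals the US and supports good learning; a ΔP near zero means the US is just as likely without the CS, so there is little contingency to learn.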

67
Q

What is learning?

A

An enduring (relatively stable) change within an organism as a result of experience (practice, previous trials, history).

68
Q

How can learning and performance differ?

A

Learning ≠ performance.

Performance is affected by learning but is also impacted by opportunity, motivation, sensory/motor capabilities. Change in performance does not always reflect changes in learning.

69
Q

What is not learning?

A

LEARNING IS NOT:
• Reflexes – changes in behaviour that are innate, not from experience
• Instincts – changes in behaviour that are also genetic; more complex than reflexes
• Maturation – changes in behaviour because of ageing
• Fatigue – changes in behaviour that are not stable

70
Q

What are reflexes?

A

Reflexes are automatic and usually very fast (because they involve the fewest neurons of any type of behaviour). Learning is not required; they simply involve an eliciting stimulus and its corresponding response.

E.g. rooting reflex – in breastfeeding, a stimulus around the face makes the baby turn its head.

E.g. Moro reflex – the baby’s response to feeling itself falling.

71
Q

What are instincts?

A

Instincts are behavioural sequences made up of units which are largely genetically determined. Typical of members of a species. Learning is not required. More complex than reflexes.

E.g. Mating rituals

72
Q

What is maturation?

A

Changes that take place in the body or in behaviour because of getting older. Learning is not required.

E.g. ‘learning’ to walk – the child gets older and develops physically enough to be able to walk.

73
Q

What is fatigue?

A

A transient state of discomfort and loss of efficiency because of emotional strain, tiredness, boredom, etc. It leads to a physical inability to perform a learned response – it is NOT evidence for a lack of learning.

74
Q

What are the two phenomena that do count as examples of learning?

A

Habituation = decreased responding produced by repeated stimulation

  • (e.g. a rat jumping less the more often it hears the same loud noise)
  • not fatigue / not sensory adaptation (sense organs becoming temporarily insensitive to stimulation)

Sensitisation = increased responding produced by repeated stimulation
- (e.g. rats pre-exposed to cocaine run more in response to the same dose than rats not pre-exposed)

Both help us to organise/focus our behaviour – choosing what to ignore and what to respond to.

75
Q

What is motivation?

A

Why individuals initiate, choose or persist in specific actions in specific circumstances.

  • necessary condition of behaviour
  • has an energising effect on behaviour
  • temporary state that can vary over time (i.e. different from learning)
76
Q

What is Hebb’s analogy for motivation, behaviour and learning?

A

Compares it to driving a car. Engine = motivation; it provides the power.
Steering = innate/learned mechanisms; it determines the direction.
Both lead to the movement of the car = the behaviour of the individual.

77
Q

What is a fixed action pattern?

A

The same behaviour is displayed by all members of the species in response to the same stimulus.

— Set sequence of behaviours, NOT reflex, more complex than reflex
— Elicited by combo of environmental and biological circumstances (breeding season, development, nesting).

78
Q

Explain sign stimuli and supernormal stimuli using Tinbergen’s stickleback fish and nesting birds as an example.

A

• Sign stimuli – a cue/sign that triggers/initiates a fixed action pattern.
- E.g. Tinbergen’s experiment on stickleback fish – red bellies elicit aggressive, stereotyped behaviour in male sticklebacks.

• Supernormal stimuli – exaggerated features elicit stronger responses.
- E.g. nesting birds – even if a fake egg is ridiculously massive, as long as it still looks like an egg the bird will abandon its other eggs and sit on it, because the exaggerated features elicit a stronger response.

79
Q

What are drives?

A

A drive is some form of internal system that organises behaviour towards one goal.

E.g. drives for food, or for the admiration of others.

80
Q

How do drives motivate behaviour (according to Clark Hull)?

A

Essentially, reinforcement = drive reduction.

The organism suffers deprivation —> deprivation produces needs (to maintain homeostasis) —> needs activate drives —> drives activate behaviour —> the form of the behaviour is determined by learning —> the reduction of the drive is reinforcing.

81
Q

How do drives relate to habit formation?

A

A behaviour that reduces drive will be reinforced and associated with the situation (Sd): S-R “habit”. A learned habit is performed without consideration of the value of the reinforcer. Drive “energises” habits.

82
Q

What types of biological sources of motivation?

A
  • Proximal — maintaining homeostasis; facilitating survival/safety of organism.
  • Distal — maintaining reproductive success; facilitating survival of species.
83
Q

What are the limitations of drive theories?

A
  • Drive reduction is not necessary for reinforcement – e.g. artificial sweeteners reinforce without reducing any biological need.
  • Stimulating a drive can be reinforcing – something can still reinforce without satisfying a need.
  • Ignores the role of qualitative differences between reinforcers (e.g. liking different flavours – drive reduction doesn’t explain our personal preferences)
84
Q

What is incentive motivation?

A

Incentive motivation focuses on rewards “attracting”/pulling the subject’s behaviour (as opposed to drive theory, which is about reducing a negative internal state).

Requires understanding of incentive value beyond liking, biological needs, current arousal, preferred activity.

85
Q

What is delayed reward discounting?

A

Delay discounting (delayed gratification of reward) – the value of a reward is linked to how long you are expected to wait for it; a cognitive property.

Rewards display economic principles:
— value of rewards decreases with delay
— choices sensitive to financial needs
— correlated with impulsivity & disorders of abuse

Remember: marshmallow kids.
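
One standard way to express “the value of rewards decreases with delay” (an illustrative model; the card does not commit to a particular function) is hyperbolic discounting:

```latex
V = \frac{A}{1 + kD}
```

where A is the reward amount, D the delay, and k an individual discounting rate; a larger k means steeper discounting, the kind of pattern associated with impulsivity.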

86
Q

What are projection tests?

A

Pioneered by Henry Murray (1938), projection tests are used to study people’s needs and goals, e.g. the Rorschach inkblot test or the Thematic Apperception Test (TAT).

People are asked to describe an ambiguous image, and the description is analysed for themes. Assumption: people’s preoccupations, needs, drives and goals are projected into their interpretations.

87
Q

What are the long-term human needs?

A

Intrinsic sources of motivation — long-term human needs:

nAch — achievement
nPower — acquisition of power
nApproval — acquisition of others’ approval + respect
nAffiliation — acquisition of others’ love and support