Chapter 6: Learning Flashcards

Sections covered: 1.4, 6.1 - 6.13

1
Q

Three types of learning

A

nonassociative; associative; observational

2
Q

Nonassociative

A

learning about a stimulus in the external world
1. Habituation: behavioural response to a stimulus decreases after repeated exposure
2. Sensitization: behavioural response to a stimulus increases after exposure (often to a strong or threatening stimulus)

3
Q

Associative

A

learning the relationship between two pieces of information
1. Classical conditioning: when we learn that a stimulus predicts another stimulus
2. Operant conditioning: when we learn that a behaviour leads to a certain outcome

4
Q

Observational

A

learning by watching how others behave

  1. Modeling: imitating a behaviour seen in others
  2. Vicarious learning: learning whether or not to engage in a behaviour after seeing others rewarded or punished for that action
5
Q

Classical Conditioning

A

type of learning in which a neutral stimulus acquires the capacity to evoke a response that was originally evoked by another stimulus
- first described and demonstrated by Ivan Pavlov in 1903

6
Q

Law of Effect

A

any behaviour that leads to a “satisfying state of affairs” is likely to occur again, and any behaviour that leads to an “annoying state of affairs” is less likely to occur again

  • the likelihood of the occurrence of a behaviour is influenced by its consequences
7
Q

Operant Conditioning

A

voluntary responses come to be controlled by their consequences
- B.F. Skinner developed the theory of operant conditioning and invented the operant chamber

8
Q

For punishment to be effective, we need to consider:

A

timing
intensity
consistency

9
Q

Schedules of Reinforcement:
continuous
fixed ratio
variable ratio
fixed interval
variable interval

explain what each means

A
  • continuous: every response is reinforced
  • fixed ratio: a fixed number of responses must be made before reinforcement
  • variable ratio: a random number of responses must be made before reinforcement
  • fixed interval: the first response after a fixed period of time has elapsed is reinforced
  • variable interval: the first response after a random period of time has elapsed is reinforced

What does each look like on a graph?

10
Q

Extinction

A

in operant conditioning, extinction refers to the gradual weakening and disappearance of a response tendency because the response is no longer followed by reinforcement

11
Q

Primary Reinforcers

A

things that are rewarding in themselves (innately satisfying and unlearned; biologically based)

12
Q

Conditioned/Secondary Reinforcer

A

stimulus that gains its reinforcing power through association with a primary reinforcer
- things we have learned to value; not biologically beneficial in themselves

13
Q

Shaping

A

the reinforcement of closer and closer approximations of a desired response

14
Q

Dunning-Kruger Effect

A
  • form of illusory superiority
  • people lack the ability to evaluate their own performance in areas where they have little expertise
15
Q

Illusory Superiority

A

a form of false confidence in which we believe that we are above average in just about everything
