Lec. 14 (operant conditioning) Flashcards

1
Q

classical or operant?
organism learning associations between events it does NOT CONTROL (ex: tone, salivation reflex)

A

classical conditioning

2
Q

classical or operant?
organism learning associations between its OWN behavior and RESULTING events

A

operant conditioning

3
Q

the strengthening of behaviors through CONSEQUENCES (ex: flipping a light switch on and the lights turning on)

A

operant conditioning

4
Q

BF SKINNER =

A

operant conditioning

5
Q

operant conditioning started with who?

A

Thorndike

6
Q

put cats in boxes and determined how they learned – PUZZLE BOXES

A

Thorndike

7
Q

Thorndike + BF Skinner were both ________

A

behaviorists

8
Q

focused on learning and OBSERVING animals

A

behaviorists

9
Q

if a response made to a particular stimulus is followed by satisfaction, that response is more likely to occur the next time the stimulus is present

A

Thorndike’s Law of Effect

10
Q

principle Thorndike explained through his Law of Effect

A

instrumental conditioning

11
Q

Thorndike used ___________ to explain instrumental conditioning

A

puzzle boxes

12
Q

Skinner extended Thorndike's law of effect/instrumental conditioning by saying that an organism learns a response by ________ on the environment

A

operating

13
Q

says “consequences shape behavior”

A

operant conditioning

14
Q

using Thorndike's law of effect as a starting point, Skinner developed the ________ to study operant conditioning

A

Skinner box (rats in boxes)

15
Q

Skinner boxes were also called what?

A

operant chambers

16
Q

Thorndike =
Skinner =

A
  • cats
  • rats
17
Q

a response/behavior that has some effect on the world

A

operant

18
Q

a stimulus event that INCREASES the probability that the operant behavior will occur again

A

reinforcer

19
Q

T/F: reinforcer = punishment

A

false

20
Q

a PLEASANT stimulus that, when given following a response, strengthens that response

A

positive reinforcer

21
Q

an UNPLEASANT stimulus that – if REMOVED – strengthens the response that removes the stimulus (something bad gets taken away)

A

negative reinforcer

22
Q

both positive and negative reinforcers ________ responses

A

STRENGTHEN

23
Q

reinforcements will always _______ the likelihood that the operant will occur again

A

increase

24
Q

getting a hug; receiving a paycheck =

A

positive reinforcement

25
Q

fastening seatbelt to turn off beeping sound =

A

negative reinforcement

26
Q

TYPES of reinforcers (2):

A
  • primary
  • secondary
27
Q

type of reinforcer: events or stimuli that satisfy needs basic to SURVIVAL (ex: food, water, shelter – candy/calories)

A

primary

28
Q

type of reinforcer: rewards that people or animals LEARN to like (ex: money for adults, praise)

A

secondary

29
Q

secondary reinforcers are sometimes called what?

A

“conditioned reinforcers”

30
Q

TIMING of reinforcers (2):

A
  • immediate
  • delayed
31
Q

timing of reinforcer: rat gets food after pressing a button

A

immediate

32
Q

timing of reinforcer: paycheck arrives after two weeks; effect may be WEAKENED

A

delayed

33
Q

with delayed reinforcers, the effect may be ______

A

weakened

34
Q

process of reinforcing successive approximations to the target behavior (each closer approximation of the desired behavior is reinforced, while behaviors that are not approximations of it are not reinforced)

A

shaping

35
Q

how OFTEN you provide reinforcement

A

schedules of reinforcement

36
Q

types of schedules of reinforcement (2):

A
  • continuous
  • partial/intermittent
37
Q

type of schedule of reinforcement: reinforcer is delivered EVERY time a particular response occurs

A

continuous

38
Q

type of schedule of reinforcement: reinforcement is given only some of the time

A

partial/intermittent

39
Q

TYPES of PARTIAL reinforcement schedules (2):

A

1) response-based
2) time-based

40
Q

type of partial reinforcement: reinforcement based on number of desired behaviors

A

response-based

41
Q

type of partial reinforcement: reinforcement based on TIME

A

time-based

43
Q

TYPES of response-based partial reinforcement (2):

A
  • fixed ratio (FR)
  • variable ratio (VR)
44
Q

type of response-based partial reinforcement: fixed number of responses required for reinforcement

A

fixed ratio (FR)

45
Q

type of response-based partial reinforcement: number of responses required for reinforcement varies around an average

A

variable ratio (VR)

46
Q

TYPES of time-based partial reinforcement schedules (2):

A
  • fixed interval (FI)
  • variable interval (VI)
47
Q

type of time-based partial reinforcement: a fixed amount of time must elapse (is predictable) before the next opportunity for reinforcement

A

fixed interval (FI)

48
Q

type of time-based partial reinforcement: time interval that must elapse before next opportunity for reinforcement varies/is unpredictable

A

variable interval (VI)

49
Q

partial reinforcement schedule ex: free coffee after 10 visits; “10th caller” in a radio contest

A

FR (fixed ratio)

50
Q

partial reinforcement schedule ex: lottery, gambling, slot machines

A

VR (variable ratio)

51
Q

partial reinforcement schedule ex: UPS delivery of your new gadget, studying for an upcoming test

A

FI (fixed interval)

52
Q

partial reinforcement schedule ex: email “ding,” an unexpected pop quiz

A

VI (variable interval)
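
The four partial schedules above boil down to simple decision rules. Below is a minimal sketch (not from the lecture; the function names and the example values such as FR-10 and the 30-second intervals are hypothetical illustrations), assuming the only things tracked are the responses or seconds since the last reinforcer.

import random

# FR-n: reinforce after a fixed number of responses
# (e.g., free coffee after 10 visits).
def fixed_ratio(responses_since_reward, n=10):
    return responses_since_reward >= n

# VR-n: each response pays off with probability 1/n, so the number of
# responses required varies around an average of n (e.g., a slot machine).
def variable_ratio(average_n=10):
    return random.random() < 1 / average_n

# FI: the first response after a fixed, predictable interval is reinforced.
def fixed_interval(seconds_since_reward, interval=30.0):
    return seconds_since_reward >= interval

# VI: the required wait varies unpredictably around an average; for simplicity,
# a new random threshold is drawn on each check (e.g., an email “ding”).
def variable_interval(seconds_since_reward, average_interval=30.0):
    return seconds_since_reward >= random.uniform(0, 2 * average_interval)

# Example: on an FR-10 schedule, the 10th response earns the reinforcer.
print(fixed_ratio(responses_since_reward=10))  # True
print(fixed_ratio(responses_since_reward=7))   # False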

53
Q

the presentation of an AVERSIVE stimulus or the REMOVAL of a pleasant one following some behavior; always results in a DECREASE in the frequency of a response

A

punishment

54
Q

______ always results in an INCREASE in freq. of a response and _______ results in a DECREASE in freq. of a response

A

reinforcer; punishment

55
Q

negative reinforcement always ________ behavior and punishment always ________ behavior

A

strengthens; weakens

56
Q

2 ways to punish / decrease behavior…

A
  • administer an aversive stimulus (spanking; parking ticket)
  • withdraw a desirable stimulus (time out from privileges; revoked driver’s license)
57
Q

T/F: in general, reinforcers are much better at changing behavior than punishments

A

true

58
Q

drawbacks of punishment (4):

A
  • does not “erase” an undesirable habit, merely suppresses it
  • must be given IMMEDIATELY after the undesirable behavior
  • can become aggression, even abuse, when given in anger
  • signals what behavior is inappropriate but does not specify a correct alternative behavior
59
Q

challenge to the behavioral view of classical and operant conditioning; argues that learning may result not only from automatic associations but also from MENTAL PROCESSES; says learning is more than just associations, reinforcements, and punishments

A

cognitive processes (in learning)

60
Q

in the cognitive map experiments, the big change in learning for rats that were given reinforcement (cheese) after trial 11 displays what?

A

latent learning

61
Q

the subconscious retention of information without reinforcement or motivation; the change in behavior appears only later, when there is sufficient motivation to act on the subconsciously retained information

A

latent learning

62
Q

__________ said latent learning was impossible since it occurs in the mind and could not be studied

A

behaviorists

63
Q

the discovery of latent learning was the beginning of the end of _____________ since it clearly had limits; psychologists began studying the MIND and INTRINSIC motivation

A

operant conditioning

64
Q

the desire to perform a behavior for its own sake (ex: to be successful)

A

intrinsic motivation

65
Q

the desire to perform a behavior due to promised rewards or threats of punishments (ex: money, time-out)

A

extrinsic motivation

66
Q

higher animals, especially humans, learn through ________ and ________ others

A

observing + imitating

67
Q

learning by OBSERVATION begins as early as ________ in children (ex: imitating an adult on TV pulling a toy apart)

A

14 months

68
Q

showed that children in elementary school who are exposed to violent TV, videos, and video games express increased aggression

A

Gentile et al.

69
Q

T/F: violent TV and video games do not DIRECTLY cause violence in children, but they do not help (ex: children may imitate it)

A

true

70
Q

T/F: research shows that viewing media violence leads to an increased expression of aggression

A

true