Lec. 14 (operant conditioning) Flashcards

1
Q

classical or operant?
organism learning associations between events it does NOT CONTROL (tone, salivating reflex)

A

classical conditioning

2
Q

classical or operant?
organism learning associations between its OWN behavior and RESULTING events

A

operant conditioning

3
Q

the strengthening of behaviors through CONSEQUENCES (ex: flipping a light switch and the lights turning on)

A

operant conditioning

4
Q

B.F. SKINNER =

A

operant conditioning

5
Q

operant conditioning started with whom?

A

Thorndike

6
Q

put cats in boxes and determined how they learned – PUZZLE BOXES

A

Thorndike

7
Q

Thorndike + B.F. Skinner were both ________

A

behaviorists

8
Q

focused on learning and on OBSERVING animal behavior

A

behaviorists

9
Q

if a response made to a particular stimulus is followed by satisfaction, that response is more likely to occur the next time the stimulus is present

A

Thorndike’s Law of Effect

10
Q

principle Thorndike explained through his Law of Effect

A

instrumental conditioning

11
Q

Thorndike used ___________ to explain instrumental conditioning

A

puzzle boxes

12
Q

Skinner extended Thorndike's law of effect/instrumental conditioning by saying that an organism learns a response by ________ on the environment

A

operating

13
Q

says “consequences shape behavior”

A

operant conditioning

14
Q

using Thorndike's law of effect as a starting point, Skinner developed the ________ to study operant conditioning

A

Skinner box (rats in boxes)

15
Q

Skinner boxes were also called what?

A

operant chambers

16
Q

Thorndike =
Skinner =

A
  • cats
  • rats
17
Q

a response/behavior that has some effect on the world

A

operant

18
Q

a stimulus event that INCREASES the probability that the operant behavior will occur again

A

reinforcer

19
Q

T/F: reinforcer = punishment

A

false

20
Q

a PLEASANT stimulus that, when given following a response, strengthens that response

A

positive reinforcer

21
Q

an UNPLEASANT stimulus that – if REMOVED – strengthens the response that removes the stimulus (something bad gets taken away)

A

negative reinforcer

22
Q

both positive and negative reinforcers ________ responses

A

STRENGTHEN

23
Q

reinforcements will always _______ the likelihood that the operant will occur again

A

increase

24
Q

getting a hug; receiving a paycheck =

A

positive reinforcement

25
fastening seatbelt to turn off beeping sound =
negative reinforcement
26
TYPES of reinforcers (2):
- primary - secondary
27
type of reinforcer: events or stimuli that satisfy needs basic to SURVIVAL (ex: food, water, shelter -- candy/calories)
primary
28
type of reinforcer: rewards that people or animals LEARN to like (ex: money for adults, praise)
secondary
29
secondary reinforcers are sometimes called what?
"conditioned reinforcers"
30
TIMING of reinforcers (2):
- immediate - delayed
31
timing of reinforcer: rat gets food after pressing a button
immediate
32
timing of reinforcer: paycheck arrives after two weeks; effect may be WEAKENED
delayed
33
with delayed reinforcers, the effect may be ______
weakened
34
process of reinforcing successive approximations to the target behavior (each closer approximation of the desired behavior is reinforced, while behaviors that are not approximations of the desired behavior are not)
shaping
35
how OFTEN you provide reinforcement
schedules of reinforcement
36
types of schedules of reinforcement (2):
- continuous - partial/intermittent
37
type of schedule of reinforcement: reinforcer is delivered EVERY time a particular response occurs
continuous
38
type of schedule of reinforcement: reinforcement is given only some of the time
partial/intermittent
39
TYPES of PARTIAL reinforcement schedules (2):
1) response-based 2) time-based
40
type of partial reinforcement: reinforcement based on number of desired behaviors
response-based
41
type of partial reinforcement: reinforcement based on TIME
time-based
42
TYPES of response-based partial reinforcement (2):
- fixed ratio (FR) - variable ratio (VR)
44
type of response-based partial reinforcement: fixed number of responses required for reinforcement
fixed ratio (FR)
45
type of response-based partial reinforcement: number of responses required for reinforcement varies around an average
variable ratio
46
TYPES of time-based partial reinforcement schedules (2):
- fixed interval (FI) - variable interval (VI)
47
type of time-based partial reinforcement: a fixed amount of time must elapse (is predictable) before the next opportunity for reinforcement
fixed interval (FI)
48
type of time-based partial reinforcement: time interval that must elapse before next opportunity for reinforcement varies/is unpredictable
variable interval (VI)
49
partial reinforcement schedule ex: free coffee after 10 visits; "10th caller" in a radio contest
FR (fixed ratio)
50
partial reinforcement schedule ex: lottery, gambling, slot machines
VR (variable ratio)
51
partial reinforcement schedule ex: UPS delivery of your new gadget, studying for an upcoming test
FI (fixed interval)
52
partial reinforcement schedule ex: email "ding," an unexpected pop quiz
VI (variable interval)
53
the presentation of an AVERSIVE stimulus or the REMOVAL of a pleasant one following some behavior; always results in the DECREASE in the frequency of a response
punishment
54
______ always results in an INCREASE in freq. of a response and _______ results in a DECREASE in freq. of a response
reinforcer; punishment
55
negative reinforcement always ________ behavior and punishment always ________ behavior
strengthens; weakens
56
2 ways to punish / decrease behavior...
- administer an aversive stimulus (spanking; parking ticket)
- withdraw a desirable stimulus (time out from privileges; revoked driver's license)
57
T/F: in general, reinforcers are much better at changing behavior than punishments
true
58
drawbacks of punishment (4):
- does not "erase" an undesirable habit, merely suppresses it
- must be given IMMEDIATELY after the undesirable behavior
- can become aggression, even abuse, when given in anger
- signals which behavior is inappropriate but does not specify the correct alternative behavior
59
challenge the behavioral view of classical and operant conditioning; argue that learning results not only from automatic associations but also from MENTAL PROCESSES; say learning is more than just associations, reinforcements, and punishments
cognitive processes (in learning)
60
in the cognitive map experiments, the big change in learning for rats who were given reinforcement (cheese) after trial 11 demonstrates what?
latent learning
61
the subconscious retention of information without reinforcement or motivation; behavior changes only later, when there is sufficient motivation to act on the subconsciously retained information
latent learning
62
__________ said latent learning was impossible since it occurs in the mind and you couldn't study it
behaviorists
63
the discovery of latent learning was the beginning of the end of _____________ since it obviously had limits; researchers began studying the MIND and INTRINSIC motivation
operant conditioning
64
the desire to perform a behavior for its own sake (ex: to be successful)
intrinsic motivation
65
the desire to perform a behavior due to promised rewards or threats of punishments (ex: money, time-out)
extrinsic motivation
66
higher animals, especially humans, learn through ________ and ________ others
observing + imitating
67
learning by OBSERVATION begins as early as ________ in children (ex: imitating an adult on TV pulling a toy apart)
14 months
68
showed that children in elementary school who are exposed to violent TV, videos, and video games express increased aggression
Gentile et al.
69
T/F: violent TV and video games do not DIRECTLY cause violence in children, but they do not help (ex: children may imitate it)
true
70
T/F: research shows that viewing media violence leads to an increased expression of aggression
true