Chapter 5 Flashcards

1
Q

What are the two types of conditioning

A

Classical- learning to link two stimuli in a way that helps us anticipate an event to which we already have a reaction (one of the stimuli must naturally produce the response, like food producing salivation)
Operant- changing behavior choices in response to consequences

2
Q

What are the components of classical conditioning

A

UR- Unconditioned response (the natural, unlearned response)
US- Unconditioned stimulus (causes the response no matter what)
NS- Neutral stimulus (a stimulus that does not yet produce the response)
CS- Conditioned stimulus (an NS that has been conditioned to produce a response like the US does)
CR- Conditioned response (the response to the CS; after conditioning, the UR becomes the CR)

3
Q

After conditioning, the UR and NS ______

A

Become the CR and CS, respectively; the US doesn’t change

4
Q

What is social learning theory

A

A gender role theory proposed by Albert Bandura that suggests gender roles are learned through punishment, reinforcement, and modeling

5
Q

What is observational learning

A

Learning by observing the behavior of others (a component of social learning theory)

6
Q

What are social models

A

People who are typically of higher status or authority compared to the observer

7
Q

What are the four components that occur in observational learning, as proposed by Albert Bandura

A

Attention, retention, initiation, and motivation

8
Q

What is vicarious reinforcement

A

Learning that occurs by observing the reinforcement or punishment of another person

9
Q

What is blocking

A

Occurs in classical conditioning: adding a new CS to an already established CS-CR pairing has no effect, because the US is already predicted. (Suggests that information, surprise value, and/or prediction error is important in conditioning.)
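A minimal simulation sketch of blocking, assuming the Rescorla-Wagner learning rule (the card doesn’t name a specific model, so the rule, names, and parameter values below are illustrative assumptions only):

# Illustrative sketch: Rescorla-Wagner update dV = alpha * beta * (lambda - total V),
# used here only to show why a newly added CS gains almost no associative strength.
ALPHA = 0.3    # assumed salience of each CS
BETA = 1.0     # assumed learning rate set by the US
LAMBDA = 1.0   # maximum associative strength the US can support

V = {"light": 0.0, "tone": 0.0}   # associative strength of each CS

# Phase 1: light alone is repeatedly paired with the US.
for _ in range(30):
    error = LAMBDA - V["light"]                # prediction error
    V["light"] += ALPHA * BETA * error

# Phase 2: a light + tone compound is paired with the same US.
for _ in range(30):
    error = LAMBDA - (V["light"] + V["tone"])  # error uses the total prediction
    V["light"] += ALPHA * BETA * error
    V["tone"] += ALPHA * BETA * error

print(V)  # the tone stays near 0: the US was already fully predicted,
          # so there was no surprise left for the tone to explain (blocking)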

10
Q

What is prediction error

A

The discrepancy between the outcome a CS predicts and the outcome that actually occurs; conditioning proceeds only when the outcome is surprising (the error is nonzero)
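In the Rescorla-Wagner model (an assumed formalization, not named on the card), the prediction error on a trial is written as

\[ \text{error} = \lambda - \Sigma V \]

where \(\lambda\) is the outcome the US actually supports and \(\Sigma V\) is the outcome predicted by all CSs present. Learning on the trial is proportional to this error, \(\Delta V = \alpha\beta(\lambda - \Sigma V)\), so zero error means no new learning, which is what produces blocking.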

11
Q

What is Stimulus control

A

When an operant behavior is controlled by a stimulus that precedes it. (Ex. waiting at a traffic light: you know that green means go, but you only turn when you have a green arrow.)

12
Q

What is a discriminative stimulus

A

In operant conditioning, a stimulus that signals whether the response will be reinforced; it “sets the occasion” for the operant response. (Ex. putting a canvas in front of an artist doesn’t elicit painting behavior; instead it sets the occasion for, or allows, painting to occur.)

13
Q

What is extinction

A

Refers to the diminishing (not erasing) of a CR when the CS is presented repeatedly without the US

14
Q

What is spontaneous recovery

A

The return of a CR despite the lack of further conditioning (occurs after extinction)

15
Q

What is renewal effect

A

Recovery of an extinguished response that occurs when the context changes (like location) after extinction. If the CS is tested in a new context, the CR can return.

16
Q

What is Thorndike’s law of effect

A

Behaviors followed by favorable consequences become more likely, and behaviors followed by unfavorable consequences become less likely. (Edward Thorndike discovered this using cats placed in a puzzle box.)

17
Q

What are reinforcements

A

Feedback from the environment that makes a behavior more likely to occur again
(Positive reinforcement- adds something desirable, like praise)
(Negative reinforcement- ends something unpleasant, like an annoying noise)

18
Q

Who is B.F. Skinner

A

In regard to operant conditioning, he experimented with the effect of giving reinforcements in different patterns or “schedules” to determine what led to the best results.

19
Q

What are the two types of reinforcers

A

Primary reinforcer- a stimulus that meets a basic need or is otherwise intrinsically desirable (food, sex, fun, attention, etc.)
Secondary reinforcer- a stimulus that has become associated with a primary reinforcer, like money (you need money to buy food)

20
Q

What is the difference between continuous and partial/intermittent reinforcement

A

Continuous- reinforcing every time; produces faster learning, but the behavior is less likely to be maintained
Partial/intermittent- reinforcing only some of the time; produces slower learning, but the behavior is maintained longer

21
Q

What is the difference between negative and positive punishment

A

Negative- taking away something desirable
Positive- adding something undesirable

22
Q

Who is Ivan Pavlov

A

The first to study classical conditioning, with his experiment involving a dog, a bell, and a bowl of food, in which the dog’s salivation was measured. (His experiments aided our understanding of classical conditioning.)

23
Q

What is the difference between classical and operant conditioning

A

Operant involves voluntary behaviors (active participant),
and classical involves natural, reflexive behaviors (passive participant)

24
Q

What is taste aversion conditioning

A

The phenomenon in which a taste is paired with sickness, causing the individual to reject (and dislike) that taste in the future

25
Q

What is fear conditioning

A

A type of classical conditioning in which the CS is associated with an aversive US, such as a foot shock. As a consequence of learning, the CS comes to evoke fear. This is thought to be associated with the development of anxiety disorders.

26
Q

What is conditioned compensatory response

A

In classical conditioning, a CR that opposes, rather than mimics, the UR. In such cases it functions to reduce the strength of the UR.

27
Q

When do punishments work best

A

In natural settings, like getting burned from reaching into a fire. Less effective when artificially created.

28
Q

What is the quantitative law of effect

A

A mathematical rule stating that the effectiveness of a reinforcer at strengthening an operant response depends on the reinforcement earned for all alternative behaviors. A reinforcer is less effective if there are a lot of reinforcers in the environment for other behaviors.
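The rule is usually written as Herrnstein’s hyperbolic equation; the symbols below are the conventional ones, supplied here as an assumption since the card gives no notation:

\[ B = \frac{kR}{R + R_{e}} \]

where \(B\) is the rate of the operant response, \(R\) is the reinforcement earned by that response, \(R_{e}\) is the reinforcement available from all other behaviors, and \(k\) is the maximum response rate. As \(R_{e}\) grows, \(B\) shrinks, which is why a reinforcer is less effective in an environment rich with alternative reinforcers.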

29
Q

What is the reinforcer devaluation effect

A

The finding that an animal will stop performing an instrumental response that once led to a reinforcer if the reinforcer is separately made aversive or undesirable

30
Q

What is a habit

A

An operant behavior that occurs automatically in the presence of a stimulus and is no longer controlled by the animal’s knowledge of the value of the reinforcer. It is insensitive to the reinforcer devaluation effect.

31
Q

What does the stimulus, response, and outcome chart detail

A

Stimulus ➡️ Outcome = classical conditioning
Stimulus ➡️ Response = habit
Response ➡️ Outcome = operant conditioning
Stimulus ➡️ (Response ➡️ Outcome) = occasion setting

32
Q

What is an interval schedule

A

We schedule reinforcement based on the amount of time that has gone by since the last reinforcement, rather than on the number of behaviors

33
Q

What is a ratio schedule

A

We plan for a certain ratio of rewards per number of instances of the desired behavior (like one reward for every 10 boxes moved; time does not matter)

34
Q

What is the difference between a fixed and a variable schedule

A

Fixed- doesn’t change, predictable
Variable- fluctuates, not predictable

35
Q

What do the following entail: fixed ratio schedule (FR), variable ratio schedule (VR), fixed interval schedule (FI), variable interval schedule (VI)

A

FR- reinforcement after every so many behaviors (ex. getting paid for every ten boxes you move)
VR- reinforcement after an unpredictable number of behaviors (ex. hitting the jackpot sometimes on a slot machine)
FI- reinforcement every so often, on a fixed timetable (ex. getting paid weekly regardless of the amount of work)
VI- reinforcement unpredictably often (ex. checking the cell phone all day and only sometimes getting a message)
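A minimal sketch of how the four schedules decide when to deliver a reinforcer (the function names and parameter values are illustrative assumptions, not taken from the card):

import random

def fixed_ratio(responses_since_reward, n=10):
    # FR-10: reinforce every 10th response (e.g., pay per ten boxes moved)
    return responses_since_reward >= n

def variable_ratio(responses_since_reward, required):
    # VR: 'required' is redrawn unpredictably after each reinforcer,
    # e.g., required = random.randint(1, 19), which averages ~10 responses
    return responses_since_reward >= required

def fixed_interval(seconds_since_reward, interval=7 * 24 * 3600):
    # FI: the first response after a fixed amount of time is reinforced
    # (e.g., a weekly paycheck)
    return seconds_since_reward >= interval

def variable_interval(seconds_since_reward, required_wait):
    # VI: 'required_wait' is redrawn unpredictably after each reinforcer,
    # e.g., required_wait = random.expovariate(1 / 60), averaging 60 seconds
    return seconds_since_reward >= required_wait

# Example: draw a new VR requirement after each reinforcer, then check a response
requirement = random.randint(1, 19)
print(variable_ratio(responses_since_reward=12, required=requirement))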