Chapter 7 (final) Flashcards

1
Q

what does Learning involve

A

involves a relatively permanent change in one’s mental processes or behavior that is a function of interactions with the environment

2
Q

what did Ivan Pavlov do

A

Pavlovian conditioning with dogs and salivation

3
Q

what did John B. Watson do

A

Pavlovian conditioning with Little Albert and fear

4
Q

what did E. L. Thorndike do

A

operant conditioning with cats and puzzle boxes

5
Q

what did B. F. Skinner do

A

operant conditioning with pigeons and key pecking

6
Q

what did Edward C. Tolman do

A

latent learning with rats in mazes

7
Q

what did Albert Bandura do

A

social learning with children and Bobo dolls

8
Q

what are Pavlovian, respondent, and classical conditioning considered to be

A

terms used interchangeably; they refer to situations in which a signal tells us when another event will occur, as when a bell signals that food will be presented (we will use only Pavlovian when discussing this type of learning)

9
Q

what are Instrumental and operant conditioning

A

terms used interchangeably that refer to situations in which the consequences of our behavior control future responding. If a pigeon pecks a key, then it will get food; this makes it peck the key more frequently to get food again

10
Q

Bandura developed a theory of what

A

social learning

11
Q

Pavlov used the concepts of what

A

a stimulus and response, where a stimulus can be anything in the environment that is measurable and can evoke a response

12
Q

Pavlov also made a distinction between what

A

unconditional responses, which do not require learning, and conditional responses, which are a result of learning

13
Q

an unconditional stimulus (e.g., food) causes what

A

an unconditional response (e.g., salivation)

14
Q

Pavlovian conditioning results when a ______ is repeatedly paired with an unconditional stimulus so that the neutral stimulus becomes a ________

A

neutral stimulus; conditional stimulus

15
Q

what is extinction

A

After conditioning, if the conditional stimulus is presented without the unconditional stimulus, the strength of the conditional response decreases over time in a process called extinction

16
Q

Pavlovian extinction occurs when what

A

a conditional stimulus is presented without the unconditional stimulus. Because the conditional stimulus acquired its ability to elicit a response by being paired with the unconditional stimulus, its capacity to elicit an involuntary response (the conditional response) decreases each time it is presented alone. After repeated presentation by itself, the conditional stimulus no longer causes the conditional response; the association between the conditional stimulus and the unconditional stimulus has been extinguished, and it is again a neutral stimulus

17
Q

The most effective pairing methods occur when what

A

the conditional stimulus precedes the unconditional stimulus

18
Q

what is an appetitive stimulus

A

appetitive stimuli are usually considered to be pleasant (and are not always food related)

19
Q

what is an aversive stimulus

A

aversive stimuli or events often involve some degree of physical discomfort and can include a loud noise, electrical shock, or a burn to the skin

20
Q
What involuntary response was caused or elicited in the example?
A

The answer is both the unconditional response and conditional response

21
Q

What originally caused the involuntary response in the example?

A

The answer is the unconditional stimulus.

22
Q

What stimulus/event was paired with the unconditional stimulus?

A

The answer is the neutral stimulus before it is paired with the unconditional stimulus, and it is the conditional stimulus after it has been paired with the unconditional stimulus

23
Q

what is Stimulus Generalization

A

involves reacting in a similar manner to stimuli that share features associated with the original conditional stimulus. In other words, a different environmental event that has not been paired with the unconditional stimulus also elicits or causes the conditional response

24
Q

what is Stimulus Discrimination

A

is a process that is the opposite of stimulus generalization. Unlike stimulus generalization, where conditional responses occur from exposure to stimuli that are physically similar to the original conditional stimulus, with stimulus discrimination, conditional responses only occur when the original conditional stimulus is introduced. Similar stimuli do not elicit a response.

25
Q

what is higher order conditioning

A

Pavlovian conditioning can also occur when a neutral stimulus is systematically and repeatedly paired with an existing conditional stimulus

26
Q

what is Biological preparedness

A

organisms are predisposed for some neutral stimuli to become conditional stimuli more readily than other neutral stimuli

27
Q

who did the Little Albert experiment

A

Watson

28
Q

what was the Little Albert experiment

A

conditioned a young child (Albert) to fear white rats

29
Q

what is Operant conditioning

A

describes situations in which we can choose among different options based on our previous experiences

30
Q

what is instrumental learning

A

Thorndike's term for this type of learning; because the cats learned how to manipulate an instrument (i.e., the lever), he called the learning instrumental

31
Q

what is the law of effect

A

For Thorndike, the “effect” in law of effect referred to the consequences of behavior. The law of effect was twofold: (a) behaviors that yield satisfying consequences are more likely to recur, and (b) behaviors that result in discomfort are less likely to be repeated. Simply put, when we do things that lead to satisfaction, such as downloading our favorite artist’s music, those responses are more likely to happen in the future; when we do behaviors that lead to unpleasant outcomes, such as clicking on an email link that then crashes our computer, we are less likely to do them again

32
Q

In the 1930s, B. F. Skinner renamed instrumental learning as what

A

operant conditioning because the processes and principles associated with it are more complex than simply operating instruments such as levers in Thorndike’s puzzle boxes

33
Q

what are antecedent stimuli

A

the environmental events that precede behavior; Skinner, like Thorndike, recognized their importance

34
Q

what are contingencies

A

“if-then” relationships between responses and their consequences; describing these relationships allows us to predict future behavior

35
Q

what are the contingencies that skinner identified

A

Skinner explained that with reinforcement (which can be positive or negative), the consequences of a response increase the probability of the behavior, whereas punishment (which also can be positive or negative) decreases the likelihood that the behavior will recur

36
Q

what is positive reinforcement

A

If some behavior produces a stimulus which leads to more of that same kind of behavior in the future, then it is positive reinforcement (positive because of the consequence and reinforcement because of the effect on behavior)

37
Q

what is negative reinforcement

A

If some behavior removes a stimulus which leads to more of that kind of behavior in the future, then it is negative reinforcement (negative because of the consequence removed and reinforcement because of the effect on behavior)

38
Q

what is positive punishment

A

If some behavior produces a stimulus which leads to less of that kind of behavior in the future, then it is positive punishment (positive because of the addition of the consequence and punishment because of the effect on behavior)

39
Q

what is negative punishment

A

If some behavior removes a stimulus which leads to less of that kind of behavior in the future, then the procedure/process is negative punishment (which also is referred to as response cost)

40
Q

are all operant conditioning responses equal

A

nope

41
Q

It should be noted that negative reinforcement is not a form of (positive or negative) punishment. Why not?

A

Like punishment, negative reinforcement involves an aversive stimulus (something you will work to avoid), but when the response removes the aversive stimulus, the probability of that behavior increases, not decreases (as it would in punishment). Thus, it is a reinforcement contingency

42
Q

what is Extinction

A

not a contingency in and of itself; it is actually the absence of a contingency (a previously reinforced response no longer produces its consequence)

43
Q

Negative reinforcement occurs in two forms; what are they

A

escape and avoidance.

44
Q

what is Escape

A

is a situation in which the aversive stimulus is already present and a response removes or stops the otherwise ongoing aversive stimulus

45
Q

what is Avoidance

A

is a situation in which the aversive stimulus is not currently present but will occur unless you emit a response to cancel the scheduled aversive event

for example, keeping regular checkups is maintained by avoidance conditioning (i.e., the checkups allow for any problems to be treated before they become unpleasant)

46
Q

There are three behavioral effects of extinction; what are they

A

(a) a temporary increase in responding (an extinction burst), (b) emotional and aggressive responding, and (c) an eventual cessation of responding

47
Q

what is the partial reinforcement extinction effect

A

behavior that was reinforced on a continuous schedule will extinguish faster than behavior that was reinforced on an intermittent (partial) schedule

48
Q

what are Reinforcers

A

events or stimuli that follow behavior and increase the future likelihood of that kind of response

49
Q

what is the difference between reinforcers and reinforcement

A

(Some people use “reinforcers” and “reinforcement(s)” interchangeably; however, they are not the same: Reinforcers are stimuli and reinforcement is a process [i.e., how behavior changes] or a procedure [i.e., if this response, then this consequence].)

50
Q

Reinforcers are called positive if they ______; they are negative if they ________

A

strengthen responses they follow; strengthen responses that lead to their removal

51
Q

Both positive and negative reinforcers are subdivided into whether they are

A

primary or secondary

52
Q

what are primary reinforcers

A

Primary (or unconditioned) reinforcers are not learned; they naturally affect responses they follow. Primary positive reinforcers generally are stimuli/events needed to maintain life: food, water, air, and sleep.

53
Q

what are secondary reinforcers

A

Secondary (or conditioned) reinforcers, both positive and negative, acquire their capacity to affect responses they follow because they have been associated or paired with primary or already-conditioned secondary reinforcers. As a result, an event that functions as a secondary positive reinforcer for one person may not be one for another person (“different strokes for different folks”), resulting from their different learning histories.

54
Q

There are five categories of secondary reinforcers; what are they

A

consumables, exchangeables, tangibles, social reinforcers, and activities

55
Q

what are Consumables

A

are things we eat but do not need in order to live (chips, candy, gum, soda, etc.)

56
Q

what are Exchangeables

A

such as money and tokens are used in bartering and have value because they can be used to purchase other reinforcers

57
Q

what are Tangibles

A

are physical objects that we can touch, such as clothes, cell phones, etc

58
Q

what are Social reinforcers

A

involve attention from others and include smiles, eye contact, verbal praise, proximity, etc.

59
Q

what are activities

A

these include going to a movie, listening to music, playing a game, etc.

60
Q

what is the Premack principle

A

based on how often two behaviors occur. If behavior A occurs more frequently than behavior B, access to behavior A can be made contingent on first doing behavior B. While this may be the first time you have heard of the Premack principle, it is quite likely you have experienced it. For example, when you were a kid and wanted to go out and play, your mom or dad may have invoked the Premack principle by telling you, “First do your chores, then you can go play.” In this instance, the high-probability response is playing, and the low-probability response is doing one’s chores

61
Q

what are schedules of reinforcement

A

Collectively the schedules of reinforcement specify when a particular response triggers a consequence

62
Q

Schedules of reinforcement fall into two categories; what are they

A

ratio and interval

63
Q

and each schedule of reinforcement category has two subdivisions; what are they

A

fixed and variable

64
Q

interval schedules require what

A

that a specific amount of time elapse before an occurrence of the targeted response will trigger the delivery of a positive reinforcer

65
Q

what is social learning

A

This form of learning does not negate the behavioral approaches to learning. Instead, it expands human (and animal) learning into the cognitive domain

reinforcement is not needed for learning to occur

just by watching others, children do as they do (if the adult models are aggressive, the child will act aggressively toward the doll, and vice versa)

66
Q

who came up with social learning

A

Bandura

67
Q

Bandura’s theory specifies that observational learning entails four phases/processes/stages; what are they

A

attention, retention, production, and motivation

68
Q

what is the attentional phase

A

the learner must notice the behaviors being modeled. Additionally, not all models are equal: observational learning is enhanced when the observer likes and respects the person doing the modeling

69
Q

what happens in the retention phase

A

the learner covertly practices and encodes the performance being observed.

70
Q

what happens in the production phase

A

the learner imitates the behavior observed

71
Q

what happens in the motivational phase

A

the imitated performance is likely to trigger similar types of consequences (positive or negative) that may or may not have been observed during the attentional phase.

72
Q

what is learned helplessness

A

When the organism’s escape responses are ineffective at decreasing the painful events being experienced, the escape and avoidance responses eventually stop; the person has acquired learned helplessness.

For example, a student who is failing a math class may try several different strategies (e.g., longer and more frequent study sessions; using different study techniques, etc.) to escape/avoid failing the math class. If these attempts do not work, the student may simply stop trying to succeed in math (learned helplessness to a specific situation), while continuing to study for other classes