Exam 4 (Ch 7-9) Flashcards

1
Q

There are three events to consider in an analysis of instrumental learning. What are they?

Skinner described instrumental conditioning in terms of a three-term contingency involving __ - __ - __

A
  1. The stimulus context (S)
  2. The instrumental response (R)
  3. The response outcome (O; AKA reinforcer)

S-R-O

2
Q

What are the different S-R-O associations that exist?

Be able to explain and understand each association

A
  1. Law of effect (Thorndike); S-R association, with no learning about O (characterizes habitual behavior: responding becomes less about the goal, so the consequences become irrelevant; example: drug addiction)
  2. Pavlovian (classical) conditioning; S-O association (positive or negative strengthening of the association; also acts as a reward expectancy or emotional state in instrumental conditioning)
  3. Instrumental conditioning; S-R association (positive or negative strengthening of behavior)
  4. Instrumental conditioning; R-O association (usually studied by devaluing the reinforcer after conditioning, AKA the reinforcer devaluation procedure; also used to understand drug-seeking behavior)
  5. Two-Process Theory; S-R-O associations (a Pavlovian CS can influence or motivate instrumental responding)
3
Q

How can the Two Process Theory be tested?

A

Pavlovian Instrumental Transfer Experiment

4
Q

The following is an example of what type of experiment, and which association does it show?

Phase 1
Instrumental: Lever pressing is reinforced by food
Lever press -> Food

Phase 2
Pavlovian: the response lever is removed from the experimental chamber and a tone is paired with food
Tone -> Food

Transfer Test
Present Pavlovian CS: participants are again permitted to perform the instrumental lever-press response, but now the Pavlovian CS (tone) is presented periodically.

Lever press -> Food
Tone vs. No Tone

Results
1. There tends to be an increase in lever pressing during the tone compared to when there’s no tone
2. Suggests there is a Pavlovian S-O connection influencing the instrumental response

A
  1. Pavlovian Instrumental Transfer Experiment
  2. S-R-O connection
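
A minimal analysis sketch in Python (the counts and the score name are hypothetical, not data from the experiment): lever presses are tallied separately for periods with and without the CS, and elevated responding during the tone is taken as evidence of transfer.

# Hypothetical transfer-test data: lever presses per 30-s bin,
# split by whether the Pavlovian CS (tone) was on or off.
cs_on_presses = [9, 11, 10, 12]   # bins with the tone present
cs_off_presses = [5, 6, 4, 5]     # bins with no tone

def mean(xs):
    return sum(xs) / len(xs)

# A simple elevation score: responding during the CS relative to
# the no-CS baseline.
pit_score = mean(cs_on_presses) - mean(cs_off_presses)

print(f"Mean presses, tone on : {mean(cs_on_presses):.1f}")
print(f"Mean presses, tone off: {mean(cs_off_presses):.1f}")
print(f"Elevation (PIT) score : {pit_score:.1f}")
# A positive score mirrors the result above: the S-O association from
# Phase 2 facilitates the instrumental response.
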
5
Q

The following is an example of what type of experiment, and which association does it show?

Phase 1
Instrumental: Rats are trained to press either response lever, reinforced with artificially sweetened water; eventually the sweetened water is switched to ethanol
Lever press -> Sugar water
Lever press -> Ethanol

Phase 2
Pavlovian:
Paired (test) group: the response levers are removed from the experimental chamber, and the light above each lever is presented for 10 seconds and paired with ethanol
Unpaired (control) group: the light is presented first, and ethanol is presented 10 seconds later
Light -> Ethanol

Transfer Test
Present Pavlovian CS: participants are again permitted to perform the instrumental lever-press response for ethanol reinforcement, but now the Pavlovian CS (light) is presented periodically (on some trials it appeared above the left lever and on some trials above the right lever).
Lever press -> Ethanol
Light on the left or right side

Results:
1. The rats pressed each response lever about twice per minute before the CS was presented.
2. For the unpaired group, lever pressing did not change much when the CS was presented either on the right or the left.
3. The paired group showed a significant increase in lever pressing during the CS period if the CS was presented on the same side as the lever the rat was pressing.
4. These results show that a Pavlovian CS for ethanol will increase instrumental responding reinforced by ethanol.
5. The increased lever pressing during the CS shows that an independently established S–O association can facilitate instrumental responding reinforced by that outcome.

A
  1. Response Interactions in Pavlovian Instrumental Transfer Experiment
  2. S-R-O Connection
6
Q

The following is an example of which procedure, and which association does it show?

Phase 1
Training: A two-choice concurrent schedule of reinforcement was used. The two responses were pressing two different keys on a computer keyboard. The cigarettes and chocolate bars earned were summed across trials, and the corresponding number of each item was placed in a basket on the participant’s desk.
button 1 -> 1/4 picture of a cigarette
button 2 -> 1/4 picture of a chocolate bar

Phase 2
Outcome Devaluation: value of outcome is reduced by satiating the participants with the corresponding reinforcer.
Group 1: The value of smoking is reduced (participants smoked an entire cigarette)
Group 2: The value of eating chocolate is reduced (participants ate up to 8 chocolate bars in 10 minutes)

Test
Participants were again tested on the concurrent schedule, but this time they were told that although they would continue to earn cigarettes and chocolate bars, they would not find out how many of each they obtained until the end of the session. This was intended to maintain responding on the basis of the current status of the memory of each reinforcer.
Button 1?
Button 2?

Results:
1. During training, about 50% of the responses were made on the cigarette key, with the remaining responses performed for chocolate bars.
2. This indicates that the two outcomes were equally preferred before devaluation.
3. When the tobacco outcome was devalued, responding on the cigarette key significantly declined.
4. In contrast, when the chocolate outcome was devalued, responding on the cigarette key increased, indicating a decline in the chocolate response.
5. Thus, devaluation produced a decline in behavior specific to the response whose reinforcer had been devalued.

A
  1. Reinforcer Devaluation Procedure
  2. R-O association
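
A minimal sketch in Python (hypothetical response counts, not the study's data) of how the devaluation test can be summarized: the proportion of responses on the cigarette key is compared across training and the two devaluation conditions.

# Hypothetical response counts on the two keys, before and after devaluation.
training_counts = {"cigarette_key": 100, "chocolate_key": 100}
after_cig_devalued = {"cigarette_key": 40, "chocolate_key": 110}
after_choc_devalued = {"cigarette_key": 120, "chocolate_key": 35}

def cigarette_share(counts):
    """Proportion of all responses made on the cigarette key."""
    total = counts["cigarette_key"] + counts["chocolate_key"]
    return counts["cigarette_key"] / total

print(f"Training:            {cigarette_share(training_counts):.0%} on cigarette key")
print(f"Cigarettes devalued: {cigarette_share(after_cig_devalued):.0%} on cigarette key")
print(f"Chocolate devalued:  {cigarette_share(after_choc_devalued):.0%} on cigarette key")
# The drop after cigarette devaluation (and the rise after chocolate
# devaluation) is the response-specific change that implicates an R-O association.
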
7
Q

A principle that assumes that reinforcement depends on how much more likely the organism is to perform the reinforcer response than the instrumental response before an instrumental conditioning procedure is introduced. The greater the differential probability of the reinforcer and instrumental responses during baseline conditions, the greater is the reinforcement effect of providing opportunity to engage in the reinforcer response after performance of the instrumental response.

AKA

A
  1. Premack Principle
  2. Differential Probability Principle
8
Q

A theory that assumes that species-typical consummatory responses (eating, drinking, and the like) are the critical features of reinforcers.

A

Consummatory-response theory

9
Q

An explanation of reinforcement according to which restricting access to a response below its baseline rate of occurrence ________ ________ is sufficient to make the opportunity to perform that response an effective positive reinforcer.

A

Response-deprivation hypothesis

10
Q

The preferred distribution of an organism’s activities before an instrumental conditioning procedure is introduced that sets constraints and limitations on response allocation.

A

Behavioral bliss point

11
Q

A model of instrumental behavior, according to which participants respond to a response–reinforcer contingency in a manner that gets them as close as possible to their behavioral bliss point.

A

Minimum-deviation model
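
A minimal numerical sketch in Python of the minimum-deviation idea (the bliss point, the one-for-one schedule constraint, and the squared-deviation measure are all assumed for illustration): the model picks the allocation on the schedule line that lies closest to the bliss point.

# Free-baseline ("bliss point") allocation, in minutes per hour (hypothetical).
bliss_instrumental = 10   # e.g., time spent studying
bliss_reinforcer = 50     # e.g., time spent watching TV

# Hypothetical schedule constraint: each minute of the reinforcer activity
# must be "paid for" with one minute of the instrumental activity,
# so along the schedule line reinforcer == instrumental.
candidates = [(t, t) for t in range(0, 61)]

def squared_deviation(point):
    instrumental, reinforcer = point
    return ((instrumental - bliss_instrumental) ** 2 +
            (reinforcer - bliss_reinforcer) ** 2)

best = min(candidates, key=squared_deviation)
print("Allocation closest to the bliss point:", best)
# With these numbers the model settles at (30, 30): more instrumental
# responding and less of the reinforcer activity than at the bliss point,
# i.e., the compromise the minimum-deviation model describes.
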

12
Q

The relation between how much of a commodity is purchased and the price of the commodity.

A

Demand Curve

13
Q

The degree to which price influences the consumption or purchase of a commodity. If price has a large effect on consumption, elasticity of demand is high. If price has a small effect on consumption, elasticity of demand is low.

A

Elasticity of demand
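
A small worked example in Python (hypothetical price and consumption values) of how elasticity of demand can be computed as the percentage change in consumption divided by the percentage change in price.

# Hypothetical demand data: "price" is the responses required per reinforcer
# (e.g., an FR requirement), "consumed" is the number of reinforcers earned.
price_before, consumed_before = 5, 100
price_after, consumed_after = 10, 60

# Elasticity of demand: percentage change in consumption divided by
# percentage change in price (magnitude > 1 = elastic, < 1 = inelastic).
pct_change_consumption = (consumed_after - consumed_before) / consumed_before
pct_change_price = (price_after - price_before) / price_before
elasticity = pct_change_consumption / pct_change_price

print(f"Elasticity of demand: {elasticity:.2f}")
# Here consumption falls 40% when price doubles (+100%), so elasticity is
# -0.40: demand is relatively inelastic for this hypothetical commodity.
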

14
Q

Given two responses of different likelihood, H and L, the opportunity to perform the higher probability response (H) after the lower probability response (L) will result in reinforcement of response L. (L → H reinforces L.) The opportunity to perform the lower probability response (L) after the higher probability response (H) will not result in reinforcement of response H. (H → L does not reinforce H.)

This describes what?

A

The Premack Principle
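
A small illustrative sketch in Python (the baseline probabilities and the helper function are hypothetical) of the differential-probability rule stated above: the higher-probability response reinforces the lower-probability one, but not the reverse.

# Hypothetical baseline probabilities: fraction of free-baseline time
# spent on each activity.
baseline_prob = {
    "eating": 0.30,       # higher-probability (H) response
    "lever_press": 0.05,  # lower-probability (L) response
}

def premack_predicts_reinforcement(instrumental, contingent, probs):
    """The contingent response reinforces the instrumental response
    only if the contingent response is the more probable one at baseline."""
    return probs[contingent] > probs[instrumental]

# L -> H: access to eating contingent on lever pressing reinforces pressing.
print(premack_predicts_reinforcement("lever_press", "eating", baseline_prob))  # True
# H -> L: access to lever pressing contingent on eating does not reinforce eating.
print(premack_predicts_reinforcement("eating", "lever_press", baseline_prob))  # False
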

15
Q
  1. Drinking a drop of sucrose is a high-probability response, and as one might predict, sucrose is effective in reinforcing lever pressing.
  2. Running in a running wheel is also a high-probability response in rats. Thus, one might predict that running would also effectively reinforce lever pressing.

Are examples of what?

A

The Premack Principle

16
Q

How can the Premack principle be used in a clinical setting, and what did it encourage?

A
  1. Patients with schizophrenia were offered the opportunity to sit down only if they worked on a project
  2. Similar applications with children with autism
  3. It encouraged thinking about reinforcers as responses rather than as stimuli.
17
Q

How we choose to spend our time is based on trade-offs and can be explained by what perspective?

A

The Response Allocation Approach

18
Q

What are determinants of the elasticity of demand?

A
  1. Available Alternatives
  2. Relative Price
  3. Income
  4. Complementary commodities
19
Q

A stimulus that controls the performance of instrumental behavior because it signals the availability (or nonavailability) of reinforcement.

A

Discriminative stimulus

20
Q

Responding to test stimuli that are different from the cues that were present during training.

A

Stimulus generalization

21
Q

A gradient of responding that is observed if participants are tested with stimuli that increasingly differ from the stimulus that was present during training.

A

Stimulus generalization gradient

22
Q

Training with a stimulus discrimination procedure that results in stimulus discrimination.

A

stimulus discrimination training

23
Q

Interference with the conditioning of a stimulus because of the simultaneous presence of another stimulus that is easier to condition.

A

Overshadowing

24
Q

(in classical conditioning) A classical conditioning procedure in which one stimulus (the CS+) is paired with the US on some trials and another stimulus (the CS–) is presented without the US on other trials. As a result of this procedure, the CS+ comes to elicit a conditioned response and the CS– comes to inhibit this response.

(in instrumental conditioning) A procedure in which reinforcement for responding is available whenever one stimulus (the S+, or S^D) is present and not available whenever another stimulus (the S–, or S^Δ) is present.

A

Stimulus discrimination procedure

25
Q

Differential responding in the presence of two or more stimuli.

A

Stimulus discrimination

26
Q

A discrimination procedure in which reinforcement is provided when each of two stimuli appears by itself (A+ and B+) but not when the two stimuli appear simultaneously (AB–).

A

negative patterning

27
Q

A discrimination procedure in which reinforcement is provided when two stimuli (A and B) are presented simultaneously (AB+) but not when those stimuli appear by themselves (A– and B–).

A

positive patterning

28
Q

A displacement of the highest rate of responding in a stimulus generalization gradient away from the S+ in a direction opposite the S–.

A

Peak-shift effect

29
Q

This experiment illustrates several important ideas, including:

  1. Shows how to experimentally determine whether instrumental behavior has come under the control of a particular stimulus. Demonstrated by variations in responding related to variations in stimuli. AKA stimulus control. If an organism responds one way in the presence of one stimulus and in a different way in the presence of another stimulus, its behavior has come under the control of those stimuli.
  2. Differential responding to two stimuli also indicates that the pigeons were treating each stimulus as different from the other. AKA stimulus discrimination. If an organism does not discriminate between two stimuli, its behavior is not under the control of those cues.
  3. In the absence of special procedures, one cannot always predict which of the various stimuli an organism experiences will gain control over its instrumental behavior.

Pigeons were trained to respond to a compound stimulus consisting of a white triangle on a red background

A

Experiment conducted by Reynolds (1961)

30
Q
  1. Reinforced pigeons on a variable-interval schedule for pecking a response key illuminated by a yellow light with a wavelength of 580 nanometers (nm).
  2. After training, the birds were tested with a variety of other colors presented in a random order without reinforcement.
  3. The rate of responding in the presence of each color was recorded.

Results:
1. The highest rate of pecking occurred in response to the original 580-nm color. But the birds also made substantial numbers of pecks when lights of 570-nm and 590-nm wavelength were tested. This indicates that responding generalized to the 570-nm and 590-nm stimuli.
2. However, as the color of the test stimuli became increasingly different from the color of the original training stimulus, progressively fewer responses occurred.

This experiment is an example of what kind of stimulus reaction?

What does this look like on a graph and what is it called?

A
  1. stimulus generalization
  2. upside down “U”; stimulus generalization gradient
31
Q
  1. An excellent way to measure stimulus control because it provides precise information about how sensitive the organism’s behavior is to systematic variations in a stimulus
  2. Can provide precise information about how much of a change in a stimulus is required for the organism to respond differently.
  3. A steep gradient indicates strong control of behavior by the stimulus dimension that is tested.
  4. A flat gradient indicates weak or nonexistent stimulus control.
A

stimulus generalization gradient
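
A minimal sketch in Python (hypothetical response rates) contrasting a steep gradient, like the inverted-U around 580 nm described above, with a flat gradient, which is what weak or absent stimulus control would look like.

import math

# Hypothetical response rates across test wavelengths, to illustrate
# steep vs. flat generalization gradients (values are made up).
wavelengths = [540, 550, 560, 570, 580, 590, 600, 610, 620]  # nm
trained_wavelength = 580
peak_rate = 100  # responses per session at the training stimulus

def steep_gradient(nm, width=12.0):
    """Strong stimulus control: responding falls off sharply away from S+."""
    return peak_rate * math.exp(-((nm - trained_wavelength) ** 2) / (2 * width ** 2))

def flat_gradient(nm):
    """Weak or no stimulus control (e.g., a color-blind subject): all colors alike."""
    return peak_rate * 0.5

for nm in wavelengths:
    print(f"{nm} nm  steep: {steep_gradient(nm):6.1f}   flat: {flat_gradient(nm):5.1f}")
# The steep column traces the inverted-U gradient described above; the flat
# column is what the gradient would look like if wavelength exerted no
# control over pecking.
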

32
Q
  1. Reinforced pigeons on a variable-interval schedule for pecking a response key illuminated by a yellow light with a wavelength of 580 nanometers (nm).
  2. After training, the birds were tested with a variety of other colors presented in a random order without reinforcement.
  3. The rate of responding in the presence of each color was recorded.

Results:
1. The highest rate of pecking occurred in response to the original 580-nm color. But the birds also made substantial numbers of pecks when lights of 570-nm and 590-nm wavelength were tested. This indicates that responding generalized to the 570-nm and 590-nm stimuli.
2. However, as the color of the test stimuli became increasingly different from the color of the original training stimulus, progressively fewer responses occurred.

What would happen to the stimulus generalization gradient graph if the pigeons were colorblind?

A

It would flatten out

33
Q

The following are examples of which stimulus response?

  1. A war veteran may feel heightened trauma when exposed to fireworks because they sound and look like explosions experienced in war.
  2. A parent who teaches their child to say ‘thank you’ at home will transfer that skill to other situations such as when their teacher gives them something in the classroom.
  3. Brands expand their product lines carefully, ensuring the new products look and feel like the original products to ensure users retain a sense of familiarity with the products.

more examples at:
https://helpfulprofessor.com/stimulus-generalization-examples/#:~:text=They%20have%20’generalized’%20their%20response,stimuli%20that%20also%20signify%20food.

A

Stimulus Generalization

34
Q

The following are examples of which stimulus response?

  1. Tuning a guitar or piano takes time. Repetition and practice will help your ear to learn to differentiate between small sound differences.
  2. Tonal languages like Mandarin have words that mean different things depending on your tone of voice. Learning what tone to use to convey the correct meaning requires repetition and practice.
  3. When we train a dog to understand the difference between “sit” and “roll”, we have successfully taught the dog to differentiate between two stimuli.
A

Stimulus discrimination

35
Q
  1. Present two stimuli
  2. Reinforce responses to one stimulus but not the other

Result:
1. The organism usually learns to respond to the reinforced stimulus

A

Discrimination Training

36
Q

What determines which of the numerous features of a stimulus situation gains control over the instrumental behavior?

A
  1. Sensory Capacity and Orientation
  2. Relative Ease of Conditioning Various Stimuli
  3. Type of Reinforcement
37
Q

The phenomenon that illustrates competition among stimuli for access to the processes of learning.

Example:
1. Trying to teach a child to read by having him or her follow along as you read a children’s book that has a big picture and a short sentence on each page.
2. Learning about pictures is easier than learning words.
3. Therefore, the pictures may well ________ the words.
4. The child will quickly memorize the story based on the pictures rather than the words and will not learn much about the words.

A

overshadowing; overshadow

38
Q

Study design showing that a more salient stimulus has a bigger impact on learning (lowercase a = weaker stimulus; uppercase B = more salient stimulus)

Training stimuli:
Overshadowing group - aB
Control group - a

Test stimulus:
Overshadowing group - a
Control group - a

Generalization from training to test:
Overshadowing group - Decrement
Control group - No decrement

A

Overshadowing

39
Q

Group 1: Tone + Light -> Shock Avoidance
Group 2: Tone + Light -> Food

Test:
Tone?
Light?

Results:
1. With the food reward, the light gained more control over responding
2. With shock avoidance, the tone gained more control over responding

Certain types of stimuli are more likely to gain control over the instrumental behavior in appetitive than in aversive situations.

This implies what about the stimulus?

A

Stimulus control of instrumental behavior is determined in part by the type of reinforcement that is used.

40
Q

How does the generalization gradient change with discrimination training?

A

It becomes narrower, symmetrically (on both sides)

41
Q

The orchestral sound originates from the sounds of the individual instruments. However, the sound of the entire orchestra is very different from the sound of any of the individual instruments, some of which are difficult to identify when the entire orchestra is playing. We primarily hear the configuration of the sounds created by all the instruments that are playing.

This is an example of what type of compound stimuli approach?

A

configural-cue

42
Q

What are the two different compound stimuli approaches?

A
  1. Stimulus-element approach
  2. Configural-cue approach
43
Q

An approach to the analysis of stimulus control which assumes that organisms respond to a compound stimulus as an integral whole rather than a collection of separate and independent stimulus elements.

A

configural-cue approach

44
Q

An approach to the analysis of control by compound stimuli which assumes that participants respond to a compound stimulus in terms of the stimulus elements that make up the compound.

A

Stimulus-element approach

45
Q
  1. Assumed that organisms treat the various components of a complex stimulus as distinct and separate elements.
  2. Presentation of a light and tone as consisting of separate visual and auditory cues.
  3. Has been dominant in learning theory going back about 80 years.
A

stimulus-element approach
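
A minimal sketch in Python (hypothetical associative strengths) of why the contrast matters for negative patterning (A+, B+, AB–): a purely elemental sum of strengths cannot make the compound weaker than its elements, whereas adding a configural cue for the compound can.

# Illustrative sketch (made-up associative strengths) of the elemental vs.
# configural treatment of a compound stimulus in negative patterning.

# Purely elemental: response to a compound is the sum of its elements.
V_elemental = {"A": 1.0, "B": 1.0}
response_AB_elemental = V_elemental["A"] + V_elemental["B"]
print("Elemental prediction for AB:", response_AB_elemental)  # 2.0 -- cannot be
# lower than the response to A or B alone, contrary to negative patterning.

# Configural-cue addition: the compound also activates its own cue,
# which can acquire inhibitory (negative) strength.
V_configural = {"A": 1.0, "B": 1.0, "AB_config": -2.0}
response_AB_configural = (V_configural["A"] + V_configural["B"]
                          + V_configural["AB_config"])
print("With a configural cue, AB:", response_AB_configural)   # 0.0 -- responding
# to the compound is suppressed while A and B alone still evoke responding.
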

46
Q

In stimulus discrimination procedures, the stimuli can be presented in different ways; they are:

A
  1. Separately
  2. Simultaneously
  3. Multiple Schedule of reinforcement
47
Q

These are all examples of which type of stimulus presentation in a stimulus discrimination procedure?

  1. Playing a game yields reinforcement only in the presence of enjoyable or challenging partners.
  2. Driving rapidly is reinforced when you are on a freeway but not when you are on a crowded city street.
  3. Loud and boisterous discussion with your friends is reinforced at a party. The same type of behavior is frowned upon during a church service.
A

Multiple Schedules of Reinforcement

48
Q

Internal sensations produced by a psychoactive drug (or other physiological manipulation such as food deprivation)

A

Interoceptive Cues

49
Q

Discrimination Training can be used to manipulate and control

A
  1. Stimulus Control (S+,S-)
  2. Interoceptive Cues
  3. Compound and Configural Cues
50
Q

Experiment:
1. Four different CSs were used in the experiment (noise, tone, flashing light, and steady light).
2. Each CS presentation lasted 30 seconds and reinforced trials ended with the delivery of food into a cup.
3. Conditioned responding was nosing the food cup during the CS.
4. The assignment of the auditory and visual cues was arranged so that each compound stimulus (AB and CD) was made up of one auditory and one visual cue.
5. Training sessions consisted of six types of trials (A–, B–, AB+, C+, D+, and CD–) intermixed.

Results:
1. They learned to respond whenever A and B were presented simultaneously but not when each CS appeared alone.

This explains which discrimination procedure?

A

Positive patterning procedure

51
Q

Experiment:
1. Four different CSs were used in the experiment (noise, tone, flashing light, and steady light).
2. Each CS presentation lasted 30 seconds and reinforced trials ended with the delivery of food into a cup.
3. Conditioned responding was nosing the food cup during the CS.
4. The assignment of the auditory and visual cues was arranged so that each compound stimulus (AB and CD) was made up of one auditory and one visual cue.
5. Training sessions consisted of six types of trials (A–, B–, AB+, C+, D+, and CD–) intermixed.

Results:
1. They learned to withhold responding when C and D were presented simultaneously but responded to each of these cues when they were presented alone.

This explains which discrimination training procedure?

A

Negative patterning procedure

52
Q

Relating to the spatial configuration of something, such as a human face to be recognised, rather than its separate visual features


A

Configural

53
Q

The theory assumes that reinforcement of a response in the presence of the S+ conditions excitatory response tendencies to S+. By contrast, nonreinforcement of responding during S– conditions inhibitory properties to S– that serve to suppress the instrumental behavior. Differential responding to S+ and S– reflects both conditioned excitation to S+ and conditioned inhibition to S–.

This explains which theory?

A

Spence’s Theory of Discrimination Learning
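
A minimal sketch in Python (hypothetical gradient heights and widths) of Spence's account: summing an excitatory gradient centered on S+ with an inhibitory gradient centered on a nearby S– yields a net gradient whose peak is displaced away from S–, i.e., the peak-shift effect.

import math

# Hypothetical illustration of Spence's account of peak shift
# (all numbers are made up).
S_PLUS, S_MINUS = 580, 590   # e.g., wavelengths in nm

def gaussian(x, center, height, width):
    return height * math.exp(-((x - center) ** 2) / (2 * width ** 2))

def net_response(x):
    excitation = gaussian(x, S_PLUS, height=100, width=15)   # conditioned to S+
    inhibition = gaussian(x, S_MINUS, height=60, width=15)   # conditioned to S-
    return excitation - inhibition

wavelengths = range(540, 621, 10)
best = max(wavelengths, key=net_response)
for nm in wavelengths:
    print(f"{nm} nm: net strength {net_response(nm):6.1f}")
print("Peak of the net gradient:", best, "nm (shifted away from S-, past S+)")
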

54
Q

This effect is remarkable because it shows that the S+, or reinforced stimulus, is not necessarily the one that produces the highest response rate.

A

Spence’s explanation of the Peak-shift effect

55
Q

Language, for example, requires calling something an “apple” in response to a photograph of an apple, the presence of an apple on the kitchen counter, or the taste of an apple that you have just bitten into. These highly dissimilar physical stimuli all call for the same word.

is an example of?

A

stimulus equivalence

56
Q

In classical conditioning, _______ involves repeated presentations of the CS without the US. In instrumental conditioning, _______ involves no longer presenting the reinforcer when the response occurs.

A

Extinction

57
Q

Does the reduction in behavior produced by extinction mean that the behavior is being forgotten, or that new learning is occurring?

A

New learning

58
Q

How do we know that extinction involves new learning rather than forgetting of the original behavior?

A

4 forms of recovery from extinction

  1. Spontaneous Recovery
  2. Renewal of Conditioned Responding
  3. Reinstatement of Conditioned Responding
  4. Resurgence of Conditioned Behavior
59
Q

Reappearance of an extinguished response caused by the passage of time.

A

Spontaneous Recovery

60
Q

Reappearance of an extinguished response produced by a shift away from the contextual cues that were present during extinction. In ABA renewal, the shift is back to the context of acquisition. In ABC renewal, the shift is to a familiar context unrelated to either acquisition or extinction.

A

Renewal Effect

61
Q

Reappearance of an extinguished response produced by exposure to the US or reinforcer.

A

Reinstatement

62
Q

Reappearance of an extinguished response caused by the extinction of another behavior.

A

Resurgence

63
Q

Why is extinction paradoxical?

A

CS -> US
1. The stronger the association, the harder it should be to “undo,” so extinction should take longer
2. In many cases, it seems the opposite is true.

64
Q

What are the different paradoxical extinction effects?

A
  1. Partial-Reinforcement extinction effect (PREE)
  2. Partial Punishment extinction effect (PPEE)
  3. Magnitude of Reinforcement extinction effect (MREE)
  4. Overtraining extinction effect
65
Q

Less persistence of instrumental behavior in extinction following extensive training with reinforcement (overtraining) than following only moderate levels of training. This effect is most prominent with continuous reinforcement.

A

Overtraining extinction effect

66
Q

Less persistence of instrumental behavior in extinction following training with a large reinforcer than following training with a small or moderate reinforcer. This effect is most prominent with continuous reinforcement.

A

Magnitude of reinforcement extinction effect (MREE)

67
Q

The term used to describe greater persistence in instrumental responding in extinction after partial (or intermittent) reinforcement training than after continuous reinforcement training.

A

partial reinforcement extinction effect

68
Q

The loss of a learned response that occurs because information about training is irrevocably lost due to the passage of time. Forgetting is contrasted with extinction, which is produced by a specific procedure rather than the passage of time.

A

Forgetting

69
Q

A theory of the partial-reinforcement extinction effect, according to which extinction is slower after partial reinforcement because the instrumental response becomes conditioned to the anticipation of frustrative nonreward.

A

Frustration theory

70
Q

A theory of the partial-reinforcement extinction effect according to which extinction is retarded after partial reinforcement because the instrumental response becomes conditioned to the memory of nonreward.

A

Sequential theory

71
Q
  1. Number and Spacing of extinction trials
  2. Priming Extinction to Update Memory for Reconsolidation
  3. Conducting Extinction in Multiple Contexts

are all ways to

A

enhance extinction

72
Q

What is the best way to extinguish a bad behavior?

A
  1. Spaced out trials
  2. Many sessions
  3. Extinguish in many different contexts
73
Q

A behavioral response bias arising from discrimination learning in which animals display a directional, but limited, preference for or avoidance of unusual stimuli

A

“Peak Shift” effect