Learning Flashcards
Optical Illusions
Perception of features or objects that aren’t really there
Top-down processes impose a nonexistent object
Conditioning: Generalization
Once a response has been conditioned, similar stimuli can elicit the same response
- (e.g. different bell tones still produce salivation)
Conditioning: Discrimination
Ability to distinguish between similar but distinct stimuli.
- (e.g. if a dog shows less salivation to a different bell tone)
Behaviorism
All behaviors can be explained by conditioning
What does US stand for and what does it mean?
- Unconditioned Stimulus
- This is a stimulus that naturally and automatically triggers a response without any prior learning. For example, food is an unconditioned stimulus when it elicits salivation in dogs without any training.
What does CR stand for and what does it mean?
- Conditioned Response.
- This is the learned response to the previously neutral stimulus that has become conditioned. For instance, if a dog has been conditioned to associate the sound of a bell (previously a neutral stimulus) with being fed (the unconditioned stimulus), the dog’s salivation in response to the bell alone is the conditioned response.
Biological Preparedness
An organism’s evolutionary history can make it easier to learn particular associations
What does NS stand for and what does it mean?
- Neutral Stimulus.
- This is a stimulus that initially does not elicit any particular response prior to conditioning.
Operant Conditioning
- Reward/punishment occurs ‘after’ behavior
- Type of learning in which an individual’s behavior is modified by its consequences
- The organism learns to "operate" on its environment
- Example: dog learning a trick (operant) vs. dog spontaneously salivating in response to an unconditioned stimulus (classical conditioning)
Operant vs. Classical Conditioning
- Operant conditioning deals with the modification of “voluntary behavior” or operant behavior
- Classical conditioning deals with the conditioning of reflexive (reflex) behaviors
Thorndike’s Law of Effect
- Behaviors followed by favorable consequences become more likely
- Behaviors followed by unfavorable consequences become less likely
Positive Reinforcement
- Adding a favorable consequence to increase behavior
- Example: Getting a cookie for eating all of your veggies
Negative Reinforcement
- Removing an unfavorable consequence to increase a behavior
- Example: lowering insurance rates for safe driving
Positive Punishment
- Adding an unfavorable consequence to decrease a behavior
- Example: Getting a ticket for speeding
Negative Punishment
- Removing a favorable consequence to decrease behavior
- Example: No video games because of rudeness
Giovanni’s dog Luna won’t heel. To teach Luna to heel, Giovanni puts a choke chain and a leash on the dog somewhat tightly and goes for a walk. When Giovanni says “Heel” and Luna walks next to him, Giovanni loosens the choke chain. Now Luna heels much more often than before, due to
- Positive reinforcement
- Negative reinforcement
- Positive punishment
- Negative punishment
Negative reinforcement
Fred racked up a $200 cell phone bill from his texting last month, and his parents are furious. They take away his phone for two weeks to teach him that he must reduce his texting. Fred’s parents are using
- Positive reinforcement
- Negative reinforcement
- Positive punishment
- Negative punishment
Negative punishment
Jill gets mad when her roommate, Brenda, uses her stuff. Lately, when Jill catches Brenda using her stuff, Jill will play very loudly a song that Brenda hates. Now, Brenda is using Jill’s stuff much less, due to the effect of
- Positive reinforcement
- Negative reinforcement
- Positive punishment
- Negative punishment
Positive punishment
Shaping
Gradually guiding an animal toward a target behavior by rewarding a series of successive approximations of that behavior.
Reinforcement Schedules
Behaviors conditioned using partial/intermittent reinforcement resist extinction longer than those conditioned under continuous reinforcement.
Intermittent Schedules: Fixed vs. Variable
- Fixed: Reinforcement after a set amount of time or number of responses
- Variable: Reinforcement after an average amount of time or number of responses; reinforcement is less predictable
Intermittent Schedules: Interval vs. Ratio
- Interval: Based on TIME intervals
- Ratio: Based on the number of behaviors (i.e. ratio of responses to reinforcements)
Fixed Ratio
- Reward after a certain number of behaviors
- Example: earning a bonus for every 5 cars sold
Variable Ratio
- Reward after an average number of behaviors
- Example: Slot machines
Fixed Interval
- Reward after a certain amount of time
- Example: Paycheck every 2 weeks
Variable Interval
- Reward after an average amount of time
- Example: Checking email, since replies arrive after unpredictable amounts of time
Which schedule is best?
- Best = most resistant to extinction
- Intermittent/partial reinforcement resists extinction more than continuous reinforcement
- Example: Broken vending machine vs. broken slot machine
- Variable ratio is most resistant to extinction
Timmy is trying to get a toy that comes in some boxes of breakfast cereal. He keeps opening boxes of cereal, knowing that if he opens enough boxes, he will eventually find a toy. Timmy is being reinforced on a _________ schedule.
- Fixed interval
- Variable interval
- Fixed ratio
- Variable ratio
Variable ratio
Observational Learning
An organism learns from watching others
Classical Conditioning
Based on involuntary (reflexive) responses