Chapter 6 Flashcards
what are the two major types of non-associative learning
habituation, sensitization
what is habituation
getting used to something - a decrease in response to a repeated stimulus, like a constant noise you don't notice after a while
what is sensitization
when our behavioural response to a stimulus increases
what are the two types of associative learning
classical conditioning and operant conditioning
what is classical conditioning
learning that a stimulus in an environment predicts a different stimulus
what is an example of classical conditioning
Pavlov and his dogs - a metronome sound repeatedly paired with food eventually made the dogs salivate to the sound alone
contiguity
two events occurring close together in time, one right after the other
contingency
one event reliably predicts the other - the second thing happens only if the first thing happens first
stimulus generalization
once an association has been learned, a stimulus similar to the original (like a sound similar to the metronome) also signals food
what is stimulus discrimination
opposite of stimulus generalization - where animals stop responding to a stimulus if it is not exactly the one they learned to respond to
how is operant conditioning different from classical conditioning
no stimulus is needed to signal another stimulus - instead, the animal's own behaviour produces the consequence
what is happening in operant conditioning
the animal has to do something to make something happen; it can be rewarded or punished for performing a specific behaviour
what does operant conditioning change
the frequency of the behaviour
who is associated with operant conditioning
B.F. Skinner
what is needed for positive reinforcement
add pleasant stimulus to increase/maintain behaviour
what is needed for positive punishment
add aversive stimulus to decrease behaviour
what is needed for negative reinforcement
remove aversive stimulus to increase/maintain behaviour
what is needed for negative punishment
remove pleasant stimulus to decrease behaviour
what is continuous reinforcement
reinforce a behaviour each time it occurs
partial reinforcement
intermittent reinforcement of behaviour
what is a ratio schedule
based on the number of times the behaviour occurs
what is an interval schedule
based on a specific unit of time
what is fixed interval schedule
when reinforcement is provided after a fixed, regular amount of time has passed - like a paycheque every two weeks
what is a variable interval schedule
when reinforcement is provided after a passage of time, but the amount of time is not regular - like pop quizzes, which keep students prepared at all times
what is a fixed ratio schedule
when reinforcement is provided after a certain number of responses have been made - like punch cards: buy 10, get one free
what is a variable ratio schedule
when reinforcement is provided after an unpredictable number of responses - like gambling at a casino