Operant/Instrumental Conditioning (Skinner) Flashcards
Operant conditioning
Refers to a kind of learning in which the consequences that follow some behavior increase or decrease the likelihood of that behavior's occurrence in the future.
Operant
An instrumental response (e.g., a rat pressing a lever).
Reinforcer
Something that increases the likelihood of the behavior.
Positive reinforcement
If desired behavior occurs, add something pleasant.
Negative reinforcement
If desired behavior occurs, take away something unpleasant.
Punisher
Something that decreases the likelihood of the behavior.
Learned helplessness
Occurs when a subject believes that unpleasant or painful stimuli are inevitable and gives up trying to change the circumstances.
Shaping
Reinforcing successive steps to reach a desired behavior.
Chaining
Reinforcing a series of behaviors to get a reward.
Extinction
Occurs if behavioral response is no longer reinforced.
Schedule of reinforcement
Pattern of reinforcing behavioral responses. There are two main types.
Continuous reinforcement
Reinforcement after every correct response.
Partial reinforcement
Reinforcement after only some correct responses. There are four main types.
Fixed
In conditioning, a schedule in which the reinforcer occurs after a set number of responses or a set amount of time.
Variable
In conditioning, a schedule in which the reinforcer occurs after a number of responses or an amount of time that varies around an average.
Fixed ratio schedules
Reinforcement is given after a fixed number of correct responses (high rate of responding).
A fixed ratio schedule is often used to pay assembly line workers because it results in fast rates of work: the more pieces of work completed, the more a worker is paid (reinforcement).
Variable ratio schedules
Reinforcement is given after an average number of correct responses (very high rate of responding).
The variable ratio schedule produces a high rate of responding because the subject (a pigeon or a slot-machine player) doesn't know which response will finally produce a payoff.
Fixed interval schedule
Means that a reinforcer occurs following the first correct response after a fixed interval of time has passed.
A fixed interval schedule results in slow responding at first, but as the time for the reinforcer draws near, the response rate greatly accelerates.
Variable interval schedule
Means that a reinforcer occurs following the first correct response after an average amount of time has passed.
A variable interval schedule results in a more regular rate of responding than does a fixed interval schedule.