Schedules of Reinforcement Flashcards
What is a fixed duration (FD) schedule?
A fixed duration (FD) schedule is where reinforcement is contingent on the continuous performance of a behavior for a fixed period of time. EX: A child receives a snack after practicing piano for 30 minutes (FD 30).
What is a fixed interval (FI) schedule?
A fixed interval (FI) schedule is where reinforcement is provided for the first behavior that occurs after a fixed period of time has elapsed.
EX: A pigeon pecks at a disk and receives reinforcement. After five seconds have elapsed, the next peck is reinforced. Both the five seconds AND a peck must occur before reinforcement is delivered (FI 5).
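The FI card's key point is that time alone is never enough: the interval must elapse AND a response must occur. A minimal sketch in Python, using simulated timestamps (the function and variable names here are illustrative, not from any source):

```python
def make_fi(interval):
    """Sketch of a fixed interval (FI) schedule: reinforce the first
    response occurring at or after `interval` time units since the
    previous reinforcer. Time alone is never enough; a response
    must also occur."""
    state = {"last": 0.0}

    def on_response(t):
        # t = (simulated) time at which the response occurs
        if t - state["last"] >= interval:
            state["last"] = t
            return True   # interval elapsed AND response made: reinforce
        return False      # interval not yet elapsed: no reinforcement
    return on_response

peck = make_fi(5)        # FI 5 schedule
print(peck(2))           # False: only 2 s have elapsed
print(peck(6))           # True: 5 s elapsed and a peck occurred
print(peck(8))           # False: only 2 s since the last reinforcer
```

Notice that no reinforcement is delivered at t = 5 unless a peck happens then; the clock only makes the next response eligible.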
What is a fixed ratio (FR) schedule?
A fixed ratio (FR) schedule is where reinforcement occurs after the behavior has been performed a fixed number of times.
EX: A pigeon receives food after 5 disk pecks (FR 5).
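The FR rule is the simplest of the schedules to express procedurally: count responses and reinforce every nth one. A hedged sketch (names are illustrative):

```python
def make_fr(n):
    """Sketch of a fixed ratio (FR) schedule: reinforce every nth response."""
    count = 0

    def on_response():
        nonlocal count
        count += 1
        if count == n:
            count = 0          # reset the counter after each reinforcer
            return True        # nth response: reinforce
        return False
    return on_response

peck = make_fr(5)                    # FR 5 schedule
print([peck() for _ in range(10)])   # pecks 5 and 10 are reinforced
```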
What is a fixed time (FT) schedule?
A fixed time (FT) schedule is where reinforcement is provided after a set amount of time, regardless of any behavior that is or is not occurring.
EX: A pigeon receives food every 5 seconds whether or not pecking is happening (FT 5).
What is noncontingent reinforcement?
Noncontingent reinforcement is where reinforcers are delivered independently of behavior.
What are the two types of noncontingent reinforcement?
The two types of noncontingent reinforcement are fixed time (FT) and variable time (VT) schedules, because in both, reinforcement is delivered based only on the time elapsed.
What is the partial reinforcement effect (PRE)?
The partial reinforcement effect is the tendency of behavior that has been maintained on an intermittent schedule to be more resistant to extinction than behavior that was on continuous reinforcement.
What are the four hypotheses that explain the partial reinforcement effect?
The frustration hypothesis, the response unit hypothesis, the sequential hypothesis, and the discrimination hypothesis.
What is a variable duration (VD) schedule?
A variable duration (VD) schedule is where behavior must be performed continuously for a varying amount of time that averages out to a specific duration.
EX: A child is reinforced after playing piano for an average of 30 minutes, sometimes 15, 30, or 45 minutes (VD 30).
What is a variable interval (VI) schedule?
A variable interval (VI) schedule is where the first behavior after an interval is reinforced, and the length of that interval varies around an average.
EX: A pigeon is reinforced for the first disk peck after an interval averaging 5 seconds, sometimes as short as 1 second, sometimes as long as 9 (VI 5).
What is a variable ratio (VR) schedule?
A variable ratio (VR) schedule is where a behavior is reinforced after a number of responses that varies around an average.
EX: A pigeon is reinforced after an average of 5 disk pecks, with the requirement varying from, say, 1 to 9 pecks (VR 5).
What is a variable time (VT) schedule?
A variable time (VT) schedule is where reinforcement is delivered at irregular times that average out to a specific number, regardless of behavior.
EX: A pigeon is provided food after an average of 5 seconds has passed (anywhere from 1 to 9 seconds), regardless of the behavior occurring at that time (VT 5).
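All the variable schedules on these cards share one mechanism: the requirement (a response count, an interval, or a duration) is drawn at random around an average. A sketch for a VR schedule, assuming requirements are drawn uniformly around the mean (the helper names and the uniform draw are assumptions for illustration, not a standard procedure):

```python
import random

def make_vr(mean, spread=4, rng=None):
    """Sketch of a variable ratio (VR) schedule: each reinforcer requires
    a number of responses drawn uniformly from [mean - spread, mean + spread],
    so individual requirements vary but average out to `mean`."""
    rng = rng or random.Random()

    def draw():
        return rng.randint(max(1, mean - spread), mean + spread)

    state = {"needed": draw(), "count": 0}

    def on_response():
        state["count"] += 1
        if state["count"] >= state["needed"]:
            state["count"] = 0
            state["needed"] = draw()   # draw the next requirement
            return True                # requirement met: reinforce
        return False
    return on_response

vr5 = make_vr(5, rng=random.Random(0))     # VR 5 schedule, seeded for repeatability
outcomes = [vr5() for _ in range(100)]
print(sum(outcomes))   # reinforcers earned over 100 pecks, roughly 100 / 5
```

A VI sketch would look the same except that the drawn value is an interval of time rather than a response count.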