Reinforcement and Schedules of Reinforcement Flashcards
The three main components of Operant Conditioning ABC stand for
A=antecedent cues
B=behaviour
C=consequences
In operant conditioning, the consequences of a behaviour influence two things:
1) the frequency of the behaviour in the future
2) the ability of future antecedent cues to set the occasion for the behaviour
What are the prime movers of operant conditioning?
the consequences
What are reinforcers?
consequences of the behaviour
To cause a behaviour (B) to be repeated in the future, a reinforcer must be?
immediately contingent (dependent) on the execution of the behaviour
What is positive reinforcement?
an increase in the future frequency of a behaviour due to the addition of a new non-aversive (pleasant) event or stimulus e.g. a child getting dessert after eating veggies
What is negative reinforcement?
an increase in the future frequency of a behaviour when the consequence is the removal of an aversive (unpleasant) stimulus e.g. taking a Panadol to get rid of a headache
What is a reinforcement schedule?
A rule that states under what conditions a reinforcer will be delivered
What is the cumulative recorder?
A classic device that provides an easily read, graphic depiction of changes in the organism's rate of response over time - a roll of paper that unravels at a slow, constant pace, and a moveable pen that makes tracks across it
What does a steeper line on the cumulative record indicate?
a higher rate of response
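Since the slope of the cumulative record is just responses per unit time, the point can be illustrated with a quick hypothetical calculation (the timestamps below are made up for the example, not from any actual recording):

```python
# Hypothetical response timestamps (seconds) for two subjects.
fast = [1, 2, 3, 4, 5]     # 5 responses in 5 s
slow = [3, 6, 9, 12, 15]   # 5 responses in 15 s

def slope(timestamps):
    """Average slope of the cumulative record: responses per second."""
    return len(timestamps) / timestamps[-1]

print(slope(fast))   # 1.0 responses/s -> steeper cumulative record
print(slope(slow))   # ~0.33 responses/s -> shallower record
```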
What is a fixed ratio schedule?
the number of responses required for each reinforcer is constant - responding exhibits a 'stop-and-go' pattern (a high rate of responding along with a short pause following the attainment of each reinforcer)
What is a variable ratio schedule?
the number of required responses is not constant from reinforcer to reinforcer - on average, a subject will receive one reinforcer for every n responses, but the exact number of responses required at any moment may vary
What is a fixed interval schedule?
the first response after a fixed amount of time has elapsed is reinforced - after a fixed period of time has elapsed a reinforcer is "stored", and the next response will produce the reinforcer - produces a post-reinforcement pause e.g. not looking for the bus right after the previous one has left
What is a variable interval schedule?
the amount of time that must pass before a reinforcer is stored varies unpredictably from reinforcer to reinforcer - typically produces a steady, moderate response rate with little or no post-reinforcement pause
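Each of the four schedules above is just a rule for when a response earns a reinforcer, so they can be sketched directly. A minimal Python sketch of those rules (the function names, parameters, and numbers are illustrative, not from any textbook):

```python
import random

def fixed_ratio(n):
    """FR-n: reinforce every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def variable_ratio(n):
    """VR-n: each response pays off with probability 1/n, so on
    average one reinforcer per n responses, but the exact number varies."""
    def respond():
        return random.random() < 1 / n
    return respond

def fixed_interval(t):
    """FI-t: the first response after t seconds have elapsed is
    reinforced; the clock restarts after each reinforcer."""
    stored_at = t                    # time the next reinforcer is "stored"
    def respond(now):
        nonlocal stored_at
        if now >= stored_at:
            stored_at = now + t
            return True
        return False
    return respond

def variable_interval(t):
    """VI-t: like FI, but the required wait varies unpredictably
    around a mean of t seconds."""
    stored_at = random.uniform(0, 2 * t)
    def respond(now):
        nonlocal stored_at
        if now >= stored_at:
            stored_at = now + random.uniform(0, 2 * t)
            return True
        return False
    return respond

# FR-3: every third response produces the reinforcer
fr3 = fixed_ratio(3)
print([fr3() for _ in range(6)])   # [False, False, True, False, False, True]
```

Note how the ratio schedules count responses while the interval schedules watch the clock, which is exactly why only the interval schedules produce the "stored" reinforcer described in the cards above.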
Is extinction more rapid after continuous reinforcement or after a schedule of intermittent/partial reinforcement?
continuous reinforcement