Principles of Operant Conditioning Flashcards
While classical conditioning is useful for understanding how certain stimuli ____ ____ ____ and other ____ ____ ____, operant conditioning identifies the factors responsible for the ____ and ____ of ____ ____ ____. For example, operant conditioning explains how a child learns to ride a bicycle and why a student is willing to spend many hours studying for an important exam.
Automatically Evoke Reflexes; Relatively Simple Responses; Acquisition and Maintenance of Complex Voluntary Behaviors
The basic principles of operant conditioning were first described by ________ and were subsequently expanded upon by ________.
Edward Thorndike; B.F. Skinner
____ believed that the study of learning in lower animals would reveal important information about human learning; his best known studies involved ____ ____ ____ in “____ ____” that required them to make a ____ ____ (e.g., pulling a loop of string) to escape from the box and obtain food.
Thorndike; Placing Hungry Cats in “Puzzle Boxes”; Particular Response
Thorndike noticed that during early trials, the cats engaged in ____ ____ ____ before making the response that ____ ____. However, as the number of trials increased, the cats made the correct response ____ and ____ after being placed in the box.
Numerous Unproductive Activities; Permitted Escape; Sooner and Sooner
Thorndike was particularly interested in the ____, ____-and-____ nature of the cats’ learning, and he noted that the animals did not display behavior suggesting that they suddenly “____” the ____. He concluded that learning is not due to mental events, or thinking about a problem, but instead to ____, or the ____ that ____ ____ ____ and ____ as the ____ of ____-and-____.
Slow, Trial-and-Error; “Understood” the Problem; Connectionism; Connections that Develop Between Responses and Stimuli as the Result of Trial-and-Error
Because the behaviors he studied were instrumental in helping the animals achieve a goal, Thorndike referred to this phenomenon as _________.
instrumental learning
Thorndike developed several basic laws of learning, the most important of which is the ____ of ____. According to the original version of this law, any response that is followed by “a ____ ____ of ____” is likely to be ____, while any act that results in an “____ ____ of ____” is less likely to recur.
Law of Effect; “A Satisfying State of Affairs”; Repeated; “Annoying State of Affairs”
Thorndike later eliminated the second part of his law of effect based on subsequent research which suggested that while positive consequences increase behavior, negative ones often have ____ or ____ ____.
Little or No Effect.
____ considered Pavlov’s model of classical conditioning adequate for explaining the acquisition of respondent behaviors that are automatically elicited by certain stimuli. However, he believed that most ____ ____ are voluntarily emitted or not emitted as the result of the way they “____” on the environment (i.e., as the result of the consequences that follow them), and he referred to this type of learning as ____ ____.
Skinner; Complex Behaviors; “Operate”; Operant Conditioning
____ and ____: According to Skinner, the environment provides organisms with a variety of ____ and ____ ____ that cause them to either ____ or ____ the ____ that preceded them. Skinner referred to the consequences as ____ and ____, and he distinguished between ____ and ____ ____ and ____ and ____ ____.
Reinforcement and Punishment; Positive and Negative Consequences; Display or Withhold the Behaviors; Reinforcement and Punishment; Positive and Negative Reinforcement; Positive and Negative Punishment
The terms “positive” and “negative” (as used by Skinner) are not synonymous with good and bad or pleasant and unpleasant. Instead, ____ refers to the application of a stimulus, while ____ means withholding or removing a stimulus.
Positive; Negative
By definition, ____ increases the behavior it follows. With ____ ____, performance of a behavior increases as the result of the application of a stimulus (reinforcer) following the behavior.
Reinforcement; Positive Reinforcement
In Thorndike’s experiment, a cat’s “pulling-on-the-string” behavior increased because it led to the attainment of food. With __________, a behavior increases as the result of the withdrawal or termination of a stimulus (reinforcer) following the behavior. When pressing a lever stops an electric shock, lever-pressing increases because it is being negatively reinforced.
Negative Reinforcement
____ can also be either positive or negative but, unlike reinforcement, punishment decreases the behavior it follows.
Punishment
__________ occurs when the application of a stimulus following a response decreases that response. Slapping a dog with a rolled-up newspaper after it chews your favorite shoes, in order to stop the dog’s chewing behavior, is an example of positive punishment.
Positive Punishment
__________ occurs when removal or termination of a stimulus following a behavior decreases that behavior. Taking away a child’s allowance whenever they act aggressively toward their younger siblings to decrease their aggression is an example of negative punishment.
Negative Punishment
For the sake of convenience, Skinner usually looked at such behaviors as bar-pressing and key-pecking in rats and pigeons within the confines of a box-like “____ ____,” which was sometimes referred to as a “____ ____.” In a typical positive reinforcement experiment, water or food was delivered into the box via a ____ ____ whenever the animal pressed the bar or pecked the key.
Operant Chamber; Skinner Box; Delivery Tube
Skinner evaluated the effectiveness of operant conditioning by measuring ____ ____, which entailed determining a) the ____ of ____ during acquisition trials and/or b) the total number of ____ made during ____ ____ (the period when no reinforcement is provided). During his experiments, information on operant strength was recorded using a ____ ____, which provides a graphic representation of the total number of responses made over time.
Operant Strength; Rate of Responding; Responses; Extinction Trials; Cumulative Recorder
____ always involves an increase in behavior while ____ always involves a decrease in behavior and positive means “____” while negative means “____.” If you encounter a vignette-type question on the exam that requires you to identify whether the situation described is an example of positive or negative reinforcement or punishment, first identify the target behavior and determine if that behavior is ____ or ____ ____ to ____ - which will indicate if the behavior is being ____ or ____. Then determine if the ____ following the behavior is being ____ or ____ - which will indicate if the reinforcement or punishment is ____ or ____.
Reinforcement; Punishment; Apply; Withdraw; More or Less Likely to Occur; Reinforced or Punished; Stimulus; Applied or Withdrawn; Positive or Negative
____ ____ occurs when reinforcement is consistently withheld from a previously reinforced behavior to decrease or eliminate that behavior.
Operant Extinction
Experiments on operant extinction have demonstrated that withdrawal of a reinforcer does not usually cause an ____ ____ of the ____. Instead, the response disappears ____ after an initial phase in which responding is more ____ and ____.
Immediate Cessation of the Response; Gradually; Variable and Forceful
If a rat has been reinforced for bar-pressing, sudden withdrawal of reinforcement will initially cause the rat to bar-press more than usual before bar-pressing begins to decline. This temporary increase in responding during extinction trials is called an ____ (____) ____.
Extinction (Response) Burst