Operant conditioning is a theory of behaviorism that focuses on changes in an individual's observable behaviors. In operant conditioning, new or continued behaviors are shaped by new or continued consequences. Research on this principle of learning was first conducted by Edward L. Thorndike in the late 1800s and later popularized by B. F. Skinner in the mid-1900s. Much of this research informs current practices in understanding human behavior and interaction.
Skinner's Theories of Operant Conditioning
Almost half a century after Thorndike's first publication of the principles of operant conditioning and the law of effect, Skinner attempted to prove an extension to this theory: that all behaviors are in some way a result of operant conditioning. Skinner theorized that if a behavior is followed by reinforcement, that behavior is more likely to be repeated, but if it is followed by some sort of aversive stimuli or punishment, it is less likely to be repeated. He also believed that this learned association could end, or become extinct, if the reinforcement or punishment was removed.
Skinner's most famous research studies were simple reinforcement experiments conducted on lab rats and domestic pigeons, which demonstrated the most basic principles of operant conditioning. He conducted most of his research in a special chamber, now referred to as a "Skinner box," equipped with a cumulative recorder that was used to analyze the behavioral responses of his test subjects. In these boxes he would present his subjects with positive reinforcement, negative reinforcement, or aversive stimuli in various timing intervals (or "schedules") that were designed to produce or inhibit specific target behaviors.
In his early work with rats, Skinner placed each rat in a Skinner box with a lever attached to a feeding tube. Whenever a rat pressed the lever, food would be released. After multiple trials, the rats learned the association between the lever and food and began to spend more of their time in the box procuring food than performing any other action. It was through this early work that Skinner started to understand the effects of behavioral contingencies on actions. He discovered that the rate of response, as well as changes in response features, depended on what occurred after the behavior was performed, not before. Skinner named these actions operant behaviors because they operated on the environment to produce an outcome. The process of arranging the contingencies of reinforcement responsible for producing a certain behavior came to be called operant conditioning.
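The lever-press contingency can be sketched as a small simulation. This is an illustrative model, not Skinner's actual procedure or data; the behavior names, trial count, and reinforcement boost are invented. The key idea it shows is that the consequence follows the response and strengthens it, so the reinforced behavior comes to dominate.

```python
import random

def simulate(trials=200, boost=0.5, seed=42):
    rng = random.Random(seed)
    # Equal initial tendency toward each behavior.
    weights = {"press_lever": 1.0, "wander": 1.0}
    presses = 0
    for _ in range(trials):
        actions, w = zip(*weights.items())
        action = rng.choices(actions, weights=w)[0]
        if action == "press_lever":
            presses += 1
            # Food follows the press: reinforcement after the behavior
            # (not before it) strengthens the operant.
            weights["press_lever"] += boost
    return presses, weights

presses, weights = simulate()
print(presses, weights)
```

Because only the lever press is ever reinforced, its weight ratchets upward while the alternative stays flat, mirroring how the rats came to spend most of their time at the lever.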
To test his claim that operant conditioning could account for all behavior, he later created a "superstitious pigeon." He fed the pigeon at fixed intervals (every 15 seconds) and observed its behavior. He found that the pigeon's actions would change depending on what it had been doing in the moments before the food was dispensed, even though those actions had nothing to do with the delivery of food. In this way, he discerned that the pigeon had formed an illusory causal association between its actions and the presentation of the reward. It was this development of "superstition" that led Skinner to believe all behavior could be explained as a learned reaction to specific consequences.
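The superstition effect can also be sketched in code. Again, this is a hedged illustration with invented behavior names and parameters: food arrives on a fixed schedule regardless of what the simulated pigeon does, yet whatever behavior happened to precede each delivery is strengthened, so an arbitrary "favorite" emerges.

```python
import random

def superstition(seconds=3000, interval=15, seed=1):
    rng = random.Random(seed)
    behaviors = ["turn", "peck_corner", "head_bob", "hop"]
    weights = {b: 1.0 for b in behaviors}
    for t in range(1, seconds + 1):
        bs, w = zip(*weights.items())
        current = rng.choices(bs, weights=w)[0]
        if t % interval == 0:
            # Food is delivered on schedule; the behavior that happened
            # to precede it is adventitiously reinforced, even though it
            # caused nothing.
            weights[current] += 1.0
    return weights

w = superstition()
print(max(w, key=w.get), w)
```

Each accidental reinforcement makes the preceding behavior more likely to recur just before the next delivery, a positive-feedback loop that concentrates weight on one arbitrary action, which is the "superstition" Skinner observed.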
In his operant conditioning experiments, Skinner often used an approach called shaping. Instead of rewarding only the target, or desired, behavior, the process of shaping involves the reinforcement of successive approximations of the target behavior. Behavioral approximations are behaviors that, over time, grow increasingly closer to the actual desired response.
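Shaping can be illustrated with a similar sketch. The target value, variation range, and trial count below are invented for illustration; the point is the mechanism: any response closer to the target than the current norm is reinforced and becomes the new norm, so behavior converges on the goal through successive approximations.

```python
import random

def shape(target=10.0, trials=500, seed=7):
    rng = random.Random(seed)
    tendency = 0.0  # the response the simulated animal currently favors
    for _ in range(trials):
        # Behavior varies randomly around the current tendency.
        behavior = tendency + rng.uniform(-1.0, 1.0)
        if abs(target - behavior) < abs(target - tendency):
            # A closer approximation is reinforced and becomes the norm;
            # anything further from the target goes unrewarded.
            tendency = behavior
    return tendency

final = shape()
print(final)
```

Because only improvements are reinforced, the favored response never drifts away from the target, so the criterion tightens over time, much as a trainer gradually raises the bar for reward.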
Skinner believed that all behavior is predetermined by past and present events in the objective world. He did not include room in his research for ideas such as free will or individual choice; instead, he posited that all behavior could be explained using learned, physical aspects of the world, including life history and evolution. His work remains extremely influential in the fields of psychology, behaviorism, and education.
Source: Boundless. “Basic Principles of Operant Conditioning: Skinner.” Boundless Psychology. Boundless, 27 Aug. 2015. Retrieved 29 Aug. 2015 from https://www.boundless.com/psychology/textbooks/boundless-psychology-textbook/learning-7/operant-conditioning-47/basic-principles-of-operant-conditioning-skinner-197-12732/