Introduction to B.F. Skinner
B.F. Skinner is responsible for expanding the field of behaviorism after the early work of E.L. Thorndike and his law of effect. Skinner divided behaviorism into respondent conditioning and operant conditioning, defining the latter as the process by which the consequence of a behavior controls the future occurrence of that same behavior. He believed all behavior could be explained by an action performed and the valence of its consequence. Skinner's most famous studies were simple reinforcement experiments conducted on lab rats and domestic pigeons, which demonstrated the most basic principles of operant conditioning. His work remains extremely influential in psychology, behaviorism, and education.
Skinner conducted most of his research in a special operant conditioning chamber, now referred to as a Skinner box, outfitted with a cumulative recorder that was used to analyze behavioral responses from his test subjects. In his first work with rats, he discovered that the rate of response and changes in response features depended on what occurred after the behavior was performed, not before. Skinner named these actions operant behaviors because they operated on the environment to produce an outcome. The process of arranging the contingencies of reinforcement responsible for producing a certain behavior then came to be called operant conditioning.
Rats and Pigeons
Skinner completed his early research on rats and his later research on pigeons, as he found the latter to provide more extensive and more rapid feedback. While working with rats, he would place them in a Skinner box with a lever attached to a feeding tube. Whenever the rat pressed the lever, food was released. After multiple trials, rats learned the association between the lever and food and began to spend more of their time in the box procuring food than performing any other action. It was through this early work that Skinner began to understand how behavior is contingent on its consequences.
After a short assignment working with pigeons for the military during WWII, Skinner shifted his research from rats to these birds. He found that pigeons responded to stimuli more quickly, which allowed his research to progress faster. It was not until he made this transition that his belief in behaviorism as the explanation of all action was solidified.
To demonstrate this idea, he created a superstitious pigeon. He fed the pigeon at fixed intervals (every 15 seconds), regardless of its behavior, and observed what it did. He found that the pigeon's actions changed depending on what it had been doing in the moments before the food was dispensed, even though those actions had nothing to do with the dispensing of food. In this way, he discerned that the pigeon had fabricated a causal relationship between its actions and the presentation of the reward. It was this development of superstition that led Skinner to believe all behavior could be explained as a learned reaction to specific consequences.
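The accidental-reinforcement loop behind the superstition experiment can be sketched in a short simulation. Everything here is an illustrative assumption, not data from Skinner's study: the action names, the weight-update rule, and the time scale are invented for demonstration. Food arrives on a fixed schedule no matter what the bird does, yet whatever action happened to precede each feeding becomes more likely.

```python
import random

random.seed(0)

ACTIONS = ["peck", "turn", "hop", "bow"]  # hypothetical action repertoire

def superstition_trial(seconds=120, feed_every=15):
    """Deliver food every `feed_every` seconds, independent of behavior,
    and bias the bird toward whatever action immediately preceded food."""
    weights = {a: 1.0 for a in ACTIONS}
    last_action = None
    for t in range(1, seconds + 1):
        acts, ws = zip(*weights.items())
        last_action = random.choices(acts, weights=ws)[0]
        if t % feed_every == 0:           # food arrives regardless of behavior
            weights[last_action] += 1.0   # accidental reinforcement
    # the action the bird ends up "superstitiously" favoring this run
    return max(weights, key=weights.get)

print(superstition_trial())
```

Because reinforcement is decoupled from behavior, different runs favor different actions, mirroring how each of Skinner's pigeons developed its own idiosyncratic ritual.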
In order to study and manipulate his test subjects, Skinner used specific principles and schedules of reinforcement. Positive reinforcement was the presentation of a desired reward after the performance of a desired behavior. Negative reinforcement was the removal of an undesirable event after the subject performed the desired behavior. Each encouraged behavior in a different but highly effective fashion. Punishment was the opposite of reinforcement: when the subject performed an undesired behavior, it was administered a negative or aversive consequence. Although negative reinforcement and punishment seem similar, punishment decreases behavior whereas negative reinforcement increases it.
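These contingencies form a two-by-two table: a stimulus is either presented or removed, and is either desirable or aversive. The sketch below encodes that table using the standard four-cell terminology from later behaviorist writing (the function name and labels are my own, not Skinner's notation); it makes the negative-reinforcement-versus-punishment distinction explicit.

```python
def classify(stimulus, operation):
    """Classify an operant contingency.

    stimulus:  'appetitive' (desirable) or 'aversive' (undesirable)
    operation: 'present' (added after behavior) or 'remove' (taken away)
    Returns the contingency name and its effect on future behavior.
    """
    table = {
        ('appetitive', 'present'): ('positive reinforcement', 'increases behavior'),
        ('aversive',   'remove'):  ('negative reinforcement', 'increases behavior'),
        ('aversive',   'present'): ('positive punishment',    'decreases behavior'),
        ('appetitive', 'remove'):  ('negative punishment',    'decreases behavior'),
    }
    return table[(stimulus, operation)]

print(classify('aversive', 'remove'))
# ('negative reinforcement', 'increases behavior')
```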
Skinner also used various types of reinforcement schedules to test different theories of operant conditioning. There were four main reinforcement schedules: fixed ratio, fixed interval, variable ratio, and variable interval. A fixed ratio schedule delivered reinforcement after a set number of responses. A fixed interval schedule delivered reinforcement for the first response after a set amount of time had passed. A variable ratio schedule delivered reinforcement after a varying number of responses. A variable interval schedule delivered reinforcement for the first response after a varying amount of time had passed. Based on this research, Skinner found that variable schedules (both ratio and interval) were the most effective in maintaining behavior.
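The four schedules can be sketched as small decision functions: ratio schedules count responses, while interval schedules watch a clock and reinforce the first response after the delay elapses. The names and parameters below are illustrative assumptions, not Skinner's notation.

```python
import random

random.seed(1)

def fixed_ratio(n):
    """Reinforce every n-th response (FR-n)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count >= n:
            count = 0
            return True           # reinforcement delivered
        return False
    return respond

def variable_ratio(mean_n):
    """Reinforce after a varying number of responses averaging mean_n (VR)."""
    count, target = 0, random.randint(1, 2 * mean_n - 1)
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:
            count, target = 0, random.randint(1, 2 * mean_n - 1)
            return True
        return False
    return respond

def fixed_interval(period):
    """Reinforce the first response after `period` time units elapse (FI)."""
    ready_at = period
    def respond(now):
        nonlocal ready_at
        if now >= ready_at:
            ready_at = now + period
            return True
        return False
    return respond

def variable_interval(mean_period):
    """Reinforce the first response after a varying delay averaging mean_period (VI)."""
    ready_at = random.uniform(0, 2 * mean_period)
    def respond(now):
        nonlocal ready_at
        if now >= ready_at:
            ready_at = now + random.uniform(0, 2 * mean_period)
            return True
        return False
    return respond

fr5 = fixed_ratio(5)
print(sum(fr5() for _ in range(20)))  # 20 responses on FR-5 -> 4 reinforcements
```

Note the design difference the prose describes: under ratio schedules the subject's own response rate determines how fast reward arrives, whereas under interval schedules responding faster than the clock earns nothing extra, which is why the two families produce different response patterns.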
Skinner believed that all behavior is predetermined by past and present events in the objective world. He left no room in his research for ideas such as free will or individual choice. Instead, he posited that all behavior could be explained by learned, physical aspects of the world, including life history and evolution.