Skinner’s Box Experiment

We receive rewards and punishments for many behaviors. More importantly, once we experience that reward or punishment, we are likely to perform (or not perform) that behavior again in anticipation of the result. 

Psychologists in the late 1800s and early 1900s had a hunch that rewards and punishments were crucial to shaping and encouraging voluntary behavior. But they needed a way to test it, and they needed a name for the process by which rewards and punishments shape voluntary behavior. Along came Burrhus Frederic Skinner, and the rest is history.

Who is B.F. Skinner?

Burrhus Frederic Skinner, also known as B.F. Skinner, is considered the “father of Operant Conditioning.” His experiments, conducted in what is known as “Skinner’s box,” are some of the most well-known experiments in psychology. They helped shape the ideas of operant conditioning in behaviorism.

Law of Effect (Thorndike vs. Skinner) 

At the time, classical conditioning was the dominant theory in behaviorism. But Skinner knew that research had shown that voluntary behaviors could be part of the conditioning process as well. In the late 1800s, a psychologist named Edward Thorndike wrote about “The Law of Effect.” He said, “responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation.”

Thorndike tested the Law of Effect with a “puzzle box” of his own. The box contained a lever that would open it. He placed a cat inside the box and fish outside it, then recorded how long it took the cat to get out of the box and eat the fish.

Thorndike noticed that the cats would explore the box and eventually find the lever. The lever would let them out of the box (and lead them to the fish) faster. Once they discovered this, the cats were more likely to use the lever when they wanted to get the fish.

What Is Skinner’s Box?

Skinner not only used his box experiments to show the existence of operant conditioning; he also showed which schedules of reinforcement made operant conditioning more or less effective, depending on your goals. And that is why he is called the father of operant conditioning.

Inspired by Thorndike, Skinner created a box of his own to test his theory of operant conditioning. (This box is also known as an “operant conditioning chamber.”) Inside, he would place rats or pigeons. But Skinner took his research beyond what Thorndike did: he looked at how reinforcements and schedules of reinforcement would influence behavior.

Reinforcements

Reinforcements are rewards that satisfy your needs. The fish the cats received outside Thorndike’s box was a reinforcement. In Skinner box experiments, pigeons or rats also received food.

Skinner would place the rats in a Skinner box with neutral stimuli (ones that produced neither reinforcement nor punishment) and a lever that would dispense food. As the rats started to explore the box, they would stumble upon the lever, activate it, and get food. Skinner observed that they were likely to engage in this behavior again, anticipating food.
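
If it helps to see the idea in code, here is a minimal sketch of positive reinforcement, assuming a toy learning rule: each rewarded lever press nudges the probability of pressing again upward. The starting probability, learning rate, and update rule are all illustrative assumptions, not Skinner’s actual data.

```python
import random

# Toy model of positive reinforcement (illustrative numbers only):
# each rewarded lever press makes pressing again more likely.

press_probability = 0.05  # assumed starting chance of pressing the lever
learning_rate = 0.2       # assumed strength of each reinforcement

for trial in range(1, 21):
    pressed = random.random() < press_probability
    if pressed:
        # Food is dispensed: nudge the press probability toward 1.
        press_probability += learning_rate * (1.0 - press_probability)
    print(f"trial {trial:2d}: pressed={pressed}, p(press)={press_probability:.2f}")
```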

Negative Reinforcements 

But Skinner also looked at negative reinforcement. In some experiments, he would send an electric current through the box that would shock the rats. If the rats pushed the lever, the shocks stopped. Skinner saw that the rats quickly learned to turn off the shocks by pushing the lever. (Note that negative reinforcement is not punishment: the lever-pressing was strengthened because it removed something unpleasant.)

Schedules of Reinforcement 

We know that not every behavior gets exactly the same reinforcement every single time. Take an Uber driver as an example. You may have a string of passengers who tip generously after you make conversation with them. At this point, you’re likely to make conversation with the next passenger, right? But what happens if they don’t tip you after you have a conversation with them? What happens if you stay silent for one ride and get a big tip?

Psychologists weren’t just asking how quickly someone turns a behavior into a habit after receiving reinforcement. (That is, how many tips will it take before you make conversation with passengers every time?) They also wanted to know how fast you would stop making conversation with passengers if you stopped getting tips.

Skinner attempted to answer these questions by looking at different schedules of reinforcement. He would offer positive reinforcement on different schedules, like offering it every single time the behavior was performed (continuous reinforcement) or at random (variable-ratio reinforcement). In each experiment, he measured the following:

  • Response rate (how quickly the behavior was performed)

  • Extinction rate (how quickly the behavior stopped once reinforcement was withdrawn)

Here’s what he found. 

Continuous reinforcement: If you reinforce a behavior every single time, the response rate is medium and the extinction rate is fast. The behavior is performed only as long as the reinforcement keeps coming; as soon as you stop reinforcing it on this schedule, the behavior stops.

Fixed-ratio reinforcement: Let’s say you reinforce the behavior every fourth or fifth time. The response rate is fast and the extinction rate is medium. The behavior will be performed quickly to reach the reinforcement. 

Fixed-interval reinforcement: In the above cases, the reinforcement was given immediately after the behavior was performed. But what if the reinforcement was given at a fixed interval, provided that the behavior was performed at some point within that interval? Skinner found that the response rate is medium and the extinction rate is medium. 

Variable-ratio reinforcement: Here’s where things start to get random. The behavior is reinforced after an unpredictable number of responses. (Gambling is a good example of this: a slot machine pays out after a random number of pulls.) The response rate is fast but the extinction rate is slow. No wonder gambling is so addictive!

Variable-interval reinforcement: Last but not least, let’s say the reinforcement is given out at random intervals, provided that the behavior is performed. It could be five minutes after or seven minutes after. Skinner found that the response rate for this schedule is fast and the extinction rate is slow. 
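
If you like to see these schedules side by side, here is a minimal simulation sketch. Everything in it is an illustrative assumption (one response per second, a reward every 4th response or roughly every 5 seconds), not Skinner’s actual procedure; it only shows how often each schedule hands out reinforcement.

```python
import math
import random

# Toy versions of the five schedules. Each function answers:
# "is this response reinforced?" given the responses made and the
# seconds elapsed since the last reinforcement. All parameters are
# illustrative assumptions.

def continuous(n_since_reward, t_since_reward):
    return True                              # every response is reinforced

def fixed_ratio(n_since_reward, t_since_reward, ratio=4):
    return n_since_reward >= ratio           # every 4th response

def variable_ratio(n_since_reward, t_since_reward, mean_ratio=4):
    return random.random() < 1 / mean_ratio  # on average every 4th, at random

def fixed_interval(n_since_reward, t_since_reward, interval=5.0):
    return t_since_reward >= interval        # first response after 5 seconds

def variable_interval(n_since_reward, t_since_reward, mean_interval=5.0):
    # reward becomes available after a random delay averaging ~5 seconds
    return random.random() < 1 - math.exp(-t_since_reward / mean_interval)

def simulate(schedule, n_responses=200, gap_seconds=1.0):
    """Count reinforcements earned over a run of evenly spaced responses."""
    n_since, t_since, rewards = 0, 0.0, 0
    for _ in range(n_responses):
        n_since += 1
        t_since += gap_seconds
        if schedule(n_since, t_since):
            rewards += 1
            n_since, t_since = 0, 0.0
    return rewards

for s in (continuous, fixed_ratio, variable_ratio, fixed_interval, variable_interval):
    print(f"{s.__name__:17s} -> {simulate(s):3d} reinforcements per 200 responses")
```

Notice that the fixed and variable versions pay out about equally often; what differs is predictability, which is why the variable schedules are the slowest to extinguish.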

Examples of Operant Conditioning in Everyday Life

If you put your hand on a hot stove, you’re going to get burned. More importantly, you are very unlikely to put your hand on that hot stove again. 

If you make conversation with a passenger while driving for Uber, you might get an extra tip at the end of your ride. More importantly, you are very likely to keep making conversation with passengers as you drive for Uber.

If your dog sits when you say “sit,” you might give him a treat. More importantly, he is very likely to sit the next time you say “sit.”

These examples can all be explained by operant conditioning, a theory developed thanks to Skinner’s box.

Theodore T.

Theodore is a professional psychology educator with over 10 years of experience creating educational content on the internet. PracticalPsychology started as a helpful collection of psychological articles to help other students, and has since expanded to a YouTube channel with over 2,000,000 subscribers and a website with 500+ posts.