
Skinner spent five postdoctoral years working in William J. Crozier’s laboratory. Crozier, an experimental biologist, had a major influence on Skinner’s philosophy and behavioristic position. In contrast to psychologists who focused on studying the processes going on inside an organism, Crozier passionately believed in studying the behavior of an organism as a whole. This philosophy paralleled Skinner’s goal of relating an organism’s behavior to experimental conditions.

In 1936, Skinner joined the faculty of the University of Minnesota. His tenure there was remarkably productive: he was heavily engaged in scientific inquiry yet found time to write a novel entitled Walden Two. Skinner stayed at the University of Minnesota for nine years, spent two years at Indiana University as Chair of Psychology, and eventually returned to Harvard, where he remained for the rest of his life.

SKINNER’S DEVELOPMENT OF OPERANT CONDITIONING

Skinner remained consistent in his philosophy that the organism must literally operate upon its environment. This is in total contrast to Pavlovian conditioning, where the organism plays a very passive role. Furthermore, Skinner believed that antecedent events need to be considered when studying an organism’s behavior and that an organism’s behavior can be controlled by systematically manipulating the environment in which the organism is operating. These comprise the foundation of B.F. Skinner’s operant conditioning theory.

As the organism operates in its environment, it encounters a unique type of stimulus that increases the organism’s response. In operant conditioning theory, a stimulus that increases the likelihood of the organism’s response is called a reinforcement or a reinforcer. Hulse et al. (1980) formally defined a reinforcer as a “stimulus event which, if it occurs in the proper temporal relation with a response, tends to maintain or to increase the strength of a response or of a stimulus-response connection” (p. 18). In contrast to a reinforcing stimulus or a reinforcer, an organism operating in its environment can also be exposed to a unique type of stimulus that decreases the organism’s response. A stimulus that decreases the likelihood of the organism’s response is referred to as an aversive stimulus.

It is worth noting that operant conditioning is also called instrumental conditioning because the organism plays an instrumental role in developing the stimulus–response connection. This can best be explained by thinking of an experiment involving a rat in a box. In the box, known as a Skinner box, is a lever that, when depressed, delivers a food pellet into the box. The rat is operating in its environment and accidentally depresses the lever. A food pellet is then delivered into the box. In time, the rat will vigorously depress the lever to get more food pellets. Let us now examine the experiment through the lens of operant conditioning theory. The rat (the organism) is operating in the box (the environment), accidentally depresses the lever (operant response), and receives a food pellet (reinforcing stimulus). The rat then depresses the lever vigorously (increase in response) to receive more food pellets (reinforcing stimulus). This stimulus–response connection is established over time, and this series of stimulus–response connections is considered behavior. One can then ask: what if the reinforcing stimulus (i.e., the food pellet) is no longer delivered? Over time, the rat will stop the lever-pressing response because the reinforcing stimulus is no longer available; the behavior has been extinguished. In operant conditioning theory, this phenomenon is called extinction.
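
The reinforcement-and-extinction cycle just described can be pictured with a toy simulation. The sketch below is purely illustrative and is not drawn from the handbook: the run_trials function, the learning rate, and the decay rate are arbitrary assumptions chosen only to show the press probability rising while presses are reinforced and falling back toward baseline once pellets are withheld.

```python
import random

def run_trials(n_trials, p_reinforce, p_press, learn=0.15, decay=0.05):
    """Toy model of the Skinner box: on each trial the rat presses the lever
    with probability p_press; a press is reinforced (a pellet is delivered)
    with probability p_reinforce. All parameter values are arbitrary."""
    for _ in range(n_trials):
        pressed = random.random() < p_press
        if pressed and random.random() < p_reinforce:
            # A reinforced press strengthens the stimulus-response connection.
            p_press = min(1.0, p_press + learn * (1.0 - p_press))
        else:
            # An unreinforced trial weakens the response slightly.
            p_press = max(0.05, p_press - decay * p_press)
    return p_press

p = 0.10                       # occasional accidental presses at first
p = run_trials(60, 1.0, p)     # acquisition: every press produces a pellet
print(f"after continuous reinforcement: p(press) ~ {p:.2f}")
p = run_trials(60, 0.0, p)     # extinction: pellets are no longer delivered
print(f"after extinction:               p(press) ~ {p:.2f}")
```

Under these assumptions the press probability climbs during acquisition and decays back toward its floor once reinforcement stops, which is the pattern the text labels extinction.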

While engaged in heavy operant conditioning experimentation, Skinner ran low on food pellets, so he had to reduce the number of pellets given to the rats as reinforcement. Interestingly, even though the rats received less reinforcement, the operant behavior continued to be exhibited over a period of time. This led Skinner to the discovery of schedules of reinforcement.
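
To connect this observation to the illustrative sketch above, the same hypothetical run_trials function can be called with a reinforcement probability between zero and one, so that only some presses produce a pellet; under such an intermittent schedule the simulated response is maintained rather than extinguished. The 50 percent figure and trial counts are arbitrary assumptions, not values from Skinner’s experiments.

```python
# Continues the illustrative sketch above (assumes run_trials is defined).
p = run_trials(60, 1.0, 0.10)   # acquisition under continuous reinforcement
p = run_trials(200, 0.5, p)     # intermittent schedule: ~50% of presses reinforced
print(f"under intermittent reinforcement: p(press) ~ {p:.2f}")
```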
