
Operant Conditioning

B. F. Skinner and the Search for "Order in Behavior"

From the time he was a graduate student in psychology until his death, the famous American psychologist B. F. Skinner searched for the "lawful processes" that would explain "order in behavior" (Skinner, 1956, 1967). Skinner was a staunch behaviorist. Like John Watson, Skinner strongly believed that psychology should restrict itself to studying only phenomena that could be objectively measured and verified—outwardly observable behavior and environmental events.

Skinner (1974) acknowledged the existence of what he called "internal factors," such as thoughts, expectations, and perceptions (Moore, 2005a). However, Skinner believed that internal thoughts, beliefs, emotions, or motives could not be used to explain behavior. These fell into the category of "private events" that defy direct scientific observation and should not be included in an objective, scientific explanation of behavior (Baum & Heath, 1992).

Along with being influenced by Watson's writings, Skinner greatly admired Ivan Pavlov's work. Prominently displayed in Skinner's university office was one of his most prized possessions—an autographed photo of Pavlov (Catania & Laties, 1999). Skinner acknowledged that Pavlov's classical conditioning could explain the learned association of stimuli in certain reflexive responses (Iversen, 1992). But classical conditioning was limited to existing behaviors that were reflexively elicited. Skinner (1979) was convinced that he had "found a process of conditioning that was different from Pavlov's and much more like most learning in daily life." To Skinner, the most important form of learning was demonstrated by new behaviors that were actively emitted by the organism, such as the active behaviors produced by Thorndike's cats in trying to escape the puzzle boxes.

Skinner (1953) coined the term operant to describe any "active behavior that operates upon the environment to generate consequences." In everyday language, Skinner's principles of operant conditioning explain how we acquire the wide range of voluntary behaviors that we perform in daily life. But as a behaviorist who rejected mentalistic explanations, Skinner avoided the term voluntary because it would imply that behavior was due to a conscious choice or intention.

Skinner defined operant conditioning concepts in very objective terms and he avoided explanations based on subjective mental states (Moore, 2005b). We'll closely follow Skinner's original terminology and definitions.

Burrhus Frederick Skinner (1904–1990)
As a young adult, Skinner had hoped to become a writer. When he graduated from college, he set up a study in the attic of his parents' home and waited for inspiration to strike. After a year of "frittering" away his time, he decided that there were better ways to learn about human nature (Moore, 2005a). As Skinner (1967) later wrote, "A writer might portray human behavior accurately, but he did not understand it. I was to remain interested in human behavior, but the literary method had failed me; I would turn to the scientific. . . . The relevant science appeared to be psychology, though I had only the vaguest idea of what that meant."

Reinforcement: Increasing Future Behavior

In a nutshell, Skinner's operant conditioning explains learning as a process in which behavior is shaped and maintained by its consequences. One possible consequence of a behavior is reinforcement. Reinforcement is said to occur when a stimulus or an event follows an operant and increases the likelihood of the operant being repeated. Notice that reinforcement is defined by the effect it produces—increasing or strengthening the occurrence of a behavior in the future.

Let's look at reinforcement in action. Suppose you put your money into a soft-drink vending machine and push the button. Nothing happens. You push the button again. Nothing. You try the coin-return lever. Still nothing. Frustrated, you slam the machine with your hand. Yes! Your can of soda rolls down the chute. In the future, if another vending machine swallows your money without giving you what you want, what are you likely to do? Hit the machine, right?

In this example, slamming the vending machine with your hand is the operant—the active response you emitted. The soft drink is the reinforcing stimulus, or reinforcer—the stimulus or event that is sought in a particular situation. In everyday language, a reinforcing stimulus is typically something desirable, satisfying, or pleasant. Skinner, of course, avoided such terms because they reflected subjective emotional states.

Key Terms

law of effect: Learning principle, proposed by Thorndike, that responses followed by a satisfying effect become strengthened and are more likely to recur in a particular situation, while responses followed by a dissatisfying effect are weakened and less likely to recur in a particular situation.

operant: Skinner's term for an actively emitted (or voluntary) behavior that operates on the environment to produce consequences.

operant conditioning: The basic learning process that involves changing the probability that a response will be repeated by manipulating the consequences of that response.

reinforcement: The occurrence of a stimulus or event following a response that increases the likelihood of that response being repeated.
