Steven Pinker -- How the Mind Works
inputs in the future. A whole series of inputs and their outputs is presented to the network, over and over, causing waves of little adjustments of the connection weights, until it gets every output right for every input, at least as well as it can manage to.

A pattern associator equipped with this learning technique is called a perceptron. Perceptrons are interesting but have a big flaw. They are like the chef from hell: they think that if a little of each ingredient is good, a lot of everything must be better. In deciding whether a set of inputs justifies turning on an output, the perceptron weights them and adds them up. Often that gives the wrong answer, even on very simple problems. A textbook example of this flaw is the perceptron's handling of the simple logical operation called exclusive-or ("xor"), which means "A or B, but not both."

When A is on, the network should turn A-xor-B on. When B is on, the network should turn A-xor-B on. These facts will coax the network into increasing the weight for the connection from A (say, to .6) and increasing the weight for the connection from B (say, to .6), making each one high enough to overcome the output unit's threshold (say, .5). But when A and B are both on, we have too much of a good thing—A-xor-B is screaming its head off just when we want it to shut up. If we try smaller weights or a higher threshold, we can keep it quiet when A and B are both on, but then, unfortunately, it will be quiet when just A or just B is on. You can experiment with your own weights and you will see that nothing works. Exclusive-or is just one of many demons that cannot be built out of perceptrons; others include demons to determine whether an even or an odd number of units are on, to determine whether a string of active units is symmetrical, and to get the answer to a simple addition problem.

The solution is to make the network less of a stimulus-response creature and give it an internal representation between the input and output layers. It needs a representation that makes the crucial kinds of information about the inputs explicit, so that each output unit really can just add
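To make the arithmetic concrete, here is a minimal Python sketch of the example above. The single-unit weights of .6 and threshold of .5 are the figures given in the text; the hidden-layer weights and thresholds in xor_net are one illustrative choice (an "A or B" detector plus an "A and B" detector), not taken from the book.

```python
def unit(inputs, weights, threshold):
    """A single threshold unit: fire (1) if the weighted sum exceeds the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) > threshold else 0

# Single-layer attempt at xor: weights of .6 from A and B, threshold of .5.
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    out = unit([a, b], [0.6, 0.6], 0.5)
    print(f"A={a} B={b}  output={out}  desired={a ^ b}")
# Fails on (1, 1): 0.6 + 0.6 = 1.2 exceeds the threshold, so the unit fires
# just when we want it to stay quiet.

# With an internal (hidden) layer, the problem disappears: one hidden unit
# detects "A or B", another detects "A and B", and the output unit fires
# only when the first is on and the second is off.  (These particular
# weights are an assumption for illustration.)
def xor_net(a, b):
    h_or  = unit([a, b], [0.6, 0.6], 0.5)          # on if at least one input is on
    h_and = unit([a, b], [0.6, 0.6], 1.0)          # on only if both inputs are on
    return unit([h_or, h_and], [0.6, -0.6], 0.5)   # "or, but not and"

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"A={a} B={b}  xor_net={xor_net(a, b)}  desired={a ^ b}")
```

Running the sketch shows the single unit getting three of the four cases right and failing only on the both-on case, exactly the bind described above, while the two-layer version gets all four.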
