Steven Pinker -- How the Mind Works

SHOW MORE
SHOW LESS

Create successful ePaper yourself

Turn your PDF publications into a flip-book with our unique Google optimized e-Paper software.

young, and so on. Any fact stored in the connections for one animal (parrots are warm-blooded) automatically transfers to similar animals (budgies are warm-blooded), because the network does not care that the connections belong to an animal at all. The connections merely say which visible properties predict which invisible properties, skipping ideas about species of animals altogether.

Conceptually speaking, a pattern associator captures the idea that if two objects are similar in some ways, they are probably similar in other ways. Mechanically speaking, similar objects are represented by some of the very same units, so any piece of information connected to the units for one object will ipso facto be connected to many of the units for the other. Moreover, classes of different degrees of inclusiveness are superimposed in the same network, because any subset of the units implicitly defines a class. The fewer the units, the larger the class. Say there are input units for "moves," "breathes," "hairy," "barks," "bites," and "lifts-leg-at-hydrants." The connections emanating out of all six trigger facts about dogs. The connections emanating out of the first three trigger facts about mammals. The connections emanating out of the first two trigger facts about animals. With suitable weights, the knowledge programmed in for one animal can be shared with both its immediate and its distant family members.

A fifth trick of neural networks is that they learn from examples, where learning consists of changes in the connection weights. The model-builder (or evolution) does not have to hand-set the thousands of weights needed to get the outputs right. Suppose a "teacher" feeds a pattern associator with an input and also with the correct output. A learning mechanism compares the network's actual output (which at first will be pretty random) with the correct one, and adjusts the weights to minimize the difference between the two. If the network leaves an output node off that the teacher says ought to be on, we want to make it more likely that the current funnel of active inputs will turn it on in the future. So the weights on the active inputs to the recalcitrant output unit are increased slightly. In addition, the output node's own threshold is lowered slightly, to make it more trigger-happy across the board. If the network turns an output node on and the teacher says it should be off, the opposite happens: the weights of the currently active input lines are taken down a notch (possibly driving the weight past zero to a negative value), and the target node's threshold is raised. This all makes the hyperactive output node more likely to turn off in response to those
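To make the two mechanisms concrete, here is a minimal sketch in Python of a pattern associator built from threshold units and trained by the teacher-driven rule just described. It is an illustration under assumptions, not code from the book: the "fetches-sticks" fact, the dog/cat/parrot patterns, and the specific numbers are invented, and the update is the standard perceptron-style rule consistent with the passage.

```python
# A pattern associator of threshold units with a teacher-driven,
# perceptron-style learning rule. Feature names, facts, and numbers
# are illustrative only.

FEATURES = ["moves", "breathes", "hairy", "barks", "bites", "lifts-leg-at-hydrants"]
FACTS = ["warm-blooded", "fetches-sticks"]   # hypothetical output units

class PatternAssociator:
    def __init__(self, n_in, n_out, step=0.25):
        # One weight per input-output connection, plus a threshold per output unit.
        self.weights = [[0.0] * n_in for _ in range(n_out)]
        self.thresholds = [0.5] * n_out
        self.step = step

    def activate(self, inputs):
        # An output unit turns on when its summed weighted input exceeds its threshold.
        return [
            1 if sum(w * x for w, x in zip(row, inputs)) > theta else 0
            for row, theta in zip(self.weights, self.thresholds)
        ]

    def learn(self, inputs, correct):
        # Compare the actual output with the teacher's correct output and
        # adjust only the connections coming from currently active input lines.
        actual = self.activate(inputs)
        for j, (a, c) in enumerate(zip(actual, correct)):
            if a == c:
                continue
            nudge = self.step if c == 1 else -self.step
            for i, x in enumerate(inputs):
                if x:
                    self.weights[j][i] += nudge      # strengthen or weaken active lines
            self.thresholds[j] -= nudge              # more trigger-happy, or less

# Any subset of the input units implicitly defines a class: all six units
# pick out dogs, the first three mammals, the first two animals in general.
dog    = [1, 1, 1, 1, 1, 1]
cat    = [1, 1, 1, 0, 0, 0]
parrot = [1, 1, 0, 0, 0, 0]

net = PatternAssociator(len(FEATURES), len(FACTS))
for _ in range(10):                      # the teacher presents each example repeatedly
    net.learn(dog, [1, 1])               # dogs are warm-blooded and fetch sticks
    net.learn(cat, [1, 0])               # cats are warm-blooded but do not fetch

# "Warm-blooded" was never taught for parrots, yet it transfers to them,
# because parrots share the "moves" and "breathes" units with dogs and cats.
print(net.activate(parrot))              # -> [1, 0]
```

Because the dog-specific fact ends up riding on the dog-specific units while the shared fact rides on the shared units, knowledge taught for one animal generalizes to its relatives exactly to the degree that they overlap, which is the point of the passage.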
