Steven Pinker -- How the Mind Works - Hampshire High Italian ...

Thinking Machines

Just as the ability to add 1 to a number bestows the ability to generate an infinite set of numbers, the ability to embed a proposition inside another proposition bestows the ability to think an infinite number of thoughts.

To get propositions-inside-propositions out of the network displayed in the preceding diagram, one could add a new layer of connections to the top of the diagram, connecting the bank of units for the whole proposition to the role slot in some bigger proposition; the role might be something like "event-observed." If we continue to add enough layers, we could accommodate an entire multiply nested proposition by etching a full tree diagram for it in connectoplasm. But this solution is clumsy and raises suspicions. For every kind of recursive structure, there would have to be a different network hard-wired in: one network for a person thinking about a proposition, another for a person thinking about a proposition about a person thinking about a proposition, a third for a person communicating a proposition about some person to another person, and so on.

In computer science and psycholinguistics, a more powerful and flexible mechanism is used. Each simple structure (for a person, an action, a proposition, and so on) is represented in long-term memory once, and a processor shuttles its attention from one structure to another, storing the itinerary of visits in short-term memory to thread the proposition together.
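The contrast between hard-wiring a separate network for each depth of nesting and storing each structure once while a processor travels through it can be sketched in code. In this illustrative sketch (the class, the helper `visits`, and the example sentence are my own; only the "event-observed" role name comes from the text), each proposition type is defined a single time, any role slot may hold another proposition, and a processor walks the nested structure while keeping its itinerary of pending visits on an explicit stack — the "short-term memory":

```python
from dataclasses import dataclass, field
from typing import Union

# Any role slot may hold either a simple filler (a name) or another
# proposition -- embedding one structure in another is what makes
# unbounded nesting possible without new hardware for each depth.
Filler = Union[str, "Proposition"]

@dataclass
class Proposition:
    predicate: str
    roles: dict[str, Filler] = field(default_factory=dict)

def visits(prop: Proposition) -> list[tuple[int, str]]:
    """Thread a nested proposition together by shuttling attention from
    one structure to the next, recording (depth, predicate) for each
    visit. The stack plays the role of short-term memory: it holds the
    itinerary of structures still to be visited."""
    trace = []
    stack = [(prop, 0)]
    while stack:
        node, depth = stack.pop()
        trace.append((depth, node.predicate))
        # Push embedded propositions so they are visited in order.
        for filler in reversed(list(node.roles.values())):
            if isinstance(filler, Proposition):
                stack.append((filler, depth + 1))
    return trace

# "Mary thinks that John thinks that Sue left" -- three levels of
# nesting built from one stored Proposition structure.
thought = Proposition("thinks", {
    "agent": "Mary",
    "event-observed": Proposition("thinks", {
        "agent": "John",
        "event-observed": Proposition("left", {"agent": "Sue"}),
    }),
})

print(visits(thought))  # [(0, 'thinks'), (1, 'thinks'), (2, 'left')]
```

The same single `Proposition` definition handles a person thinking about a proposition, a proposition about a proposition about a proposition, and so on — the depth lives in the data and the stack, not in duplicated machinery.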
This dynamic processor, called a recursive transition network, is especially plausible for sentence understanding, because we hear and read words one at a time rather than inhaling an entire sentence at once. We also seem to chew our complex thoughts piece by piece rather than swallowing or regurgitating them whole, and that suggests that the mind is equipped with a recursive proposition-cruncher for thoughts, not just for sentences. The psychologists Michael Jordan and Jeff Elman have built networks whose output units send out connections that loop back into a set of short-term memory units, triggering a new cycle of activation flow. That looping design provides a glimpse of how iterative information processing might be implemented in neural networks, but it is not enough to interpret or assemble structured propositions. More recently, there have been attempts to combine a looping network with a propositional network to implement a kind of recursive transition network out of pieces of connectoplasm. These attempts show that unless neural networks are specially assembled into a recursive processor, they cannot handle our recursive thoughts.
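The looping design described above — output units feeding back into short-term memory units that join the next cycle of activation flow — can be sketched in a few lines. This is an illustrative toy, not Jordan's or Elman's actual model: the layer sizes are arbitrary and the weights are random rather than trained, which is enough to show the wiring:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 3

# Random, untrained weights -- a real model would learn these.
W_in  = rng.standard_normal((n_hidden, n_in)) * 0.1   # input -> hidden
W_ctx = rng.standard_normal((n_hidden, n_out)) * 0.1  # loop: context -> hidden
W_out = rng.standard_normal((n_out, n_hidden)) * 0.1  # hidden -> output

def run(sequence):
    """Process inputs one at a time. After each step the output is
    copied into the context (short-term memory) units, whose
    connections feed back into the hidden layer and trigger the next
    cycle of activation flow."""
    context = np.zeros(n_out)          # short-term memory units
    outputs = []
    for x in sequence:
        hidden = np.tanh(W_in @ x + W_ctx @ context)
        out = np.tanh(W_out @ hidden)
        outputs.append(out)
        context = out                  # output loops back for next cycle
    return outputs

outs = run([rng.standard_normal(n_in) for _ in range(5)])
```

Because the context units carry a trace of earlier steps, each output depends on the whole history of inputs so far — iterative processing, as the text notes, but with no mechanism for assembling the structured, nested propositions that a recursive processor provides.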
