
Chapter 3

Connectionist Modelling of Language Comprehension

3.1 Structure and Learning

Connectionist networks are prototypical exposure-based models. More precisely, they are the implementation of non-committed exposure-based accounts, where non-committed means accounts without any specific assumptions about structural levels or grain sizes, or about the linking between corpus regularities and behavior. In the literature there does not seem to be agreement about which models to call connectionist. It must be mentioned that there are, on the one hand, hybrid models that use parallel distributed activation spreading between symbolic entities (e.g., Just and Carpenter, 1992; Lewis and Vasishth, 2005), and, on the other hand, connectionist models that use hand-designed architectures and local representations (e.g., Dell et al., 2002; McClelland and Elman, 1984; Rohde, 2002). I am concerned here only with "fully connectionist models" using fully distributed representations and no pre-designed internal structuring.
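The contrast between local and fully distributed representations can be made concrete with a toy sketch. The feature vectors below are invented purely for illustration and assume nothing about the particular models discussed in this chapter:

```python
import numpy as np

# Local (one-hot) coding: one dedicated unit per word,
# so all distinct words are equally dissimilar.
vocab = ["cat", "dog", "truck"]
local = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}

# Distributed coding: each word is a pattern over shared feature units.
# Hypothetical features: [animate, furry, wheeled].
distributed = {
    "cat":   np.array([0.9, 0.8, 0.1]),
    "dog":   np.array([0.9, 0.6, 0.1]),
    "truck": np.array([0.0, 0.1, 0.9]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Under local coding every pair of distinct words has similarity 0;
# under distributed coding similarity is graded.
print(cosine(local["cat"], local["dog"]))              # 0.0
print(cosine(distributed["cat"], distributed["dog"]))    # ~0.99 (similar)
print(cosine(distributed["cat"], distributed["truck"]))  # ~0.16 (dissimilar)
```

Graded similarity of this kind is what lets distributed networks generalize from exposure: patterns learned for one word transfer to words with overlapping feature patterns.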

The most important feature that distinguishes a connectionist network model of that kind from symbolic models is its architecturally constrained, highly adaptive learning ability. Connectionist models are functional problem-solving machines that, depending on the specific learning algorithm and certain architectural properties, are able to find the optimal solution to any task representable as input-output pairs. The design of symbolic models usually involves many assumptions about the desired processes, which are hard-coded into the system; for example, it has to be specified how to categorize and represent the input. A connectionist system, on the other hand, starts from zero, without any presumptions. The structure of the internal input representation is shaped during the learning process, depending on the task requirements. The structural information that linguists annotate on word strings of natural language is, of course, already present in the plain strings. Extracting the underlying structure of the input requires information about sequential and temporal relations between input chunks. For that reason, time is an important component of cognitive tasks. Language in particular transports highly structured information while being entirely sequential. A memory of earlier input and the representation of temporal relations between input chunks pro-

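How such a memory can arise from architecture alone can be sketched with a minimal Elman-style recurrent step: the hidden layer receives the current input together with a copy of its own previous state, so earlier inputs leave a trace in the current representation. The weights below are random and untrained, and all sizes are illustrative assumptions, not the architecture of any specific model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes chosen arbitrarily for the sketch.
n_in, n_hid = 4, 8
W_in = rng.normal(scale=0.5, size=(n_hid, n_in))    # input -> hidden
W_ctx = rng.normal(scale=0.5, size=(n_hid, n_hid))  # context -> hidden

def step(x, h_prev):
    # New hidden state depends on current input AND the previous state.
    return np.tanh(W_in @ x + W_ctx @ h_prev)

def run(seq):
    h = np.zeros(n_hid)  # empty context at sequence start
    for x in seq:
        h = step(x, h)
    return h

# One-hot "words".
a, b, c = np.eye(n_in)[0], np.eye(n_in)[1], np.eye(n_in)[2]

# Same final input, different histories -> different hidden states:
h1 = run([a, c])
h2 = run([b, c])
print(np.allclose(h1, h2))  # False: the state retains the earlier input
```

In an actual model the weights would be trained (e.g., on next-word prediction), so that the traces of earlier input retained in the hidden state become exactly those temporal relations the task requires.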
