
Chapter 4 Two SRN Prediction Studies

mechanism and the context loop. As mentioned in the previous chapter, there are other learning mechanisms that can increase the span, although they may be poorly motivated cognitively. Interestingly, however, there is evidence that even human readers rely on local coherence in certain structures (Tabor et al., 2004). Another finding is that the simulations reported in Christiansen and Chater (1999), as well as the comma issue in Simulations 3 and 4 presented here, showed that the SRN handles counting recursion better than other recursion types. This may explain the strong facilitating effect of comma insertion compared to head-finality. In this connection, it should be noted that Rodriguez (2001) claims that SRNs can in fact carry out explicit symbolic counting procedures.
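To make the architecture under discussion concrete, the following is a minimal sketch of the forward pass of an Elman-style SRN with its context loop, in which the previous hidden state is copied back as additional input at each time step. The layer sizes, random weights, and the a/b/end symbol inventory are illustrative assumptions; the network is untrained and is not the model used in the simulations reported here.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hot symbol inventory (assumed for illustration): a, b, end-of-sequence.
n_in, n_hid, n_out = 3, 8, 3

W_ih = rng.normal(0, 0.5, (n_hid, n_in))    # input -> hidden
W_hh = rng.normal(0, 0.5, (n_hid, n_hid))   # context loop: previous hidden -> hidden
W_ho = rng.normal(0, 0.5, (n_out, n_hid))   # hidden -> output (next-symbol prediction)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def srn_forward(sequence):
    """Return a next-symbol prediction distribution after each input symbol.

    The context layer holds a copy of the previous hidden state, so the
    hidden activations at step t depend on the whole prefix of the input;
    this recurrent state is what allows an (implicit) count to be kept.
    """
    context = np.zeros(n_hid)
    predictions = []
    for x in sequence:
        hidden = sigmoid(W_ih @ x + W_hh @ context)
        predictions.append(softmax(W_ho @ hidden))
        context = hidden  # copy hidden state into the context layer
    return predictions

# Present the counting-recursion string "a a b b" symbol by symbol.
symbols = np.eye(n_in)
preds = srn_forward([symbols[0], symbols[0], symbols[1], symbols[1]])
```

After training on a^n b^n material, such a network would be evaluated on whether the output distribution assigns high probability to `b` (and eventually to the end symbol) once the matching count of `a`s has been consumed; here the sketch only shows the untrained state dynamics.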

This work argued for a uniform account of individual and language-specific differences as well as language-independent processing skill. All three can in considerable part be attributed to experience with the individual linguistic environment in interaction with architectural preconditions. It can be concluded that much work remains before fine-grained experience-based predictions can be obtained for the highly complex task of sentence comprehension. In any case, the literature shows a promising trend toward PDP models of language comprehension, accompanied by the integration of corpus analyses and acquisition data.

