

referential constituents like nouns and main verbs, as they refer to objects and events, respectively. Pronouns, however, do not induce memory cost because they are assumed to be immediately accessible. The assumption behind Integration Cost is that every stored item receives an activation which decays as a function of the number of newly encoded discourse referents while it is maintained in memory. Integrating an element, i.e., relating it to its head, requires more processing effort when the element has less activation. The integration cost is thus a function that increases monotonically with the number of intervening discourse referents. The cost accounts for decay over time only implicitly, since time is represented discretely by successive discourse referents. The unit of Integration Cost is energy units (EUs).
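Gibson (1998) leaves the exact shape of this function open beyond its monotonicity. Purely as an illustration, the following Python sketch implements the linear simplification that is common in the DLT literature: one EU per intervening new discourse referent, plus one EU if the integration itself builds a new discourse referent. The function name and this decomposition are assumptions made here for clarity, not part of Gibson's formulation.

```python
def integration_cost(intervening_referents: int, builds_new_referent: bool = True) -> int:
    """Integration Cost in energy units (EUs), linear simplification.

    intervening_referents: number of new discourse referents encoded since
        the to-be-integrated element was stored (the decay component).
    builds_new_referent: whether the integration itself introduces a new
        discourse referent (e.g. the event referent of a verb).
    """
    cost = intervening_referents          # decay: one EU per intervening referent
    if builds_new_referent:
        cost += 1                         # building the new discourse referent itself
    return cost

# One possible reading of the "1+2" entry at "attacked" in Figure 1.1:
# subject integration with the adjacent "senator" (no intervening referents,
# but a new event referent is built) plus the object-gap integration across
# two new referents ("senator" and the attacking event).
print(integration_cost(0))                             # 1 EU (subject integration)
print(integration_cost(2, builds_new_referent=False))  # 2 EUs (object-gap integration)
```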

The memory capacity limit is accounted for by the second principle of DLT: Storage Cost. It rests on the assumption that the parser constantly predicts the most probable complete sentence structure given the previous material and keeps it in memory. Structural complexity is measured by the number of syntactic heads the predicted structure contains: the more complex the predicted structure, the more heads it contains. Every predicted head consumes memory resources, so-called memory units (MUs). Memory load also affects processing, because storage and processing draw on the same resources (Just and Carpenter, 1992). Consequently, the more heads are predicted, the higher the processing cost. The important difference between the two costs is the location of their effects: while Integration Cost accounts for processing differences only at the integration site, Storage Cost for a predicted structure affects the processing of every following part of the sentence. Figure 1.1 shows the Integration Cost C(I) and the Storage Cost C(S) at each point in an English object relative clause.

ORC     The   reporter   who_i   the   senator   attacked e_i   admitted   the   error
C(I)     0       0         0      0       0          1+2            3        0     0+1
C(S)     2       1         3      4       3           1             1        1      0
Total    2       1         3      4       3           3             4        1      1

Figure 1.1: DLT cost metrics for an English ORC according to Gibson (1998).

Seeing the sentence-initial determiner induces the prediction of a main clause. Hence predictions for an NP and a main verb have to be stored. Note that DLT considers the prediction of the main verb as cost-free, but in the literature it is mostly assigned a cost. For simplicity, in this work Storage Cost will be consistently assumed for the main verb. Once the NP is complete, only the verb is predicted. At the relative pronoun who a Storage Cost of 3 is assigned: an embedded SRC is predicted, which contains two heads, the embedded verb and a subject gap, and the main verb remains predicted as well. Seeing another determiner changes the prediction into an ORC, which contains one more head, namely the embedded subject. On senator only the embedded verb, the object gap, and the main verb remain predicted. On the embedded verb attacked, two integrations then take place. The subject integration of attacked costs 1 EU because the
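To restate the Storage Cost row of Figure 1.1 in a compact form, the following sketch counts one MU per predicted head at each word. The head sets listed are only one reading of the figure and of the walkthrough above, including the assumption adopted in this work that the predicted main verb carries cost and the inference that after the main verb only the object noun remains predicted; they are not a fixed part of DLT.

```python
# One MU per syntactic head that is predicted but not yet encountered.
# The head sets are illustrative; they reproduce the C(S) row of Figure 1.1.
predictions = [
    ("The",          {"subject noun", "main verb"}),
    ("reporter",     {"main verb"}),
    ("who_i",        {"embedded verb", "subject gap", "main verb"}),
    ("the",          {"embedded subject", "embedded verb", "object gap", "main verb"}),
    ("senator",      {"embedded verb", "object gap", "main verb"}),
    ("attacked e_i", {"main verb"}),
    ("admitted",     {"object noun"}),
    ("the",          {"object noun"}),
    ("error",        set()),
]

for word, heads in predictions:
    print(f"{word:12} C(S) = {len(heads)} MU")   # 2 1 3 4 3 1 1 1 0
```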

