Connectionist Modeling of Experience-based Effects in Sentence ...


Chapter 4 Two SRN Prediction Studies

verbs and is not involved in long-distance dependencies. Hence, the activation pattern representing it should not be too complex. In fact, the learning of comma usage in ORCs can be scaled down to a counting-recursion problem of the pattern aabb instead of abba. As discussed in chapter 3, counting recursion is the easiest of the three recursion types for both humans and connectionist networks (Christiansen and Chater, 1999). Thus, it is very likely that the inclusion of commas facilitates processing in the grammatical condition, lowering the respective GPE values.
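The activation patterns in question are hidden-layer states of a simple recurrent network, in which the previous hidden state is copied back as context input at each time step. A minimal forward-pass sketch of this architecture — with hypothetical dimensions and untrained random weights, not the actual setup of the simulations:

```python
# Minimal forward pass of a simple recurrent network (Elman-style),
# the architecture class behind the simulations discussed here.
# All sizes and weights below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = 10   # hypothetical number of localist word/symbol units
HIDDEN = 5   # hypothetical hidden-layer size

W_in = rng.normal(scale=0.1, size=(HIDDEN, VOCAB))    # input -> hidden
W_ctx = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))  # context -> hidden
W_out = rng.normal(scale=0.1, size=(VOCAB, HIDDEN))   # hidden -> output


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


def srn_forward(token_ids):
    """Process a sequence; return a next-token distribution per position."""
    context = np.zeros(HIDDEN)          # context units start at rest
    predictions = []
    for t in token_ids:
        x = np.zeros(VOCAB)
        x[t] = 1.0                      # one-hot (localist) input coding
        hidden = np.tanh(W_in @ x + W_ctx @ context)
        predictions.append(softmax(W_out @ hidden))
        context = hidden                # copy hidden state to context layer
    return predictions


preds = srn_forward([0, 3, 3, 7])
```

With untrained weights the output distributions are near-uniform; training adjusts the weight matrices so that the distribution at each position approximates the conditional probabilities of the grammar.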

(25) English with commas:
a. SRC: S1 , V2 O2 , V3 O3 , V1 O1
b. ORC: S1 , S2 , S3 V3 , V2 , V1 O1

(26) Example test sentences:
a. the banker , that the banker , that the senators phone , understands , attacks the reporters . (no-drop)
b. the lawyer , that the senator , that the judges attack , praises the judge . (drop-V2)
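The contrast between the aabb (counting) and abba (mirror) patterns can be made concrete: a counting pattern is recognizable with a single counter, whereas a mirror pattern requires a stack to match dependencies in reverse order. A sketch with illustrative recognizers (not part of the simulations themselves):

```python
# Counting recursion (a^n b^n) needs only a counter, while mirror
# recursion (a1 a2 b2 b1) needs a stack to match categories in
# last-in, first-out order. Illustrative recognizers, assumed forms.

def accepts_counting(seq):
    """a^n b^n over plain 'a'/'b' tokens: one counter suffices."""
    count, seen_b = 0, False
    for tok in seq:
        if tok == "a":
            if seen_b:
                return False      # an 'a' after a 'b' breaks the pattern
            count += 1
        elif tok == "b":
            seen_b = True
            count -= 1
            if count < 0:
                return False      # more b's than a's so far
        else:
            return False
    return count == 0


def accepts_mirror(seq):
    """(kind, category) pairs: each b must match the most recent open a."""
    stack = []
    for kind, cat in seq:
        if kind == "a":
            stack.append(cat)
        elif kind == "b":
            if not stack or stack.pop() != cat:
                return False      # nested dependency violated
        else:
            return False
    return not stack              # all opened dependencies were closed
```

Because the comma pattern in the ORC condition reduces to the first, counter-style problem, it places far weaker demands on the network's memory than genuine center-embedding.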

Results for 3b<br />

See figure 4.7 for the results of simulation 3b after one epoch (left panel) and three epochs (right panel). Compared to simulation 3a there was a global improvement for both conditions. The most dramatic improvement occurred on V3, which was predicted almost without error after three epochs. In the first epoch, comma insertion improved V1 more in the grammatical condition; as a result, the V1 error was the same in both conditions. After subsequent training, however, the no-drop condition did not change on V1, whereas the drop-V2 condition improved further, resulting in a drop-V2 preference on V1. The opposite happened on post-V1, where training affected the no-drop condition more; the ungrammatical condition did not change there at all. In summary, there was a comma insertion × condition × training interaction, resulting in a drop-V2 preference after training was completed.

The stable error on post-V1 in the drop-V2 condition can be interpreted as a floor effect. The prediction of the determiner and the noun is already very good, with a GPE value around 0.1, and it is very unlikely that the SRN would learn the perfectly correct probabilities, and thus reach a GPE value of zero, even after many epochs. Therefore, on the post-V1 region, improvement through training is only possible for the slightly worse grammatical condition, which is why the two conditions settle on the same error value after three epochs. In conclusion, the insertion of commas clearly helps the network to make better predictions. However, training effects seem to be driven by rather local consistency, affecting the ungrammatical condition more than the grammatical one. Thus, looking at V1 after three epochs, the drop-V2 preference seems to be stable for English center-embeddings.
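The GPE values reported above can be sketched in simplified form, loosely following the grammatical prediction error of Christiansen and Chater (1999): hits are output activations of grammatical continuations, false alarms are activations of ungrammatical ones, and misses accumulate the shortfall of each grammatical continuation below its target activation. The original definition of the targets is more involved; here they are simply taken as empirical conditional probabilities, so this is an illustrative approximation, not the exact measure used in the simulations:

```python
# Simplified grammatical prediction error (GPE), loosely following
# Christiansen and Chater (1999). Targets are treated here as empirical
# next-word probabilities -- an assumed simplification of the original.

def gpe(activations, targets):
    """activations: dict word -> network output activation (sums to ~1).
    targets: dict word -> empirical probability of that continuation;
    words absent from `targets` count as ungrammatical continuations."""
    hits = sum(a for w, a in activations.items() if w in targets)
    false_alarms = sum(a for w, a in activations.items() if w not in targets)
    misses = sum(max(0.0, t - activations.get(w, 0.0))
                 for w, t in targets.items())
    denom = hits + false_alarms + misses
    return 1.0 - hits / denom if denom > 0 else 0.0
```

Under this definition a GPE of zero requires the network's output to allocate no activation to ungrammatical words and at least the target probability to every grammatical one, which is why a residual error around 0.1 on post-V1 is plausibly a floor rather than a learning failure.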

