

(100) Abney (1996a, p21) says:

      In fact, probabilities make Markov models more adequate than their non-probabilistic counterparts,
      not less adequate. Markov models are surprisingly effective, given their finite-state
      substrate. For example, they are the workhorse of speech recognition technology. Stochastic
      grammars can also be easier to learn than their non-stochastic counterparts…

We might agree about the interest of (non-finite state) stochastic grammars. Certainly, in developing
stochastic grammars, one of the main questions is: which grammars, which structural relations do we
find in human languages? This is the traditional focus of theoretical linguistics. As for the stochastic
influences, it is not yet clear what they are, or how revealing they will be.

As for the first sentence in this quoted passage, and the general idea that we can develop good stochastic
models without attention to the expressive capabilities of the “substrate,” you decide.

(101) It is quite possible that “lexical activation” is sensitive to word co-occurrence frequencies, and this might
be modeled with a probabilistic finite automaton (e.g. a state-labeled Markov model or a standard,
transition-labeled probabilistic fsa; a toy sketch of the latter appears after this note).

The problem of detecting stochastic influences in the grammar itself depends on knowing what parts of
the grammar depend on the lexical item. In CFGs, for example, we get only a simple category for each
word, but in lexicalized TAGs, and in recent transformational grammars, the lexical item can provide a
rich specification of its role in derivations.
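
To make the second option in (101) concrete, here is a minimal sketch, in Python, of a transition-labeled
probabilistic fsa whose transition probabilities are estimated from word bigram counts. The function
names and the toy corpus are illustrative assumptions introduced here, not part of Abney's proposal or
of these notes.

    # A transition-labeled probabilistic fsa over words: states are the
    # preceding word (plus a start state), and each outgoing transition's
    # probability is the relative frequency of the following word.
    from collections import defaultdict

    START = "<s>"

    def train_bigram_fsa(corpus):
        """Estimate transition probabilities P(word | previous word)
        from a list of tokenized sentences."""
        counts = defaultdict(lambda: defaultdict(int))
        for sentence in corpus:
            prev = START
            for word in sentence:
                counts[prev][word] += 1
                prev = word
        probs = {}
        for prev, nexts in counts.items():
            total = sum(nexts.values())
            probs[prev] = {w: c / total for w, c in nexts.items()}
        return probs

    def sentence_prob(probs, sentence):
        """Probability the automaton assigns to a sentence
        (0.0 if it requires an unseen transition)."""
        p = 1.0
        prev = START
        for word in sentence:
            p *= probs.get(prev, {}).get(word, 0.0)
            prev = word
        return p

    corpus = [["the", "cat", "sleeps"],
              ["the", "dog", "sleeps"],
              ["the", "cat", "purrs"]]
    model = train_bigram_fsa(corpus)
    print(sentence_prob(model, ["the", "cat", "sleeps"]))  # 1 * 2/3 * 1/2 = 1/3

A state-labeled Markov model differs only in bookkeeping: probabilities attach to the states (words)
reached rather than to labeled transitions, but the same bigram counts determine both.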

