Proceedings - Österreichische Gesellschaft für Artificial Intelligence

Compositionality in (high-dimensional) space

Marco Baroni

Università di Trento

Abstract

Formal semanticists have developed sophisticated compositional theories of sentential meaning, paying a lot of attention to those grammatical words (determiners, logical connectives, etc.) that constitute the functional scaffolding of sentences. Corpus-based computational linguists, on the other hand, have developed powerful distributional methods to induce representations of the meaning of content-rich words (nouns, verbs, etc.), typically discarding the functional scaffolding as "stop words". Since we do not communicate by logical formulas, nor, Tarzan-style, by flat concatenation of content words, a satisfactory model of the semantics of natural language should strike a balance between the two approaches. In this talk, I will present some recent proposals that try to get the best of both worlds by adapting the classic view of compositionality as function application, developed by formal semanticists, to distributional models of meaning. I will present preliminary evidence of the effectiveness of these methods in scaling up to the phrasal and sentential domains, and discuss to what extent the representations of phrases and sentences we get out of compositional distributional semantics are related to what formal semanticists are trying to capture.
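The "compositionality as function application" view mentioned in the abstract can be illustrated with a toy sketch: a content word is a distributional vector, a modifier is modeled as a linear map (a matrix), and composition is literally applying that function, i.e. a matrix-vector product. The words "red"/"moon", the 3-dimensional space, and all numerical values below are invented for illustration; this is a minimal sketch of the general idea, not an implementation from the talk.

```python
import numpy as np

# A content word as a distributional vector (toy 3-dimensional space;
# values are invented, not derived from corpus counts).
moon = np.array([1.0, 0.2, 0.0])

# A modifier as a linear map: composition is then function
# application, realized as a matrix-vector product.
red = np.array([[0.9, 0.0, 0.1],
                [0.0, 1.0, 0.0],
                [0.2, 0.0, 0.8]])

# "red moon" = red(moon): the phrase lives in the same vector
# space as single words, so the process can be iterated.
red_moon = red @ moon
print(red_moon)  # → [0.9 0.2 0.2]
```

Because the output is again a vector in the same space, the composed phrase can itself serve as the argument of a further function, which is what allows such models to scale from phrases toward full sentences.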

Proceedings of KONVENS 2012 (Invited talks), Vienna, September 20, 2012
