
References

James A. Anderson. 1973. A theory for the recognition of items from short memorized lists. Psychological Review, 80(6):417–438.

Yoshua Bengio. 2009. Learning deep architectures for AI. Foundations and Trends in Machine Learning, 2(1):1–127.

Daniel M. Bikel. 2004. Intricacies of Collins' parsing model. Computational Linguistics, 30:479–511, December.

Christian Bockermann, Martin Apel, and Michael Meier. 2009. Learning SQL for database intrusion detection using context-sensitive modelling. In Detection of Intrusions and Malware & Vulnerability Assessment (DIMVA), pages 196–205.

Eugene Charniak. 2000. A maximum-entropy-inspired parser. In Proceedings of the 1st NAACL, pages 132–139, Seattle, Washington.

Noam Chomsky. 1957. Aspect of Syntax Theory. MIT Press, Cambridge, Massachusetts.

Michael Collins and Nigel Duffy. 2002. New ranking algorithms for parsing and tagging: Kernels over discrete structures, and the voted perceptron. In Proceedings of ACL '02.

Michael Collins. 2003. Head-driven statistical models for natural language parsing. Computational Linguistics, 29(4):589–637.

R. Collobert and J. Weston. 2008. A unified architecture for natural language processing: Deep neural networks with multitask learning. In International Conference on Machine Learning, ICML.

Ronan Collobert, Jason Weston, Léon Bottou, Michael Karlen, Koray Kavukcuoglu, and Pavel Kuksa. 2011. Natural language processing (almost) from scratch. Journal of Machine Learning Research, 12:2493–2537, November.

Nello Cristianini and John Shawe-Taylor. 2000. An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press, March.

Aron Culotta and Jeffrey Sorensen. 2004. Dependency tree kernels for relation extraction. In Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics, ACL '04, Stroudsburg, PA, USA. Association for Computational Linguistics.

Patrick Düssel, Christian Gehl, Pavel Laskov, and Konrad Rieck. 2008. Incorporation of application layer protocol syntax into anomaly detection. In Proceedings of the 4th International Conference on Information Systems Security, ICISS '08, pages 188–202, Berlin, Heidelberg. Springer-Verlag.

Thomas Gärtner. 2003. A survey of kernels for structured data. SIGKDD Explorations.

Daniel Gildea and Daniel Jurafsky. 2002. Automatic labeling of semantic roles. Computational Linguistics, 28(3):245–288.

Gene Golub and William Kahan. 1965. Calculating the singular values and pseudo-inverse of a matrix. Journal of the Society for Industrial and Applied Mathematics, Series B: Numerical Analysis, 2(2):205–224.

Kosuke Hashimoto, Ichigaku Takigawa, Motoki Shiga, Minoru Kanehisa, and Hiroshi Mamitsuka. 2008. Mining significant tree patterns in carbohydrate sugar chains. Bioinformatics, 24:i167–i173, August.

G. E. Hinton, J. L. McClelland, and D. E. Rumelhart. 1986. Distributed representations. In D. E. Rumelhart and J. L. McClelland, editors, Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Volume 1: Foundations. MIT Press, Cambridge, MA.

Bill MacCartney, Trond Grenager, Marie-Catherine de Marneffe, Daniel Cer, and Christopher D. Manning. 2006. Learning to recognize features of valid textual entailments. In Proceedings of the Human Language Technology Conference of the NAACL, Main Conference, pages 41–48, New York City, USA, June. Association for Computational Linguistics.

M. P. Marcus, B. Santorini, and M. A. Marcinkiewicz. 1993. Building a large annotated corpus of English: The Penn Treebank. Computational Linguistics, 19:313–330.

Bjørn-Helge Mevik and Ron Wehrens. 2007. The pls package: Principal component and partial least squares regression in R. Journal of Statistical Software, 18(2):1–24.

Alessandro Moschitti, Daniele Pighin, and Roberto Basili. 2008. Tree kernels for semantic role labeling. Computational Linguistics, 34(2):193–224.

Alessandro Moschitti. 2006. Efficient convolution kernels for dependency and constituent syntactic trees. In Proceedings of the 17th European Conference on Machine Learning, Berlin, Germany.

Bennet B. Murdock. 1983. A distributed memory model for serial-order information. Psychological Review, 90(4):316–338.

Joakim Nivre, Johan Hall, Jens Nilsson, Atanas Chanev, Gülsen Eryiğit, Sandra Kübler, Svetoslav Marinov, and Erwin Marsi. 2007. MaltParser: A language-independent system for data-driven dependency parsing. Natural Language Engineering, 13(2):95–135.

Roger Penrose. 1955. A generalized inverse for matrices. In Proc. Cambridge Philosophical Society.

T. A. Plate. 1994. Distributed Representations and Nested Compositional Structure. Ph.D. thesis.
