Temporal Restricted Boltzmann Machines for Dependency Parsing

Nikhil Garg
Department of Computer Science
University of Geneva, Switzerland
nikhil.garg@unige.ch

James Henderson
Department of Computer Science
University of Geneva, Switzerland
james.henderson@unige.ch

Abstract

We propose a generative model based on Temporal Restricted Boltzmann Machines for transition-based dependency parsing. The parse tree is built incrementally using a shift-reduce parse, and an RBM is used to model each decision step. The RBM at the current time step induces latent features with the help of temporal connections to the relevant previous steps, which provide context information. Our parser achieves labeled and unlabeled attachment scores of 88.72% and 91.65% respectively, which compare well with similar previous models and the state-of-the-art.

1 Introduction

There has been significant interest recently in machine learning methods that induce generative models with high-dimensional hidden representations, including neural networks (Bengio et al., 2003; Collobert and Weston, 2008), Bayesian networks (Titov and Henderson, 2007a), and Deep Belief Networks (Hinton et al., 2006). In this paper, we investigate how these models can be applied to dependency parsing. We focus on the Shift-Reduce transition-based parsing proposed by Nivre et al. (2004). In this class of algorithms, at any given step, the parser has to choose among a set of possible actions, each representing an incremental modification to the partially built tree. To assign probabilities to these actions, previous work has proposed memory-based classifiers (Nivre et al., 2004), SVMs (Nivre et al., 2006b), and Incremental Sigmoid Belief Networks (ISBNs) (Titov and Henderson, 2007b). In related earlier work, Ratnaparkhi (1999) proposed a maximum entropy model for transition-based constituency parsing. Of these approaches, only ISBNs induce high-dimensional latent representations to encode parse history, but they suffer from either very approximate or slow inference procedures.

We propose to address the problem of inference in a high-dimensional latent space by using an undirected graphical model, Restricted Boltzmann Machines (RBMs), to model the individual parsing decisions. Unlike the Sigmoid Belief Networks (SBNs) used in ISBNs, RBMs have tractable inference procedures for both forward and backward reasoning, which allows us to efficiently infer both the probability of a decision given the latent variables and vice versa. The key structural difference between the two models is that the directed connections between latent and decision vectors in SBNs become undirected in RBMs. A complete parsing model consists of a sequence of RBMs interlinked via directed edges, which gives us a form of Temporal Restricted Boltzmann Machine (TRBM) (Taylor et al., 2007), but with the incrementally specified model structure required by parsing. In this paper, we analyze and contrast ISBNs with TRBMs and show that the latter provide an accurate and theoretically sound model for parsing with high-dimensional latent variables.

2 An ISBN Parsing Model

Our TRBM parser uses the same history-based probability model as the ISBN parser of Titov and Henderson (2007b): P(tree) = ∏_t P(v_t | v_1, …, v_{t−1}), where each v_t is the parser decision made at step t.
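To make the tractability claim concrete, here is a minimal Python sketch, not the authors' implementation: the class RBMDecisionStep, the temporal weight matrix W_temp, and all method names are illustrative assumptions. It shows the two closed-form RBM conditionals (latent given decision, decision given latent) for a one-hot decision layer with binary latent units, plus an exact decision marginal obtained by summing the latent units out analytically; the TRBM's directed temporal connections are reduced here to a single additive bias computed from the previous step's latent means.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBMDecisionStep:
    """One decision step: a one-hot decision (visible) vector v and a
    binary latent (hidden) vector h, joined by undirected weights W."""

    def __init__(self, n_decisions, n_latent, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.standard_normal((n_decisions, n_latent))
        self.b_v = np.zeros(n_decisions)  # decision biases
        self.b_h = np.zeros(n_latent)     # latent-feature biases

    def latent_given_decision(self, v):
        """Forward inference: P(h_j = 1 | v), closed form per unit."""
        return sigmoid(self.b_h + v @ self.W)

    def decision_given_latent(self, h):
        """Backward inference: softmax over decisions given latent features."""
        s = self.b_v + self.W @ h
        e = np.exp(s - s.max())
        return e / e.sum()

    def decision_marginal(self, context_bias=None):
        """Exact P(v = k) with the binary latent units summed out:
        P(v=k) ∝ exp(b_v[k]) * prod_j (1 + exp(b_h[j] + W[k, j])).
        `context_bias` stands in for the directed temporal input from
        earlier steps, which simply shifts the latent biases."""
        b_h = self.b_h if context_bias is None else self.b_h + context_bias
        log_s = self.b_v + np.log1p(np.exp(b_h + self.W)).sum(axis=1)
        e = np.exp(log_s - log_s.max())
        return e / e.sum()

# Chain-rule factorization P(tree) = prod_t P(v_t | v_1, ..., v_{t-1}):
# score a toy sequence of decision indices, threading each step's latent
# means into the next step's biases via the (assumed) temporal weights.
rbm = RBMDecisionStep(n_decisions=4, n_latent=8)
W_temp = 0.01 * np.random.default_rng(1).standard_normal((8, 8))
log_p, context = 0.0, np.zeros(8)
for k in [0, 2, 1]:
    p = rbm.decision_marginal(W_temp @ context)
    log_p += np.log(p[k])
    context = rbm.latent_given_decision(np.eye(4)[k])
print("log P(decisions) =", log_p)
```

The point of the sketch is that the latent units sum out analytically, so scoring a candidate action never requires the sampling or variational approximations that inference in directed SBNs does.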