Abstract book (pdf) - ICPR 2010
09:00-11:10, Paper TuAT8.41
A Variational Bayesian EM Algorithm for Tree Similarity
Takasu, Atsuhiro, National Inst. of Informatics
Fukagawa, Daiji, National Inst. of Informatics
Akutsu, Tatsuya, Kyoto Univ.
In recent times, a vast amount of tree-structured data has been generated. For mining, retrieving, and integrating such data, we need a fine-grained tree similarity measure that can be adapted to the target data. To achieve this goal, this paper (1) proposes a probabilistic generative model that generates pairs of similar trees, and (2) derives a learning algorithm for estimating the parameters of the model based on the variational Bayesian expectation maximization (VBEM) method. This method can handle rooted, ordered, and labeled trees. We show that, by tuning the hyperparameters, the tree similarity model obtained via the VBEM technique performs better than that obtained via maximum likelihood estimation.
09:00-11:10, Paper TuAT8.42
Enhancing Image Classification with Class-Wise Clustered Vocabularies
Wojcikiewicz, Wojciech, Fraunhofer Inst. FIRST
Kawanabe, Motoaki, Fraunhofer FIRST and TU Berlin
Binder, Alexander, Fraunhofer Inst. FIRST, Berlin
In recent years, bag-of-visual-words representations have gained increasing popularity in the field of image classification. Their performance relies heavily on creating a good visual vocabulary from a set of image features (e.g., SIFT). For real-world photo archives such as Flickr, codebooks with more than a few thousand words are desirable, which is infeasible with standard k-means clustering. In this paper, we propose a two-step procedure that can generate more informative codebooks efficiently using class-wise k-means and a novel procedure for word selection. Our approach compared favorably to the standard k-means procedure on the PASCAL VOC data sets.
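The class-wise clustering step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it runs a plain k-means independently on each class's descriptors and concatenates the resulting centers into one codebook, omitting the paper's word-selection step. All function and parameter names are illustrative.

```python
import numpy as np

def classwise_codebook(features_by_class, words_per_class, iters=20, seed=0):
    """Build a visual vocabulary by running k-means separately on the
    descriptors of each class, then stacking the class-wise centers.
    (The paper's word-selection procedure is not reproduced here.)"""
    rng = np.random.default_rng(seed)
    codebook = []
    for feats in features_by_class:           # feats: (n_i, d) descriptors
        feats = np.asarray(feats, dtype=float)
        # initialize centers from random descriptors of this class
        centers = feats[rng.choice(len(feats), words_per_class, replace=False)]
        for _ in range(iters):
            # assign each descriptor to its nearest center
            d2 = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            assign = d2.argmin(axis=1)
            # recompute centers; keep the old center if a cluster empties
            for k in range(words_per_class):
                members = feats[assign == k]
                if len(members):
                    centers[k] = members.mean(axis=0)
        codebook.append(centers)
    return np.vstack(codebook)  # (n_classes * words_per_class, d)
```

Because each per-class k-means works on a much smaller descriptor set, this scales to large total vocabularies more easily than one global k-means of the same size.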
09:00-11:10, Paper TuAT8.43
Efficiently Computing Optimal Consensus of Digital Line Fitting
Kenmochi, Yukiko, Univ. Paris-Est
Buzer, Lilian, ESIEE
Talbot, Hugues, ESIEE
Given a set of discrete points in a noisy 2D digital image, we formulate our problem as robust digital line fitting. More precisely, we seek the maximum subset whose points are included in a digital line, called the optimal consensus. The paper presents an efficient method for exactly computing the optimal consensus using a topological sweep, which runs in quadratic time and linear space with respect to the number of input points.
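To make the notion of optimal consensus concrete, here is a brute-force baseline — not the paper's topological-sweep algorithm, and cubic rather than quadratic in the number of points. It treats a digital line as a band of vertical thickness less than 1 (a simplification of the standard arithmetic definition, valid for slopes at most 1 in absolute value) and returns the size of the largest subset such a band can cover. All names are illustrative.

```python
from itertools import combinations

def optimal_consensus(points):
    """Brute-force O(n^3) baseline: for each candidate slope defined by a
    pair of points, count how many points fit in a band a <= y - s*x < a + 1,
    and return the largest such count (the consensus size)."""
    best = 1
    for (x1, y1), (x2, y2) in combinations(points, 2):
        if x1 == x2:
            continue                      # skip vertical pairs in this sketch
        s = (y2 - y1) / (x2 - x1)         # candidate slope through two points
        # residuals w.r.t. a line of slope s; a band of height 1 must cover them
        r = sorted(y - s * x for x, y in points)
        # sliding window: largest set of residuals within an interval of length < 1
        j = 0
        for i in range(len(r)):
            while r[i] - r[j] >= 1:
                j += 1
            best = max(best, i - j + 1)
    return best
```

The sweep over pairwise slopes mirrors the dual-space arrangement that the paper's topological sweep traverses far more efficiently.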
09:00-11:10, Paper TuAT8.44
Learning a Joint Manifold Representation from Multiple Data Sets
Torki, Marwan, Rutgers Univ.
Elgammal, Ahmed, Rutgers Univ.
Lee, Chan-Su, Yeungnam Univ.
The problem we address in this paper is how to learn a joint representation from data lying on multiple manifolds. We are given multiple data sets, and there is an underlying common manifold among the different data sets. We propose a framework to learn an embedding of all the points on all the manifolds in a way that preserves the local structure on each manifold and, at the same time, collapses all the different manifolds into one manifold in the embedding space, while preserving the implicit correspondences between the points across different data sets. The proposed solution extends current state-of-the-art spectral-embedding approaches to handle multiple manifolds.
09:00-11:10, Paper TuAT8.45
A Multi-Scale Approach to Decompose a Digital Curve into Meaningful Parts
Nguyen, Thanh Phuong, LORIA
Debled-Rennesson, Isabelle, LORIA – Nancy Univ.