
Abstract book (pdf) - ICPR 2010


This paper proposes an incremental scheme for visual landmark learning and recognition. The feature selection stage characterises the landmark using Opponent SIFT, a colour-based variant of the SIFT descriptor. To reduce the dimensionality of this descriptor, an incremental non-parametric discriminant analysis is conducted to seek directions for efficient discrimination (incremental eigenspace learning). The classification stage, in turn, uses the incremental evolving clustering method (ECM) to group feature vectors into a set of clusters (incremental prototype learning). The final classification is then conducted with a k-nearest-neighbour approach whose prototypes are updated by the ECM. This global scheme enables a classifier to learn incrementally, on-line, and in one pass. Moreover, the ECM reduces memory and computation expenses. Experimental results show that the proposed recognition system is well suited for use by an autonomous mobile robot.
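The ECM-plus-k-NN pipeline described above can be sketched as follows. This is a simplified, hypothetical illustration of evolving-style prototype clustering followed by nearest-prototype classification, not the authors' exact ECM update; the threshold `dthr` and the halfway-centre update are illustrative assumptions.

```python
import numpy as np

def ecm_update(prototypes, radii, labels, x, y, dthr=1.0):
    """One step of a simplified evolving-clustering update (illustrative
    sketch, not the exact ECM): assign x to the nearest prototype of its
    class if it falls within the larger of that prototype's radius and
    dthr, updating the centre and radius; otherwise spawn a prototype."""
    best, best_d = None, np.inf
    for i, (p, lab) in enumerate(zip(prototypes, labels)):
        if lab != y:
            continue
        d = np.linalg.norm(x - p)
        if d < best_d:
            best, best_d = i, d
    if best is not None and best_d <= max(radii[best], dthr):
        # Pull the matched prototype halfway towards the new sample.
        prototypes[best] = 0.5 * (prototypes[best] + x)
        radii[best] = max(radii[best], best_d / 2.0)
    else:
        # No sufficiently close prototype of this class: create one.
        prototypes.append(x.copy())
        radii.append(0.0)
        labels.append(y)
    return prototypes, radii, labels

def knn_classify(prototypes, labels, x):
    """1-NN classification over the learned prototypes."""
    dists = [np.linalg.norm(x - p) for p in prototypes]
    return labels[int(np.argmin(dists))]
```

Because each sample either merges into an existing prototype or creates a new one, the classifier sees each sample exactly once (one-pass, on-line) and the k-NN stage queries only the compact prototype set rather than all training data.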

13:30-16:30, Paper ThBCT8.27
Subspace Methods with Globally/Locally Weighted Correlation Matrix
Yamashita, Yukihiko, Tokyo Inst. of Tech.
Wakahara, Toru, Hosei Univ.

The discriminant function of a subspace method is built from correlation matrices that reflect the averaged features of a category. As a result, it may not work well on unknown input patterns that are far from the average. To address this problem, we propose two kinds of weighted correlation matrices for subspace methods. The globally weighted correlation matrix (GWCM) attaches importance to training patterns that are far from the average, so it can reflect the distribution of patterns around the category boundary more precisely. The computational cost of a subspace method using GWCMs is almost the same as that using ordinary correlation matrices. The locally weighted correlation matrix (LWCM) attaches importance to training patterns that are near to the input pattern to be classified, so it can reflect the distribution of training patterns around the input pattern in more detail. The computational cost of a subspace method with LWCMs at the recognition stage does not depend on the number of training patterns, whereas those of the conventional adaptive local and nonlinear subspace methods do. We show the advantages of the proposed methods by experiments on the MNIST database of handwritten digits.
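The globally weighted idea can be sketched numerically: build a correlation (autocorrelation) matrix in which each training pattern's outer product is weighted up with its distance from the category mean, then score an input by the energy of its projection onto the matrix's leading eigenvectors. The weighting formula and the parameter `alpha` here are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def weighted_correlation_matrix(X, mean, alpha=1.0):
    """Globally weighted correlation matrix sketch: pattern x_i
    contributes w_i * x_i x_i^T, with w_i growing with the distance of
    x_i from the category mean, so boundary-region patterns dominate.
    (Illustrative weighting, not the authors' exact formula.)"""
    d = np.linalg.norm(X - mean, axis=1)
    w = 1.0 + alpha * d / (d.max() + 1e-12)   # far patterns weighted up
    return (X.T * w) @ X / w.sum()            # sum_i w_i x_i x_i^T / sum_i w_i

def subspace_similarity(C, x, k=2):
    """Similarity of x to the subspace spanned by the top-k eigenvectors
    of C: squared projection norm relative to the squared input norm."""
    _, vecs = np.linalg.eigh(C)               # eigenvalues ascending
    U = vecs[:, -k:]                          # leading eigenvectors
    proj = U.T @ x
    return float(proj @ proj) / float(x @ x)
```

At recognition time only the fixed eigenvector basis is used, which is why the cost matches that of an ordinary subspace method: the weighting affects training only.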

13:30-16:30, Paper ThBCT8.28
The Binormal Assumption on Precision-Recall Curves
Brodersen, Kay Henning, ETH Zurich
Ong, Cheng Soon, ETH Zurich
Stephan, Klaas Enno, Univ. of Zurich
Buhmann, Joachim M., Swiss Federal Inst. of Tech. Zurich

The precision-recall curve (PRC) has become a widespread conceptual basis for assessing classification performance. The curve relates the positive predictive value of a classifier to its true positive rate and often provides a useful alternative to the well-known receiver operating characteristic (ROC). The empirical PRC, however, turns out to be a highly imprecise estimate of the true curve, especially in the case of a small sample size and class imbalance in favour of negative examples. Ironically, this situation tends to occur precisely in those applications where the curve would be most useful, e.g., in anomaly detection or information retrieval. Here, we propose to estimate the PRC on the basis of a simple distributional assumption about the decision values that generalizes the established binormal model for estimating smooth ROC curves. Using simulations, we show that our approach outperforms empirical estimates, and that an account of the class imbalance is crucial for obtaining unbiased PRC estimates.
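The core of the binormal route can be sketched directly: fit a Gaussian to the positive and negative decision values, then derive recall and precision analytically over a sweep of thresholds, with the class prior entering the precision formula. This is a sketch of the general idea in the abstract, not the authors' estimator; the threshold grid and fitting by sample moments are assumptions.

```python
import numpy as np
from math import erf, sqrt

def _phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def binormal_prc(pos_scores, neg_scores, pi, n_points=200):
    """Smooth PRC under a binormal assumption (illustrative sketch):
    decision values of each class are modelled as Gaussian; pi is the
    class prior P(positive), which controls the precision estimate."""
    mu_p, sd_p = np.mean(pos_scores), np.std(pos_scores, ddof=1)
    mu_n, sd_n = np.mean(neg_scores), np.std(neg_scores, ddof=1)
    lo = min(mu_p - 4 * sd_p, mu_n - 4 * sd_n)
    hi = max(mu_p + 4 * sd_p, mu_n + 4 * sd_n)
    recall, precision = [], []
    for t in np.linspace(lo, hi, n_points):
        tpr = 1.0 - _phi((t - mu_p) / sd_p)   # recall at threshold t
        fpr = 1.0 - _phi((t - mu_n) / sd_n)
        denom = pi * tpr + (1 - pi) * fpr
        if denom > 1e-12:                     # skip degenerate thresholds
            recall.append(tpr)
            precision.append(pi * tpr / denom)
    return recall, precision
```

Note how `pi` appears explicitly in the precision formula: this is the "account of the class imbalance" the abstract identifies as crucial, since precision, unlike the ROC, depends on the prior.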

13:30-16:30, Paper ThBCT8.29
Incremental Training of Multiclass Support Vector Machines
Nikitidis, Symeon, Centre for Res. and Tech. Hellas
Nikolaidis, Nikos, Aristotle Univ. of Thessaloniki
Pitas, Ioannis, -

We present a new method for the incremental training of multiclass Support Vector Machines that provides computational efficiency for training problems in which the training data collection is sequentially enriched and dynamic adaptation of the classifier is required. An auxiliary function incorporating the desired characteristics is designed to provide an upper bound of the objective function that summarizes the multiclass classification task, and the global minimizer for the enriched dataset is found using a warm-start algorithm, since faster convergence is expected when starting from the previous global minimum. Experimental evidence on two data collections verifies that our method is faster than retraining the classifier from scratch, while the achieved classification accuracy is maintained at the same level.
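The warm-start principle can be illustrated with a simple multiclass linear SVM solver: when new data arrive, the optimizer restarts from the previously learned weights instead of from zero. The subgradient solver below (Crammer-Singer hinge loss) is an illustrative stand-in for the authors' auxiliary-function method; `lam`, `lr`, and `epochs` are assumed hyperparameters.

```python
import numpy as np

def train_multiclass_svm(X, y, n_classes, W0=None, lam=0.01, lr=0.1, epochs=50):
    """Multiclass linear SVM via subgradient descent on the
    Crammer-Singer hinge loss. Warm start: pass the previous weight
    matrix as W0 when the dataset has been enriched, so optimization
    resumes near the old minimum instead of from zero. (Illustrative
    solver, not the paper's auxiliary-function algorithm.)"""
    n, d = X.shape
    W = np.zeros((n_classes, d)) if W0 is None else W0.copy()
    for _ in range(epochs):
        for i in range(n):
            scores = W @ X[i]
            margins = scores - scores[y[i]] + 1.0
            margins[y[i]] = 0.0
            r = int(np.argmax(margins))       # most violating rival class
            W *= (1.0 - lr * lam)             # regularisation shrinkage
            if margins[r] > 0:                # margin violated: update
                W[y[i]] += lr * X[i]
                W[r] -= lr * X[i]
    return W

def predict(W, X):
    """Assign each row of X to the class with the highest score."""
    return np.argmax(X @ W.T, axis=1)
```

With separable data the warm-started run begins with no margin violations, so subsequent epochs cost only the shrinkage step, which is the intuition behind the speed-up over retraining from scratch.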
