Abstract book (pdf) - ICPR 2010

15:00-17:10, Paper MoBT9.38
Manifold Modeling with Learned Distance in Random Projection Space for Face Recognition
Tsagkatakis, Grigorios, Rochester Inst. of Tech.
Savakis, Andreas, Rochester Inst. of Tech.

In this paper, we propose the combination of manifold learning and distance metric learning for the generation of a representation that is both discriminative and informative, and we demonstrate that this approach is effective for face recognition. Initial dimensionality reduction is achieved using random projections, a computationally efficient and data-independent linear transformation. Distance metric learning is then applied to increase the separation between classes and improve the accuracy of nearest neighbor classification. Finally, a manifold learning method is used to generate a mapping between the randomly projected data and a low-dimensional manifold. Face recognition results suggest that the combination of distance metric learning and manifold learning can increase performance. Furthermore, random projections can be applied as an initial step without significantly affecting the classification accuracy.
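
As a loose illustration of the random projection step and plain nearest-neighbor classification, a minimal NumPy sketch follows (the projection dimension, the Gaussian entries, and the 1-NN rule are assumptions; the learned distance metric and the manifold mapping of the paper are not reproduced here):

    import numpy as np

    def random_projection(X, k, seed=0):
        # Data-independent linear map: entries drawn i.i.d. from N(0, 1/k),
        # which approximately preserves pairwise distances (Johnson-Lindenstrauss).
        rng = np.random.default_rng(seed)
        R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(X.shape[1], k))
        return X @ R

    def nearest_neighbor_predict(X_train, y_train, X_test):
        # 1-NN with plain Euclidean distance in the projected space; the
        # paper instead learns a distance metric here to separate classes.
        d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        return y_train[d2.argmin(axis=1)]

    # Toy usage: project 1024-dim "face" vectors down to 64 dimensions
    # (same seed, so train and test share one projection matrix).
    X_train = np.random.rand(100, 1024); y_train = np.random.randint(0, 5, 100)
    X_test = np.random.rand(10, 1024)
    Z_train, Z_test = random_projection(X_train, 64), random_projection(X_test, 64)
    pred = nearest_neighbor_predict(Z_train, y_train, Z_test)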

15:00-17:10, Paper MoBT9.39
Part Detection, Description and Selection based on Hidden Conditional Random Fields
Lu, Wenhao, Tsinghua Univ.
Wang, Shengjin, Tsinghua Univ.
Ding, Xiaoqing, Tsinghua Univ.

In this paper, the problem of part detection, description, and selection is discussed. This problem is crucial to the learning algorithms of part-based models, but is not solved well when some candidate parts are extracted from the background. This paper studies this problem and introduces a new algorithm, HCRF-PS (Hidden Conditional Random Fields for Part Selection), for part detection and description, and especially for part selection. Our algorithm is distinguished by its ability to optimize multiple kinds of information at the same time, including texture, color, location, and part label. Finally, experiments with the HCRF-PS algorithm give good results on both synthetic and real data.
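
The abstract does not spell out the HCRF machinery, so purely as a hypothetical illustration of combining several cues to score candidate parts and discard low-scoring (e.g. background) candidates, here is a NumPy sketch; the cue features, the weights, and the linear scoring rule are all assumptions, not the authors' model:

    import numpy as np

    def score_parts(texture, color, location, weights):
        # Combine per-candidate cue values into a single part score.
        # In an HCRF these would be potentials over hidden part labels,
        # optimized jointly; here we just take a weighted linear sum.
        cues = np.stack([texture, color, location], axis=1)
        return cues @ weights

    def select_parts(scores, n_parts):
        # Keep the n_parts highest-scoring candidates, discarding the
        # rest (e.g. candidates that came from the background).
        return np.argsort(scores)[::-1][:n_parts]

    # Toy usage with 20 candidate parts and hand-set cue weights.
    rng = np.random.default_rng(0)
    texture, color, location = rng.random((3, 20))
    scores = score_parts(texture, color, location, np.array([0.5, 0.3, 0.2]))
    chosen = select_parts(scores, n_parts=6)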

15:00-17:10, Paper MoBT9.40
Boosting Bayesian MAP Classification
Piro, Paolo, CNRS/Univ. of Nice-Sophia Antipolis
Nock, Richard, Univ. des Antilles et de la Guyane
Nielsen, Frank, Ec. Pol.
Barlaud, Michel, CNRS/Univ. of Nice-Sophia Antipolis

In this paper we redefine and generalize the classic k-nearest neighbors (k-NN) voting rule in a Bayesian maximum a posteriori (MAP) framework. To this end, annotated examples are used for estimating pointwise class probabilities in the feature space, thus giving rise to a new instance-based classification rule. Namely, we propose to "boost" the classic k-NN rule by inducing a strong classifier from a combination of sparse training data, called "prototypes". In order to learn these prototypes, our MapBoost algorithm globally minimizes a multiclass exponential risk defined over the training data, which depends on the class probabilities estimated at the sample points themselves. We tested our method for image categorization on three benchmark databases. Experimental results show that MapBoost significantly outperforms classic k-NN (up to 8%). Interestingly, due to the supervised selection of sparse prototypes and the multiclass classification framework, the accuracy improvement is obtained with a considerable reduction in computational cost.
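
A minimal sketch of the underlying k-NN MAP voting rule follows: estimate a pointwise class posterior from the k nearest annotated examples, then take the argmax. The prototype learning and the multiclass exponential-risk minimization of MapBoost itself are not shown, and the value of k and the Euclidean distance are assumptions:

    import numpy as np

    def knn_map_predict(X_train, y_train, X_test, k=5, n_classes=None):
        # Estimate p(class | x) at each test point from the class
        # frequencies of its k nearest neighbors, then take the MAP class.
        if n_classes is None:
            n_classes = y_train.max() + 1
        d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        knn_idx = np.argsort(d2, axis=1)[:, :k]
        preds = np.empty(len(X_test), dtype=int)
        for i, idx in enumerate(knn_idx):
            posterior = np.bincount(y_train[idx], minlength=n_classes) / k
            preds[i] = posterior.argmax()
        return preds

    # Toy usage on random 10-dim features with 4 classes.
    X_train = np.random.rand(200, 10); y_train = np.random.randint(0, 4, 200)
    preds = knn_map_predict(X_train, y_train, np.random.rand(20, 10), k=7)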

15:00-17:10, Paper MoBT9.41
Weighting of the K-Nearest-Neighbors
Chernoff, Konstantin, Univ. of Copenhagen
Nielsen, Mads

This paper presents two distribution-independent weighting schemes for k-Nearest-Neighbors (kNN). Applying the first scheme in a Leave-One-Out (LOO) setting corresponds to performing complete b-fold cross validation (b-CCV), while applying the second scheme corresponds to performing bootstrapping in the limit of infinite iterations. We demonstrate that the soft kNN errors obtained through b-CCV can be reproduced by applying the weighted kNN in a LOO setting, and that the proposed weighting schemes can decrease the variance and improve the generalization of kNN in a CV setting.
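
A minimal sketch of evaluating a weighted kNN in a LOO setting follows; the inverse-rank weights below are a placeholder assumption, whereas the paper derives distribution-independent weights matching b-CCV and bootstrapping:

    import numpy as np

    def loo_weighted_knn_error(X, y, k=5, weights=None):
        # Leave-one-out: classify each sample from its k nearest
        # neighbors among the remaining samples, with per-rank weights.
        n = len(X)
        if weights is None:
            weights = 1.0 / np.arange(1, k + 1)   # placeholder rank weights
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        np.fill_diagonal(d2, np.inf)              # exclude the left-out point
        errors = 0
        for i in range(n):
            idx = np.argsort(d2[i])[:k]
            votes = np.zeros(y.max() + 1)
            for rank, j in enumerate(idx):
                votes[y[j]] += weights[rank]
            errors += votes.argmax() != y[i]
        return errors / n

    # Toy usage on random 4-dim features with 3 classes.
    X = np.random.rand(60, 4); y = np.random.randint(0, 3, 60)
    err = loo_weighted_knn_error(X, y, k=7)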
