Abstract book (pdf) - ICPR 2010

13:30-16:30, Paper ThBCT8.52
Combining Single Class Features for Improving Performance of a Two Stage Classifier
Cordella, Luigi P., Univ. di Napoli Federico II
De Stefano, Claudio, Univ. of Cassino
Fontanella, Francesco, Univ. of Cassino
Marrocco, Cristina, Univ. of Cassino
Scotto Di Freca, Alessandra, Univ. of Cassino

We propose a feature selection-based approach for improving the classification performance of a two-stage classification system in contexts where a large number of features is involved. A problem with a set of N classes is subdivided into a set of N two-class problems. In each problem, a GA-based feature selection algorithm is used to find the best subset of features. These subsets are then used to train N classifiers. In the classification phase, unknown samples are given as input to each of the trained classifiers using the corresponding feature subspace. In case of conflicting responses, the sample is sent to a suitably trained supplementary classifier. The proposed approach has been tested on a real-world dataset containing hyperspectral image data. The results compare favourably with those obtained by other methods on the same data.
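Below is a minimal sketch of the two-stage scheme the abstract describes, assuming a scikit-learn setup: each class gets its own two-class problem and its own feature subset, and conflicting responses fall back to a supplementary classifier. SelectKBest stands in for the paper's GA-based selector, and the dataset, classifier choices and conflict rule are illustrative assumptions, not the authors' configuration.

```python
# Hedged sketch of the two-stage scheme: SelectKBest stands in for the
# paper's GA-based feature selector; dataset and classifiers are assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import LinearSVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
classes = np.unique(y_tr)

# Stage 1: one two-class (one-vs-rest) problem per class, each with its own
# feature subset selected on that binary problem only.
selectors, experts = {}, {}
for c in classes:
    y_bin = (y_tr == c).astype(int)
    sel = SelectKBest(f_classif, k=20).fit(X_tr, y_bin)  # stand-in for the GA
    clf = LinearSVC(dual=False).fit(sel.transform(X_tr), y_bin)
    selectors[c], experts[c] = sel, clf

# Stage 2: a supplementary classifier, trained on all features, resolves
# samples on which the binary experts conflict (zero or several say "yes").
backup = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

def predict(X):
    votes = np.stack([experts[c].predict(selectors[c].transform(X)) for c in classes], axis=1)
    out = np.empty(len(X), dtype=classes.dtype)
    for i, row in enumerate(votes):
        hits = classes[row == 1]
        out[i] = hits[0] if len(hits) == 1 else backup.predict(X[i:i + 1])[0]
    return out

print("accuracy:", (predict(X_te) == y_te).mean())
```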

13:30-16:30, Paper ThBCT8.53
The Rex Leopold II Model: Application of the Reduced Set Density Estimator to Human Categorization
De Schryver, Maarten, Ghent Univ.
Roelstraete, Bjorn, Ghent Univ.

Reduction techniques are important tools in machine learning and pattern recognition. In this article, we demonstrate how a kernel-based density estimator can be used as a tool for understanding human category representation. Despite the dominance of exemplar models of categorization, there is still ambiguity about the number of exemplars stored in memory. Here, we illustrate that categorization performance is not affected when exemplars are omitted.
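The Reduced Set Density Estimator itself learns a sparse set of kernel weights by solving a quadratic program; the sketch below only illustrates the underlying point with a plain Gaussian KDE on synthetic data, comparing the density built from all exemplars with one built from a small random subset. All data and parameters are assumptions for illustration.

```python
# Minimal illustration (not the RSDE itself): a Gaussian KDE over all
# "exemplars" versus one over a reduced subset, showing the estimated
# category density changes little when many exemplars are omitted.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
exemplars = np.concatenate([rng.normal(-2.0, 0.7, 300), rng.normal(1.5, 1.0, 300)])

full_kde = gaussian_kde(exemplars)
subset = rng.choice(exemplars, size=60, replace=False)  # keep 10% of exemplars
reduced_kde = gaussian_kde(subset)

grid = np.linspace(-5, 5, 200)
gap = np.max(np.abs(full_kde(grid) - reduced_kde(grid)))
print(f"max pointwise gap between full and reduced densities: {gap:.4f}")
```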

13:30-16:30, Paper ThBCT8.54
A Hybrid Method for Feature Selection based on Mutual Information and Canonical Correlation Analysis
Sakar, Cemal Okan, Bahcesehir Univ.
Kursun, Olcay, Istanbul Univ.

Mutual Information (MI) is a classical and widely used dependence measure that can generally serve as a good feature selection criterion. However, under-sampled classes or rare but certain relations are overlooked by this measure, which can result in missing relevant features that could be very predictive of variables of interest, such as certain phenotypes or disorders in biomedical research, rare but dangerous factors in ecology, or intrusions in network systems. On the other hand, Kernel Canonical Correlation Analysis (KCCA) is a nonlinear correlation measure effectively used to detect independence, but its use for feature selection or ranking is limited because its formulation is not intended to measure the amount of information (entropy) in the dependence. In this paper, we propose Predictive Mutual Information (PMI), a hybrid measure of relevance that is not only based on MI but also accounts for the predictability of signals from one another, as in KCCA. We show that PMI has better feature detection capability than MI and KCCA, especially in catching suspicious coincidences that are rare but potentially important, not only for subsequent experimental studies but also for building computational predictive models, as demonstrated on two toy datasets and a real intrusion detection system dataset.
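As a point of reference for the discussion above, the sketch below ranks features with plain mutual information (scikit-learn's mutual_info_classif) on a synthetic, class-imbalanced problem; the authors' PMI additionally incorporates a KCCA-style predictability term, which is not reproduced here. The dataset and its imbalance ratio are illustrative assumptions.

```python
# MI-based feature ranking, the baseline the abstract starts from; PMI's
# KCCA-style predictability term is not included in this sketch.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

# Synthetic data: 5 informative features out of 20, with imbalanced classes
# to mimic the "under-sampled class" situation the abstract warns about.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           weights=[0.97, 0.03], random_state=0)

mi = mutual_info_classif(X, y, random_state=0)
ranking = np.argsort(mi)[::-1]
print("features ranked by MI:", ranking[:10])
print("MI scores:", np.round(mi[ranking[:10]], 3))
```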

13:30-16:30, Paper ThBCT8.55
Speech Magnitude-Spectrum Information-Entropy (MSIE) for Automatic Speech Recognition in Noisy Environments
Nolazco-Flores, Juan A., Inst. Tecnológico y de Estudios Superiores de Monterrey
Aceves-López, Roberto A., Inst. Tecnológico y de Estudios Superiores de Monterrey
García-Perera, L. Paola, Inst. Tecnológico y de Estudios Superiores de Monterrey

The Magnitude-Spectrum Information-Entropy (MSIE) of the speech signal is presented as an alternative representation of speech that can be used to mitigate the mismatch between training and testing conditions. The speech magnitude spectrum is considered as a random variable from which entropy coefficients can be calculated for each frame. By concatenating these entropy coefficients to the corresponding MFCC vector, then calculating the dynamic coefficients, and

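A minimal sketch of the per-frame spectral entropy computation described above: each frame's magnitude spectrum is normalised into a probability distribution and its entropy becomes an extra coefficient to append to that frame's MFCC vector. The framing parameters and the use of a single entropy coefficient per frame are illustrative assumptions, not the paper's exact configuration.

```python
# Hedged sketch: per-frame entropy of the normalised magnitude spectrum.
# Frame length, hop and windowing are assumptions for illustration.
import numpy as np

def spectral_entropy_per_frame(signal, frame_len=400, hop=160, eps=1e-12):
    """One entropy value per frame, computed from the magnitude spectrum."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    entropies = np.empty(n_frames)
    window = np.hamming(frame_len)
    for i in range(n_frames):
        frame = signal[i * hop:i * hop + frame_len] * window
        mag = np.abs(np.fft.rfft(frame))
        p = mag / (mag.sum() + eps)          # treat the spectrum as a distribution
        entropies[i] = -np.sum(p * np.log(p + eps))
    return entropies

# Example: a noisy tone sampled at 16 kHz; these entropy coefficients would be
# concatenated with the corresponding MFCC vectors as the abstract describes.
rng = np.random.default_rng(0)
t = np.arange(16000) / 16000.0
x = np.sin(2 * np.pi * 440 * t) + 0.1 * rng.standard_normal(t.size)
print(spectral_entropy_per_frame(x)[:5])
```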