Shared Gaussian Process Latent Variables Models - Oxford Brookes ...


Abstract

A fundamental task in machine learning is modeling the relationship between different observation spaces. Dimensionality reduction is the task of reducing the number of dimensions in a parameterization of a data-set. In this thesis we are interested in the intersection of these two tasks: shared dimensionality reduction. Shared dimensionality reduction aims to represent multiple observation spaces within the same model. Previously suggested models have been limited to scenarios where the observations have been generated from the same manifold. In this thesis we present a Gaussian Process Latent Variable Model (GP-LVM) [33] for shared dimensionality reduction that makes no assumptions about the relationship between the observations. Further, we suggest an extension to Canonical Correlation Analysis (CCA) called Non-Consolidating Component Analysis (NCCA). The proposed algorithm extends classical CCA to represent the full variance of the data, as opposed to only the correlated variance. We compare the suggested GP-LVM model to existing models and show results on real-world problems exemplifying the advantages of our approach.
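The limitation of classical CCA that the abstract alludes to can be seen in a small numerical experiment. Below is a minimal sketch (not the thesis's NCCA algorithm) of classical CCA implemented via whitening and an SVD: two views share one latent signal but each also carries private, uncorrelated variation. CCA finds one strongly correlated direction and assigns near-zero correlation to the rest, so the private variance in each view is not represented. The data, noise levels, and helper function here are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# One shared latent signal drives both views; each view also has a
# private dimension of uncorrelated variation.
z = rng.normal(size=(n, 1))
X = np.hstack([z + 0.1 * rng.normal(size=(n, 1)),   # shared dimension
               rng.normal(size=(n, 1))])            # private dimension
Y = np.hstack([z + 0.1 * rng.normal(size=(n, 1)),   # shared dimension
               rng.normal(size=(n, 1))])            # private dimension

def canonical_correlations(X, Y):
    """Classical CCA: canonical correlations are the singular values
    of Qx^T Qy, where Qx, Qy orthonormalize the centered views."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

corrs = canonical_correlations(X, Y)
print(corrs)  # first correlation near 1 (shared signal), second near 0
```

Only the shared signal produces a large canonical correlation; the private dimensions contribute nothing to the CCA representation, which is exactly the variance that an extension like NCCA aims to retain alongside the correlated part.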
