Shared Gaussian Process Latent Variables Models - Oxford Brookes ...


2.4. SPECTRAL DIMENSIONALITY REDUCTION

with the dimensionality of,

rank(X) = rank(XX^T) = rank(G) = rank(D(X)) = d.

In practice, for dimensionality reduction, we want to find a low-dimensional representation of a set of data points, i.e. vectorial data. In this case the Gram matrix G can be constructed directly from the data and a rank-d approximation can be sought, making the conversion step from distance matrix to Gram matrix unnecessary.
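As a minimal sketch of this point (with hypothetical data; NumPy is assumed), the Gram matrix can be formed directly from the data matrix, and its rank recovers the intrinsic dimensionality d, consistent with the rank identity above:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: n = 100 points confined to a d = 2 dimensional
# subspace of a D = 5 dimensional ambient space.
X = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 5))

# Build the Gram matrix directly from the vectorial data,
# skipping the distance-matrix-to-Gram conversion step.
G = X @ X.T

# The (numerical) rank of G recovers the intrinsic dimensionality d.
print(np.linalg.matrix_rank(G))  # → 2
```
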

Principal Component Analysis (PCA) is a dimensionality reduction technique for embedding vectorial data in a dimensionally reduced representation. Given centered vectorial data Y, the covariance matrix S = Y^T Y has diagonal elements representing the variance along each dimension of the data, while the off-diagonal elements measure the linear redundancies between dimensions. The objective of PCA is to find a projection v of the data Y such that the variance of the projected data is maximized,

Objective: argmax_v var(Yv) (2.19)
subject to: v^T v = 1. (2.20)
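The constrained objective above is solved by the leading eigenvector of the covariance matrix S. A short sketch, assuming NumPy and hypothetical data whose variance is deliberately largest along the first axis:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data with per-dimension standard deviations (5, 1, 0.2),
# so the first axis carries most of the variance.
Y = rng.standard_normal((500, 3)) * np.array([5.0, 1.0, 0.2])
Y = Y - Y.mean(axis=0)  # center the data, as the derivation assumes

# Covariance matrix: diagonal = variance along each dimension,
# off-diagonal = linear redundancy between dimensions.
S = Y.T @ Y / Y.shape[0]

# The maximizer of var(Yv) subject to v^T v = 1 is the eigenvector of S
# with the largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(S)  # eigenvalues in ascending order
v = eigvecs[:, -1]                    # principal direction

print(np.var(Y @ v))  # variance of the one-dimensional projection
```

Here `eigh` is used rather than `eig` because S is symmetric; the recovered v aligns (up to sign) with the high-variance first axis.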

This implies finding a projection of the data into a representation resulting in a
