

GTM is non-convex, which means that we cannot be guaranteed to find the global optimum. Further, the GTM suffers from problems associated with mixture models in high-dimensional spaces [59].

2.7 Gaussian Processes

A D-dimensional Gaussian distribution is defined by a D × 1 mean and a D × D covariance matrix. A Gaussian process (GP) is the infinite-dimensional generalization of this distribution, where the mean and covariance are defined not by fixed-size matrices but by a mean function µ(x) and a covariance function k(x, x′), defined over an infinite index set x,

GP(µ(x), k(x, x′)). (2.43)
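As a minimal illustration of Equation (2.43), the sketch below defines a concrete mean function and covariance function in Python. The choices are assumptions for illustration only: a zero mean and a squared exponential (RBF) covariance with hypothetical parameters sigma_f and ell; the text does not commit to any particular pair of functions.

```python
import numpy as np

def mean_fn(x):
    # Mean function mu(x); a zero mean is a common default assumption.
    return np.zeros(len(x))

def rbf_cov(x, x_prime, sigma_f=1.0, ell=1.0):
    # Squared exponential covariance k(x, x') = sigma_f^2 exp(-(x - x')^2 / (2 ell^2)).
    sq_dist = (np.asarray(x)[:, None] - np.asarray(x_prime)[None, :]) ** 2
    return sigma_f ** 2 * np.exp(-0.5 * sq_dist / ell ** 2)
```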

Evaluating a GP over a finite index set reduces the process to a distribution whose dimensionality equals the cardinality of the evaluation set. The covariance function needs to specify a valid covariance matrix when evaluated for any finite subset of its domain; this requires the covariance function to come from the same family of functions as Mercer kernels [41, 45].
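The following sketch makes this reduction concrete: evaluating the GP at a finite index set of five points yields a five-dimensional Gaussian whose covariance matrix is built from the covariance function. The RBF covariance and its length-scale are illustrative assumptions; the positive semi-definiteness check reflects the validity requirement mentioned above.

```python
import numpy as np

# Finite index set: evaluating the GP here reduces it to a
# |X|-dimensional Gaussian distribution (here five-dimensional).
X = np.linspace(0.0, 1.0, 5)

# Covariance matrix from an RBF covariance function (illustrative
# length-scale); any Mercer kernel yields a valid covariance matrix.
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / 0.2 ** 2)

# A valid covariance matrix must be symmetric positive semi-definite.
eigvals = np.linalg.eigvalsh(K)
assert np.all(eigvals >= -1e-10)

# Draw samples from N(0, K); each sample is one draw of the process
# restricted to the finite index set X.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(len(X)), K, size=3)
print(samples.shape)  # (3, 5)
```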

A GP generalizes the concept of a Gaussian distribution to infinite dimensions; this has been exploited in machine learning by applying GPs to specify distributions over infinite objects. One such application is when we are interested in modeling relationships defined over continuous domains, such as functions. If we are interested in modeling a functional relationship f between input domain
