Mathematics in Independent Component Analysis

236 Chapter 16. Proc. ICA 2006, pages 917-925

where the fourth-order cumulant tensor is defined as
\[
\operatorname{cum}(X_i, X_j, X_k, X_l) := E(X_i X_j X_k X_l) - E(X_i X_j)\,E(X_k X_l) - E(X_i X_k)\,E(X_j X_l) - E(X_i X_l)\,E(X_j X_k).
\]
The deviation (ii) from Gaussianity of X_G can simply be measured by kurtosis, which in the case of white X means
\[
\delta_G(X) := \frac{1}{d-n} \sum_{i=n+1}^{d} \bigl| E(X_i^4) - 3 \bigr|.
\]

Altogether, we can therefore define a total model deviation as the weighted sum of the above indices; the weight in the following was chosen experimentally to approximately yield even contributions of the two measures:
\[
\delta(X) = 10\,n(d-n)\,\delta_I(X) + \delta_G(X).
\]

For numerical tests, we generate two different non-Gaussian source data sets, see figure 1(d) and also [4], figure 1. The first source set (I) is an n-dimensional dependent sub-Gaussian random vector given by an isotropic uniform density within the unit disc, and source set (II) is a 2-dimensional dependent super- and sub-Gaussian given by
\[
p(s_1, s_2) \propto \exp(-|s_1|)\,\mathbf{1}_{[c(s_1),\,c(s_1)+1]}(s_2),
\]
where c(s_1) = 0 if |s_1| ≤ ln 2 and c(s_1) = −1 otherwise. Normalization was chosen to guarantee Cov(S_N) = I in advance.
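Since the s_1-marginal of this density is Laplace and s_2 is conditionally uniform on [c(s_1), c(s_1)+1], source set (II) could be sampled as sketched below; the helper is hypothetical and omits the final normalization to unit covariance mentioned above:

```python
import numpy as np

def sample_source_II(T, rng):
    """Draw T samples from the dependent super-/sub-Gaussian density
    p(s1, s2) proportional to exp(-|s1|) * 1_[c(s1), c(s1)+1](s2),
    with c(s1) = 0 if |s1| <= ln 2 and c(s1) = -1 otherwise.
    (Sampling scheme is our illustration, not code from the paper.)
    """
    s1 = rng.laplace(0.0, 1.0, size=T)       # marginal of s1 is Laplace(0, 1)
    c = np.where(np.abs(s1) <= np.log(2), 0.0, -1.0)
    s2 = c + rng.uniform(0.0, 1.0, size=T)   # s2 uniform on [c(s1), c(s1)+1]
    return np.column_stack([s1, s2])
```

Note that P(|s_1| ≤ ln 2) = 1 − e^{−ln 2} = 1/2 under the Laplace marginal, so both branches of c occur equally often.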

In order to test for model violations, we have to find two representations X = AS and X = A′S′ of the same mixtures. After multiplication by A^{−1} we may as before assume that a single representation X = AS is given with X and S both fulfilling the dimension reduction model (1), and we have to show that A_NG = A_GN = 0 if the decomposition is minimal (corollary 1). The latter can be tested numerically by using the so-called normalized cross-error
\[
E(A) := \frac{1}{2n(d-n)} \bigl( \|A_{NG}\|_F^2 + \|A_{GN}\|_F^2 \bigr),
\]
where \|\cdot\|_F is a matrix norm, in our case the Frobenius norm.
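Assuming the first n rows and columns of A index the non-Gaussian block (a convention not spelled out in this excerpt), the cross-error could be computed as follows; this is a sketch, not the authors' implementation:

```python
import numpy as np

def cross_error(A, n):
    """Normalized cross-error of a block-partitioned d x d mixing matrix.

    A is assumed partitioned so that the leading n indices belong to the
    non-Gaussian part; A_NG and A_GN are the two off-diagonal blocks
    whose Frobenius norms the cross-error averages.
    """
    d = A.shape[0]
    A_NG = A[:n, n:]   # non-Gaussian rows, Gaussian columns
    A_GN = A[n:, :n]   # Gaussian rows, non-Gaussian columns
    return (np.linalg.norm(A_NG, 'fro') ** 2 +
            np.linalg.norm(A_GN, 'fro') ** 2) / (2 * n * (d - n))
```

A block-diagonal A yields E(A) = 0, while a matrix supported entirely on the off-diagonal blocks with unit entries yields E(A) = 1, so the index is normalized to [0, 1]-like scale for such extremes.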

In order to reduce the d²-dimensional search space, we may assume after whitening that A ∈ O(d), so only d(d−1)/2 dimensions have to be searched. O(d) can be sampled uniformly, for example, by choosing B with Gaussian i.i.d. coefficients and orthogonalizing via A := (BB^⊤)^{−1/2} B. We perform 10^4 Monte-Carlo runs with random A ∈ O(d). Sources have been generated with T = 10^4 samples, with the n-dimensional non-Gaussian part taken from source sets (I) and (II) above and the remaining (d−n)-dimensional part i.i.d. Gaussian. We measure the model deviation δ(AS) and compare it with the deviation E(A) from block-diagonality.
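The orthogonalization A := (BB^⊤)^{−1/2} B is the orthogonal polar factor of B. A minimal NumPy sketch (our own, with the inverse square root computed by eigendecomposition of the symmetric matrix BB^⊤) might look like this:

```python
import numpy as np

def random_orthogonal(d, rng):
    """Sample A uniformly from O(d) by orthogonalizing a matrix B of
    i.i.d. standard Gaussian entries via A = (B B^T)^{-1/2} B."""
    B = rng.standard_normal((d, d))
    # symmetric inverse square root of B B^T: eigendecompose, invert
    # the square roots of the (almost surely positive) eigenvalues
    w, V = np.linalg.eigh(B @ B.T)
    inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return inv_sqrt @ B
```

By construction A A^⊤ = (BB^⊤)^{−1/2} (BB^⊤) (BB^⊤)^{−1/2} = I, so the result is orthogonal for any almost-surely invertible B.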

The results for varying parameters are given in figure 1(a-c). In all three cases we observe that the smaller the model deviation, the smaller also the cross-error. This gives an asymptotic confirmation of corollary 1, indicating that no non-uniqueness realizations were found by random sampling.

3 Conclusion<br />

By minimality of the decomposition (1), we gave a necessary condition for the uniqueness of non-Gaussian subspace analysis. Together with the assumption of existing covariance, this was already sufficient to guarantee model uniqueness. Our result allows NGSA algorithms to find the unknown, unique signal space within a noisy high-dimensional data set [6].
