

Note that E represents a new approximate measure of independence. Therefore, the linear BSS algorithm can now be readily generalized to nonlinear situations by finding an appropriate parameterization of the possibly nonlinear separating model.

The proposed algorithm basically performs a global diagonalization of the logarithmic Hessian after prewhitening. Interestingly, this is similar to traditional BSS algorithms based on joint diagonalization, such as JADE (Cardoso & Souloumiac, 1993) using cumulant matrices, or AMUSE (Tong, Liu, Soon, & Huang, 1991) and SOBI (Belouchrani, Abed-Meraim, Cardoso, & Moulines, 1997) employing time decorrelation. Instead of using a global energy function as proposed above, we could therefore also jointly diagonalize a given set of Hessians (respectively, separator matrices, as above; see also Yeredor, 2000). Another relation to previously proposed ICA algorithms lies in the kernel approximation technique. Gaussian or generalized Gaussian kernels have already been used in the field of independent component analysis to model the source densities (Lee & Lewicki, 2000; Habl, Bauer, Puntonet, Rodriguez-Alvarez, & Lang, 2001), thus giving an estimate of the score function used in Bell-Sejnowski-type semiparametric algorithms (Bell & Sejnowski, 1995) or enabling direct separation using a maximum likelihood parameter estimation. Our algorithm also uses density approximation, but employs it for the mixture density, which can be problematic in higher dimensions. A different approach not involving density approximation is a direct sample-based Hessian estimation similar to Lin (1998).
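
To make this connection concrete, the following is a minimal numerical sketch, not the implementation described in the text: after prewhitening, the Hessian of a Gaussian kernel estimate of the log mixture density is computed at a few sample points by finite differences, and the collected Hessians are approximately jointly diagonalized via the eigenvectors of a generic linear combination. All function names, the KDE choice, and the finite-difference step size are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def whiten(x):
    """Center and whiten the mixtures x (shape: dim x samples)."""
    xc = x - x.mean(axis=1, keepdims=True)
    d, e = np.linalg.eigh(np.cov(xc))
    v = e @ np.diag(d ** -0.5) @ e.T          # symmetric whitening matrix
    return v @ xc, v

def log_density_hessian(logpdf, point, eps=1e-2):
    """Finite-difference Hessian of a log-density at a single point."""
    n = point.size
    h = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            p = np.tile(point[:, None], 4)    # four perturbed copies of the point
            p[i, 0] += eps; p[j, 0] += eps
            p[i, 1] += eps; p[j, 1] -= eps
            p[i, 2] -= eps; p[j, 2] += eps
            p[i, 3] -= eps; p[j, 3] -= eps
            f = logpdf(p)
            h[i, j] = (f[0] - f[1] - f[2] + f[3]) / (4 * eps ** 2)
    return h

def hessian_bss(x, n_points=20, seed=0):
    """Rough separation of linear mixtures x via log-density Hessian diagonalization."""
    z, v = whiten(x)
    kde = gaussian_kde(z)                     # kernel approximation of the mixture density
    logpdf = lambda pts: np.log(kde(pts) + 1e-300)
    rng = np.random.default_rng(seed)
    idx = rng.choice(z.shape[1], size=n_points, replace=False)
    hessians = [log_density_hessian(logpdf, z[:, k]) for k in idx]
    # After whitening, every Hessian has the form R^T D R for one orthogonal R,
    # so the eigenvectors of a generic linear combination recover that common R.
    combo = sum(rng.standard_normal() * h for h in hessians)
    _, w = np.linalg.eigh(combo)
    return w.T @ z, w.T @ v                   # estimated sources and unmixing matrix
```

The last step exploits the fact that, for whitened linear mixtures, all logarithmic Hessians share a single orthogonal diagonalizer; any generic linear combination of them therefore has that diagonalizer as its eigenbasis, which is the joint-diagonalization idea referred to above.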

5 Separability of Postnonlinear BSS

In this section, we show how to use the idea of Hessian diagonalization in order to give separability proofs in nonlinear situations, more precisely, in the setting of postnonlinear BSS. After stating the postnonlinear BSS model and the general (to the knowledge of the author, not yet proven) separability theorem, we will prove postnonlinear separability in the case of random vectors with distributions that are somewhere locally constant and nonzero (e.g., uniform distributions). A possible proof of postnonlinear separability has been suggested by Taleb and Jutten (1999); however, the proof applies only to densities with at least one zero and furthermore contains an error rendering the proof applicable only to restricted situations.

Definition 3. A function f : R^n → R^n is called diagonal if each component f_i(x) of f(x) depends only on the variable x_i.

In this case, we often omit the other variables and write f(x_1, ..., x_n) = (f_1(x_1), ..., f_n(x_n)); so f ≡ f_1 × ··· × f_n, where × denotes the Cartesian product.
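
As a concrete illustration of Definition 3, here is a short sketch that builds a diagonal map f = f_1 × ··· × f_n and uses it to generate postnonlinear observations x = f(As); the particular nonlinearities and mixing matrix are arbitrary examples, not taken from the text.

```python
import numpy as np

def diagonal_map(components):
    """Build f = f1 x ... x fn from the componentwise functions f1, ..., fn."""
    def f(x):                                   # x has shape (n, samples)
        return np.vstack([fi(x[i]) for i, fi in enumerate(components)])
    return f

# Two independent uniform sources (locally constant, nonzero densities),
# a linear mixing matrix A, and a diagonal invertible nonlinearity f.
rng = np.random.default_rng(0)
s = rng.uniform(-1.0, 1.0, size=(2, 1000))
a = np.array([[1.0, 0.5],
              [0.3, 1.0]])
f = diagonal_map([np.tanh, lambda u: u + 0.1 * u ** 3])
x = f(a @ s)                                    # postnonlinear mixtures x = f(As)
```

The two example component functions are invertible, matching the postnonlinear setting, and the uniform sources echo the locally constant densities used in the separability result of this section.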
