Mathematics in Independent Component Analysis

Chapter 2. Neural Computation 16:1827-1850, 2004

F. Theis

on $I_k$. $h_k$ was chosen to be analytic, and equation 5.4 holds on the open set $I_k$, so it holds on all of $\mathbb{R}$. Applying lemma 3 then shows that either $h_k' \equiv 0$ or

$$h_k'(x) = \pm\exp\left(\frac{c_k}{2}x^2 + d_k x + e_k\right), \qquad x \in \mathbb{R} \tag{5.5}$$

with constants $d_k, e_k \in \mathbb{R}$. By assumption, $h_k$ is bijective, so $h_k' \not\equiv 0$.

Applying the same arguments as above to the inverse system

$$S = A^{-1}\left(h^{-1}\left(W^{-1}X\right)\right)$$

and using the fact that $p_S$ is also somewhere locally constant nonzero shows that equation 5.5 also holds for $(h_k^{-1})'$ with other constants. But if both the derivatives of $h_k$ and $h_k^{-1}$ are of this exponential type, then $c_k = d_k = 0$, and therefore $h_k$ is affine linear for all $k = 1, \ldots, n$, which completes the proof of postnonlinear separability in this special case.
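The step from the two exponential forms to $c_k = d_k = 0$ can be sketched as follows (a reconstruction of the omitted argument, not the paper's own wording):

```latex
% By the inverse function rule,
(h_k^{-1})'(y) \;=\; \frac{1}{h_k'\!\left(h_k^{-1}(y)\right)}
  \;=\; \pm\exp\!\left(-\frac{c_k}{2}\,h_k^{-1}(y)^2 - d_k\,h_k^{-1}(y) - e_k\right).
% If this is also of the form (5.5), with some constants
% \tilde c_k, \tilde d_k, \tilde e_k, then the exponent above must be a
% polynomial of degree at most 2 in y. Checking the asymptotics:
% for c_k \neq 0 one has h_k^{-1}(y) \sim \sqrt{\log |y|}, so the exponent
% behaves like -\tfrac{c_k}{2}\log |y|, which is not a quadratic polynomial;
% for c_k = 0, d_k \neq 0 one has h_k^{-1}(y) \sim \log |y|, again not
% quadratic. Hence c_k = d_k = 0, so h_k' \equiv \pm e^{e_k} is a nonzero
% constant and h_k is affine linear.
```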

Note that in the above proof, local positiveness of the densities was assumed in order to use the equivalence of local separability with the diagonality of the Hessian of the logarithmic density. These results can therefore be generalized using theorem 1 in a similar fashion as in the linear case with theorem 2. In particular, we have proven postnonlinear separability also for uniformly distributed sources.

6 Conclusion<br />

We have shown how to derive the separability of linear BSS using diagonalization of the Hessian of the logarithmic density or, respectively, of the characteristic function. This induces separated, that is, independent, sources. The idea of Hessian diagonalization is put into a new algorithm for performing linear independent component analysis, which is shown to be a local problem. In practice, however, because the densities cannot be approximated well locally, we also propose a diagonalization algorithm that takes the global structure into account. To show the use of this framework of separated functions, we finish with a proof of postnonlinear separability in a special case.

In future work, more general separability results for postnonlinear BSS could be obtained by finding more general solutions of the differential equation 5.3. Algorithmic improvements could be made by using other density approximation methods, such as mixture-of-Gaussians models, or by approximating the Hessian itself using the cumulative density and discrete approximations of the differential. Finally, the diagonalization algorithm can easily be extended to nonlinear situations by finding appropriate model parameterizations; instead of minimizing the mutual information, we minimize the absolute value of the off-diagonal terms of the logarithmic Hessian.
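The diagonality criterion can be illustrated numerically. The following sketch (an illustration under assumed analytic densities, not the paper's estimation procedure; the mixing matrix and source density are hypothetical) checks via finite differences that the Hessian of the log-density of independent sources is diagonal, while a linear mixture produces nonzero off-diagonal terms:

```python
import numpy as np

def log_density_indep(z):
    # log p(s1, s2) = log p1(s1) + log p2(s2): two independent,
    # smooth, non-Gaussian (Laplacian-like) sources, chosen for illustration.
    s1, s2 = z
    return -np.sqrt(1 + s1**2) - np.sqrt(1 + s2**2)

def hessian_fd(f, z, h=1e-4):
    # Central finite-difference Hessian of a scalar function f at z.
    z = np.asarray(z, dtype=float)
    n = z.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            zpp = z.copy(); zpp[i] += h; zpp[j] += h
            zpm = z.copy(); zpm[i] += h; zpm[j] -= h
            zmp = z.copy(); zmp[i] -= h; zmp[j] += h
            zmm = z.copy(); zmm[i] -= h; zmm[j] -= h
            H[i, j] = (f(zpp) - f(zpm) - f(zmp) + f(zmm)) / (4 * h**2)
    return H

A = np.array([[1.0, 0.5], [0.3, 1.0]])   # hypothetical mixing matrix
Ainv = np.linalg.inv(A)

def log_density_mixed(x):
    # Density of X = A S: log p_X(x) = log p_S(A^{-1} x) - log |det A|.
    return log_density_indep(Ainv @ np.asarray(x)) - np.log(abs(np.linalg.det(A)))

x0 = np.array([0.7, -0.4])
H_src = hessian_fd(log_density_indep, x0)
H_mix = hessian_fd(log_density_mixed, x0)

print("off-diagonal term, sources:", abs(H_src[0, 1]))   # essentially zero
print("off-diagonal term, mixture:", abs(H_mix[0, 1]))   # clearly nonzero
```

Minimizing the off-diagonal magnitude over candidate unmixing matrices, averaged over many evaluation points, is the global variant of the criterion described above.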
