Mathematics in Independent Component Analysis


[Figure 1.8: box plots of the crosstalking error (scale 0–4) for FastICA, JADE and Extended Infomax]

Figure 1.8: Applying ICA to a random vector $x = As$ that does not fulfill the ICA model; here $s$ is chosen to consist of a two-dimensional and a one-dimensional irreducible component. Shown are the statistics over 100 runs of the Amari error between the random original and the reconstructed mixing matrix using the three ICA algorithms FastICA, JADE and Extended Infomax. Clearly, the original mixing matrix could not be reconstructed in any of the experiments. However, interestingly, the latter two algorithms do indeed find an ISA up to permutation, which can be explained by theorem 1.3.2.
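The Amari error itself is not defined in this excerpt; one common normalization of the Amari performance index, which is presumably what is used here, reads as follows. Writing $P = (p_{ij}) = \hat{A}^{-1}A$ for the true mixing matrix $A$ and its estimate $\hat{A}$,

$$d(A,\hat{A}) = \frac{1}{2n}\sum_{i=1}^{n}\left(\frac{\sum_{j=1}^{n}|p_{ij}|}{\max_{j}|p_{ij}|}-1\right) + \frac{1}{2n}\sum_{j=1}^{n}\left(\frac{\sum_{i=1}^{n}|p_{ij}|}{\max_{i}|p_{ij}|}-1\right),$$

which vanishes exactly when $P$ is a scaled permutation matrix, i.e. when $\hat{A}$ equals $A$ up to permutation and scaling.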

The sources $s(t)$ are assumed to be $k$-independent, so $p_s$ factorizes into $r$ groups depending on $k$ separate variables each. Thus $\ln p_s$ is a sum of functions each depending on $k$ separate variables; mixed second derivatives across different groups therefore vanish, so the Hessian $H_{\ln p_s}(s_0)$ is $k$-block-diagonal. Hessian ISA now simply uses the block-diagonality structure from equation (1.11) and performs JBD of estimates of a set of Hessians $H_{\ln p_s}(s_i)$ evaluated at different sampling points $s_i$. This corresponds to using the HessianICA source condition from table 1.1. Other source conditions, such as contracted quadricovariance matrices (Cardoso and Souloumiac, 1993), can also be used in this extended framework (Theis, 2007).
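To make the source condition concrete, here is a minimal numerical sketch, assuming a Gaussian kernel density estimate as the density model; the helper hessian_log_density, the step size h and the sample construction are illustrative choices, not taken from the text. It approximates $H_{\ln \hat{p}_s}(s_i)$ by central finite differences; for a two-plus-one group structure, the entries coupling the first two coordinates to the third vanish up to estimation error, which is exactly the structure a JBD of several such Hessians would exploit.

    import numpy as np
    from scipy.stats import gaussian_kde

    def hessian_log_density(kde, x0, h=0.1):
        # Central finite-difference Hessian of ln p at x0, where p is a
        # fitted scipy.stats.gaussian_kde; h is an illustrative step size.
        d = x0.size
        f = lambda x: np.log(kde(x.reshape(-1, 1))[0])
        H = np.empty((d, d))
        for a in range(d):
            for b in range(d):
                ea, eb = np.eye(d)[a], np.eye(d)[b]
                H[a, b] = (f(x0 + h*ea + h*eb) - f(x0 + h*ea - h*eb)
                           - f(x0 - h*ea + h*eb) + f(x0 - h*ea - h*eb)) / (4*h*h)
        return H

    # 2-independent toy sources: a 2-d irreducible component (uniform on
    # the unit circle) and an independent 1-d uniform component.
    rng = np.random.default_rng(0)
    phi = rng.uniform(0.0, 2.0 * np.pi, 5000)
    s = np.vstack([np.cos(phi), np.sin(phi), rng.uniform(-1.0, 1.0, 5000)])
    kde = gaussian_kde(s)

    # Entries H[0, 2] and H[1, 2] are close to zero at generic sampling
    # points, while the upper-left 2x2 block is not diagonal.
    print(hessian_log_density(kde, s[:, 0]))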

Unknown group structure—general ISA<br />

A serious drawback of k-ISA (and hence of ICA) lies in the fact that the requirement of a fixed group size $k$ does not allow us to apply this analysis to an arbitrary random vector. Indeed, theoretically speaking, it may only be applied to random vectors following the k-ISA blind source separation model, which means that they have to be mixtures of a random vector that consists of independent groups of size $k$. If this is the case, uniqueness up to permutation and scaling holds according to theorem 1.3.1. However, if k-ISA is applied to an arbitrary random vector, a decomposition into groups that are only 'as independent as possible' cannot be unique and depends on the contrast and the algorithm. In the literature, ICA is often applied to find representations fulfilling the independence condition only as well as possible; however, care has to be taken: the strong uniqueness result is no longer valid, and the results may depend on the algorithm, as illustrated in figure 1.8.
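A minimal sketch of this experiment follows, assuming scikit-learn's FastICA as a stand-in (JADE and Extended Infomax are not shipped with scikit-learn); the source construction and the amari_error helper are illustrative, the latter implementing the index displayed above.

    import numpy as np
    from sklearn.decomposition import FastICA

    def amari_error(A, A_hat):
        # Amari index of P = A_hat^{-1} A; zero iff A_hat equals A up to
        # permutation and scaling.
        P = np.abs(np.linalg.inv(A_hat) @ A)
        n = P.shape[0]
        rows = (P.sum(axis=1) / P.max(axis=1) - 1.0).sum()
        cols = (P.sum(axis=0) / P.max(axis=0) - 1.0).sum()
        return (rows + cols) / (2.0 * n)

    rng = np.random.default_rng(0)
    T = 10_000

    # s does not fulfill the ICA model: a 2-d irreducible component
    # (uniform on the unit circle) plus one independent 1-d component.
    phi = rng.uniform(0.0, 2.0 * np.pi, T)
    s = np.column_stack([np.cos(phi), np.sin(phi), rng.uniform(-1.0, 1.0, T)])

    A = rng.standard_normal((3, 3))   # random mixing matrix
    x = s @ A.T                       # observed mixtures x = As

    ica = FastICA(n_components=3, random_state=0)
    ica.fit(x)

    # The error stays well above zero: the mixing matrix cannot be
    # recovered, since the model assumptions are violated.
    print(amari_error(A, ica.mixing_))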

In contrast to ICA and k-ISA, we do not want to fix the size of the groups $S_i$ in advance.
