
Mathematics in Independent Component Analysis


1.3. Dependent component analysis

together with the transformation properties (1.6) of the source conditions yields

$$C_i({}^{t}S) = {}^{s}S^{\dagger\top}\, C_i(X)\, {}^{s}S^{\dagger} \quad\text{and}\quad C_i({}^{s}S) = {}^{t}S^{\dagger\top}\, C_i(X^{\top})\, {}^{t}S^{\dagger} \qquad (1.9)$$

because ${}^{*}m \geq n$ and hence ${}^{*}S\,{}^{*}S^{\dagger} = I$. By assumption the matrices $C_i({}^{*}S)$ are as diagonal as

possible. In order to separate the data, we had to find diagonalizers for both $C_i(X)$ and $C_i(X^{\top})$ such that they satisfy the spatiotemporal model (1.8). As the matrices derived from $X$ had to be diagonalized in terms of both columns and rows, we denoted this by double-sided approximate joint diagonalization.
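The pseudoinverse identity behind this step can be checked numerically. The sketch below uses a simplified one-sided mixing model $X = AS$ with a single tall mixing matrix $A$ in place of the two-sided spatiotemporal factorization, and the plain covariance as a stand-in source condition; all names and dimensions are illustrative, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simplified one-sided analogue of (1.9): for X = A S with A of full
# column rank (m >= n), the pseudoinverse satisfies A^+ A = I, so a
# congruence-type source condition C(X) = A C(S) A^T can be inverted
# as C(S) = A^+ C(X) A^{+T}.
m, n, T = 5, 3, 2000
A = rng.normal(size=(m, n))      # tall "mixing" matrix, m >= n
S = rng.normal(size=(n, T))      # toy source signals
X = A @ S

A_pinv = np.linalg.pinv(A)
print(np.allclose(A_pinv @ A, np.eye(n)))        # A^+ A = I

C_X = np.cov(X)                  # covariance as a simple source condition
C_S = A_pinv @ C_X @ A_pinv.T    # recover C(S) from C(X)
print(np.allclose(C_S, np.cov(S), atol=1e-8))
```

Both checks print `True`: because the sample covariance transforms exactly as a congruence, the inversion by the pseudoinverse is exact up to floating-point error.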

In Theis et al. (2007b, 2005a) we showed how to reduce this process to joint diagonalization. In order to get robust estimates of the source conditions, dimension reduction was essential. For this we considered the singular value decomposition of $X$, and formulated the algorithm in terms of the pseudo-orthogonal components of $X$. Of course, instead of using autocovariance matrices, other source conditions $C_i(\cdot)$ from table 1.1 can be employed in order to adapt to the separation problem at hand.
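In the noise-free, exactly diagonalizable case, joint diagonalization itself admits a short sketch: the eigenvectors of a generic linear combination of the given matrices already diagonalize the whole set. Real source conditions are only approximately diagonalizable and require Jacobi-type sweeps instead; the construction below, with invented matrices, is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 4, 5

# Symmetric matrices C_k = V D_k V^T sharing an orthogonal diagonalizer V,
# mimicking source-condition matrices in the noise-free case.
V, _ = np.linalg.qr(rng.normal(size=(n, n)))
Cs = [V @ np.diag(rng.normal(size=n)) @ V.T for _ in range(K)]

# Eigenvectors of a random linear combination recover V (up to sign
# and permutation) and hence jointly diagonalize every C_k.
M = sum(rng.normal() * C for C in Cs)
_, V_hat = np.linalg.eigh(M)

for C in Cs:
    D = V_hat.T @ C @ V_hat
    off = D - np.diag(np.diag(D))
    print(np.abs(off).max() < 1e-8)   # off-diagonal part vanishes
```

The trick works because the random combination almost surely has distinct eigenvalues, pinning its eigenbasis to the shared diagonalizer.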

We present an application of the spatiotemporal BSS algorithm to fMRI data using multidimensional autocovariances in section 1.6.1.

1.3.3 Independent subspace analysis

Another extension of the simple source separation model lies in extracting groups of sources that are independent of each other, but not necessarily within each group. Multidimensional independent component analysis or independent subspace analysis (ISA) denotes the task of transforming a multivariate observed sensor signal such that groups of the transformed signal components are mutually independent; dependencies within the groups, however, are still allowed. This weakens the sometimes too strict assumption of independence in ICA, and has potential applications in various fields such as ECG, fMRI analysis, or convolutive ICA.

Recently we were able to calculate the indeterminacies of group ICA for known and unknown group structure, which finally enabled us to guarantee successful application of group ICA to BSS problems. Here, we will briefly review the identifiability result as well as the resulting algorithm for separating signals into groups of dependent signals. As before, the algorithm is based on joint (block) diagonalization of sets of matrices generated using one or multiple source conditions.
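The objective behind joint block diagonalization can be made concrete with a toy off-block-energy measure. The group sizes, the orthogonal mixing `W`, and the matrices below are all invented for illustration; they only show that the true demixing jointly drives the energy outside the prescribed diagonal blocks to zero.

```python
import numpy as np

rng = np.random.default_rng(2)
sizes = [2, 2]                   # two groups of size k = 2 (toy k-ISA case)
n = sum(sizes)

def offblock_energy(M, sizes):
    """Sum of squared entries outside the prescribed diagonal blocks."""
    mask = np.ones(M.shape, dtype=bool)
    start = 0
    for s in sizes:
        mask[start:start + s, start:start + s] = False
        start += s
    return float((M[mask] ** 2).sum())

def random_blockdiag(sizes):
    """Symmetric matrix that is block-diagonal w.r.t. the given groups."""
    B = np.zeros((sum(sizes), sum(sizes)))
    start = 0
    for s in sizes:
        blk = rng.normal(size=(s, s))
        B[start:start + s, start:start + s] = blk + blk.T
        start += s
    return B

# Source-condition stand-ins: block-diagonal in the source basis,
# observed after an orthogonal mixing W.
W, _ = np.linalg.qr(rng.normal(size=(n, n)))
Cs = [W @ random_blockdiag(sizes) @ W.T for _ in range(4)]

# The demixing W^T restores block-diagonality jointly for all matrices.
err = sum(offblock_energy(W.T @ C @ W, sizes) for C in Cs)
print(err < 1e-12)
```

An actual joint block diagonalization algorithm would minimize this off-block energy over orthogonal matrices rather than being handed `W`.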

Generalizations of the ICA model that include dependencies among multiple one-dimensional components have been studied for quite some time. ISA in the terminology of multidimensional ICA was first introduced by Cardoso (1998) using geometrical motivations. His model, as well as the related but independently proposed factorization of multivariate function classes (Lin, 1998), is quite general; however, no identifiability results were presented, and applicability to an arbitrary random vector remained unclear. Later, in the special case of equal group sizes k, in the following denoted as k-ISA, uniqueness results were extended from the ICA theory (Theis, 2004b). Algorithmic enhancements in this setting have recently been studied by Poczos and Lörincz (2005). Similar to Cardoso (1998), Akaho et al. (1999) also proposed to employ a multidimensional-component maximum likelihood algorithm, however in the slightly different context of multimodal component analysis. Moreover, if the observations contain additional

