Mathematics in Independent Component Analysis

Chapter 3. F.J. Theis, Signal Processing, 84(5):951–956, 2004

assume that each Aj and Bj is invertible. Applying Theorem 3.2 shows the corollary.

From this, a complex version of the Skitovitch–Darmois theorem can easily be derived:

Corollary 3.4 (Complex S–D theorem). Let L1 = α1X1 + ··· + αnXn and L2 = β1X1 + ··· + βnXn, with X1, ..., Xn independent complex random variables and αj, βj ∈ C for j = 1, ..., n. If L1 and L2 are independent, then all Xj with αjβj ≠ 0 are Gaussian.

Here, a complex random variable is said to be Gaussian if both its real and imaginary parts are Gaussian.

Proof. We can interpret the n independent complex random variables Xi as n two-dimensional real random vectors that are mutually independent. Multiplication by the complex number αj is either (αj ≠ 0) multiplication by the real invertible matrix

    ( Re(αj)  −Im(αj) )
    ( Im(αj)   Re(αj) )

or (αj = 0) multiplication by the zero matrix; similarly for βj. Applying Corollary 3.3 finishes the proof.
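The identification used in the proof — multiplying a complex number by αj is the same as applying the 2×2 real matrix above to its real and imaginary parts — can be checked numerically. A minimal sketch assuming NumPy; the helper name `complex_to_real_matrix` is ours, not from the paper:

```python
import numpy as np

def complex_to_real_matrix(a):
    """2x2 real matrix representing multiplication by the complex number a
    (hypothetical helper name, introduced here for illustration)."""
    return np.array([[a.real, -a.imag],
                     [a.imag,  a.real]])

a = 2.0 + 3.0j
x = 1.0 - 4.0j

# The complex product a*x ...
prod = a * x

# ... equals the real matrix acting on the vector (Re x, Im x):
vec = complex_to_real_matrix(a) @ np.array([x.real, x.imag])

assert np.allclose(vec, [prod.real, prod.imag])
```

This is the standard embedding of C into the 2×2 real matrices, which is what lets the real-vector version of the theorem be applied componentwise.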

4. Indeterminacies of complex ICA

Given a complex n-dimensional random vector X, a matrix W ∈ Gl(n, C) is called a (complex) ICA of X if WX is independent (as a complex random vector). We will show that W and V are complex ICAs of X if and only if W⁻¹ ∼ V⁻¹, that is, if they differ by right multiplication by a complex scaling and permutation matrix. This is equivalent to calculating the indeterminacies of the complex BSS model:

Consider the noiseless complex linear instantaneous blind source separation (BSS) model with as many sources as sensors,

X = AS.   (1)

Here S is an independent complex-valued n-dimensional random vector and A ∈ Gl(n, C) an invertible complex matrix.

The task of linear BSS is to find A and S given only X. An obvious indeterminacy of this problem is


that A can be found only up to equivalence, because for a scaling matrix L and permutation matrix P,

X = ALP P⁻¹L⁻¹S

and P⁻¹L⁻¹S is also independent. We will show that under mild assumptions on S there are no further indeterminacies of complex BSS.
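The scaling–permutation indeterminacy can be illustrated numerically. A minimal sketch assuming NumPy, with arbitrary example matrices of our own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 3, 1000

# Independent complex sources (uniform real and imaginary parts)
# and a random invertible complex mixing matrix.
S = rng.uniform(-1, 1, (n, T)) + 1j * rng.uniform(-1, 1, (n, T))
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
X = A @ S

# An arbitrary complex scaling matrix L and permutation matrix P.
L = np.diag([2.0 + 1.0j, 0.5j, -1.0])
P = np.eye(n)[[2, 0, 1]]

# X = (ALP)(P^{-1} L^{-1} S): the alternative mixing matrix ALP together
# with the (still independent) sources P^{-1} L^{-1} S reproduces X
# exactly, so A is identifiable at best up to scaling and permutation.
A2 = A @ L @ P
S2 = np.linalg.inv(P) @ np.linalg.inv(L) @ S
assert np.allclose(A2 @ S2, X)
```

Scaling and permuting independent components preserves independence, which is why no algorithm observing only X can distinguish (A, S) from (ALP, P⁻¹L⁻¹S).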

Various algorithms for solving the complex BSS problem have been proposed [1,2,7,13,16]. We want to note that many cases in which complex BSS is applied can in fact be reduced to using real BSS algorithms. This is the case if either the sources or the mixing matrix are real. The latter occurs, for example, after Fourier transformation of signals with time structure.

If the sources are real, then the above complex model can be split up into two separate real BSS problems:

Re(X) = Re(A)S,
Im(X) = Im(A)S.

Solving both of these real BSS equations yields A = Re(A) + i Im(A). Of course, Re(A) and Im(A) can only be found up to scaling and permutation. By comparing the two recovered source random vectors (using, for example, the mutual information of one component of each vector), we can however assume that the permutation, and then also the scaling, indeterminacy of both recovered matrices is the same, which allows the algorithm to correctly put A back together. Similarly, separability of this special complex ICA problem can also be derived from the well-known separability results in the real case.
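The real-sources split can be verified directly; a minimal sketch assuming NumPy (the mutual-information matching of the two recovered decompositions is omitted):

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 3, 5

S = rng.uniform(-1, 1, (n, T))                          # real sources
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))  # complex mixing
X = A @ S                                               # complex mixtures

# The complex model X = AS splits into two real BSS problems sharing
# the same sources S:
assert np.allclose(X.real, A.real @ S)
assert np.allclose(X.imag, A.imag @ S)
```

Each half is an ordinary real BSS problem; recombining the two recovered matrices as Re(A) + i Im(A) then requires aligning their permutations and scalings as described above.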

If the mixing matrix is known to be real, then again splitting up Eq. (1) into real and imaginary parts yields

Re(X) = A Re(S),
Im(X) = A Im(S).

A can be found from either equation. If both real and imaginary samples are to be used in order to increase precision, they can simply be concatenated to generate a twice-as-large sample set mixed by the same mixing matrix A. In terms of random vectors, this means working in two disjoint copies of the original probability space. Again separability follows.
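The real-mixing-matrix case and the sample concatenation can be sketched as follows, assuming NumPy and arbitrary example data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, T = 3, 1000

S = rng.uniform(-1, 1, (n, T)) + 1j * rng.uniform(-1, 1, (n, T))
A = rng.normal(size=(n, n))            # real mixing matrix
X = A @ S

# Both parts obey the same real mixing model with the same A ...
assert np.allclose(X.real, A @ S.real)
assert np.allclose(X.imag, A @ S.imag)

# ... so real and imaginary samples can be concatenated into a single
# real data set of twice the length, mixed by the same A:
S_cat = np.hstack([S.real, S.imag])
X_cat = np.hstack([X.real, X.imag])
assert np.allclose(X_cat, A @ S_cat)
```

The concatenated set is what a real BSS algorithm would then be run on; working in two disjoint copies of the probability space is the random-vector formulation of this concatenation.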
