
234 Chapter 16. Proc. ICA 2006, pages 917-925

for all $x_N \in \mathbb{R}$, because $A$ is invertible.

Now $a_{NN} \neq 0$; otherwise $X_N = a_{NG} S_G$, which contradicts (ii) for $X$. If also $a_{NG} \neq 0$, then by equation (3), $h''_N$ is constant and therefore $S_N$ is Gaussian, which again contradicts (ii), now for $S$. Hence $a_{NG} = 0$. By (3), $a_{GN} a_{GG} = 0$, and again $a_{GG} \neq 0$; otherwise $X_G = a_{GN} S_N$, contradicting (ii) for $S$. Hence also $a_{GN} = 0$, as was to be shown.

General proof. In order to give an idea of the main proof without getting lost in details, we have divided it up into a sequence of lemmas; these will not be proven due to lack of space. The characteristic function of the random vector $X$ is defined by $\hat{X}(x) := E(\exp(i x^\top X))$, and since $X$ is assumed to have existing covariance, $\hat{X}$ is twice continuously differentiable. Moreover, by definition $\widehat{AS}(x) = \hat{S}(A^\top x)$, and the characteristic function of an independent random vector factorizes into the component characteristic functions. So instead of using $p_X$ as in the two-dimensional example, we use $\hat{X}$, which has similar properties, except that its range is now complex and the differentiability condition can be considerably relaxed.
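The two properties of the characteristic function used here, $\widehat{AS}(x) = \hat{S}(A^\top x)$ and factorization over independent components, can be checked empirically. The following Monte Carlo sketch (our illustration, not part of the original proof; the sources and mixing matrix are arbitrary choices) estimates both sides from samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Independent, non-Gaussian sources: S_1 uniform, S_2 Laplacian.
n_samples = 200_000
S = np.vstack([rng.uniform(-1, 1, n_samples),
               rng.laplace(0, 1, n_samples)])
A = np.array([[2.0, 1.0],
              [0.5, 1.5]])
X = A @ S  # mixed observations X = AS

def char_fn(samples, x):
    """Empirical characteristic function E[exp(i x^T samples)]."""
    return np.mean(np.exp(1j * x @ samples))

x = np.array([0.3, -0.7])

# Property 1: phi_{AS}(x) = phi_S(A^T x); exact here, since
# exp(i x^T A S) = exp(i (A^T x)^T S) holds sample by sample.
lhs = char_fn(X, x)
rhs = char_fn(S, A.T @ x)
print(abs(lhs - rhs))  # zero up to floating-point rounding

# Property 2: independence => phi_S factorizes into component parts
# (holds only up to Monte Carlo error).
y = A.T @ x
prod = char_fn(S[:1], y[:1]) * char_fn(S[1:], y[1:])
print(abs(char_fn(S, y) - prod))
```

The first identity is purely algebraic, which is why it holds to machine precision even on finite samples; the factorization relies on independence and is only approximate for finitely many draws.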

We will need the following lemma, which has essentially been shown in [9]; here $\nabla f$ denotes the gradient of $f$ and $Hf$ its Hessian.

Lemma 1. Let $X \in L^2(\Omega, \mathbb{R}^m)$ be a random vector. Then $X$ is Gaussian with covariance $2C$ if and only if it satisfies $\hat{X}\, H\hat{X} - \nabla\hat{X}\, (\nabla\hat{X})^\top + C \hat{X}^2 \equiv 0$.
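In one dimension, the structure of this identity can be checked symbolically: for a centered Gaussian with variance $\sigma^2$, the characteristic function is $\varphi(x) = \exp(-\sigma^2 x^2/2)$, and $\varphi\varphi'' - (\varphi')^2 + \sigma^2\varphi^2$ vanishes identically. (This sanity check is ours; under the plain convention used below, the constant multiplying $\varphi^2$ equals the variance itself, and the exact constant bookkeeping depends on the normalization chosen in the lemma.) A sympy sketch:

```python
import sympy as sp

x, sigma = sp.symbols('x sigma', positive=True)

# Characteristic function of a centered Gaussian with variance sigma^2.
phi = sp.exp(-sigma**2 * x**2 / 2)

# One-dimensional form of the identity:
# phi * phi'' - (phi')^2 + c * phi^2, with c = sigma^2 in this convention.
expr = phi * sp.diff(phi, x, 2) - sp.diff(phi, x)**2 + sigma**2 * phi**2
print(sp.simplify(expr))  # 0
```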

Note that we may assume that the covariance of $S$ (and hence also of $X$) is positive definite: otherwise, while still keeping the model, we can simply remove the subspace of deterministic components (i.e. components of variance 0), which have to be mapped onto each other by $A$. Hence we may even assume $\mathrm{Cov}(S_G) = I$ after whitening, as described in section 1.1. This uses the fact that the basis within the Gaussian subspace is not unique. The same holds for the non-Gaussian subspace, so we may choose any $B_N \in \mathrm{Gl}(n)$ and $B_G \in \mathrm{O}(d-n)$ to get

$$X = \begin{pmatrix} A_{NN} B_N \\ A_{GN} B_N \end{pmatrix} (B_N^{-1} S_N) + \begin{pmatrix} A_{NG} B_G \\ A_{GG} B_G \end{pmatrix} (B_G^\top S_G). \qquad (4)$$

Here only orthogonal matrices $B_G$ are allowed so that $B_G^\top S_G$ stays decorrelated, given that $S_G$ is decorrelated: $\mathrm{Cov}(B_G^\top S_G) = B_G^\top \mathrm{Cov}(S_G)\, B_G = B_G^\top B_G = I$.
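Equation (4) merely re-expresses $X = AS$ in new bases for the two source subspaces: any invertible $B_N$ and orthogonal $B_G$ leave $X$ unchanged while keeping the reparametrized Gaussian part white. A quick numerical sketch of this invariance (the dimensions and random matrices are our own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 2, 5  # non-Gaussian dimension n, total dimension d

# Blocks of the mixing matrix A, following the partition in the text.
A_NN, A_NG = rng.normal(size=(n, n)), rng.normal(size=(n, d - n))
A_GN, A_GG = rng.normal(size=(d - n, n)), rng.normal(size=(d - n, d - n))

S_N = rng.uniform(-1, 1, size=(n, 1000))  # non-Gaussian sources
S_G = rng.normal(size=(d - n, 1000))      # Gaussian sources, Cov ~ I

X = np.vstack([A_NN @ S_N + A_NG @ S_G,
               A_GN @ S_N + A_GG @ S_G])

# Change of basis: any invertible B_N, and an orthogonal B_G (via QR).
B_N = rng.normal(size=(n, n))
B_G, _ = np.linalg.qr(rng.normal(size=(d - n, d - n)))

# Right-hand side of equation (4) in the new bases.
X2 = (np.vstack([A_NN @ B_N, A_GN @ B_N]) @ (np.linalg.inv(B_N) @ S_N)
      + np.vstack([A_NG @ B_G, A_GG @ B_G]) @ (B_G.T @ S_G))

print(np.allclose(X, X2))                        # X is unchanged
print(np.allclose(B_G.T @ B_G, np.eye(d - n)))   # B_G is orthogonal
```

Because $B_G$ comes from a QR decomposition it is orthogonal, so $B_G^\top S_G$ has the same identity covariance as $S_G$; an arbitrary invertible $B_G$ would break this.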

The next lemma uses the dimension reduction model for $X$ and $S$ to derive an explicit differential equation for $\hat{S}_N$. The Gaussian part $\hat{S}_G$ in the following lemma vanishes after application of Lemma 1.

Lemma 2. For any basis $B_N \in \mathrm{Gl}(n)$, the non-Gaussian source characteristic function $\hat{S}_N \in C^2(\mathbb{R}^n, \mathbb{C})$ fulfills

$$A_{NN} B_N \left( \hat{S}_N\, H\hat{S}_N - \nabla\hat{S}_N\, (\nabla\hat{S}_N)^\top \right) B_N^\top A_{GN}^\top + 2 A_{NG} A_{GG}^\top\, \hat{S}_N^2 \equiv 0. \qquad (5)$$

Lemma 3. Let $(A_{NN}, A_{NG}) \in \mathrm{Mat}(n \times (n + (d-n)))$ be an arbitrary full rank matrix. If $\operatorname{rank} A_{NN} < n$, then we may choose coordinates $B_N \in \mathrm{Gl}(n)$, $B_G \in \mathrm{O}(d-n)$ and $M \in \mathrm{Gl}(n)$ such that for arbitrary matrices $* \in \mathrm{Mat}((n-1) \times (n-1))$, $*' \in \mathrm{Mat}((n-1) \times (d-n-1))$:

$$M A_{NN} B_N = \begin{pmatrix} 0 & 0 \\ 0 & * \end{pmatrix} \quad \text{and} \quad M A_{NG} B_G = \begin{pmatrix} 1 & 0 \\ 0 & *' \end{pmatrix}.$$
