Mathematics in Independent Component Analysis


Chapter 2. Neural Computation 16:1827–1850, 2004

LETTER Communicated by Aapo Hyvärinen

A New Concept for Separability Problems in Blind Source Separation

Fabian J. Theis
fabian@theis.name
Institute of Biophysics, University of Regensburg, 93040 Regensburg, Germany

The goal of blind source separation (BSS) lies in recovering the original independent sources of a mixed random vector without knowing the mixing structure. A key ingredient for performing BSS successfully is to know the indeterminacies of the problem, that is, to know how the separating model relates to the original mixing model (separability). For linear BSS, Comon (1994) showed using the Darmois-Skitovitch theorem that the linear mixing matrix can be found except for permutation and scaling. In this work, a much simpler, direct proof for linear separability is given. The idea is based on the fact that a random vector is independent if and only if the Hessian of its logarithmic density (resp. characteristic function) is diagonal everywhere. This property is then exploited to propose a new algorithm for performing BSS. Furthermore, first ideas of how to generalize separability results based on Hessian diagonalization to more complicated nonlinear models are studied in the setting of postnonlinear BSS.
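A minimal sketch of the Hessian criterion, using the Gaussian case purely as an illustration: for a centered Gaussian vector $x \sim \mathcal{N}(0, \Sigma)$ with positive definite covariance $\Sigma$,

\[
\log p(x) = -\tfrac{1}{2}\, x^{\top} \Sigma^{-1} x + \mathrm{const}, \qquad H_{\log p}(x) = -\Sigma^{-1},
\]

so the Hessian of the logarithmic density is diagonal everywhere exactly when $\Sigma^{-1}$ (equivalently $\Sigma$) is diagonal, which for Gaussian vectors is the same as independence of the components.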

1 Introduction

In independent component analysis (ICA), one tries to find statistically independent data within a given random vector. An application of ICA lies in blind source separation (BSS), where it is furthermore assumed that the given vector has been mixed using a fixed set of independent sources. The advantage of applying ICA algorithms to BSS problems, in contrast to correlation-based algorithms, is that ICA tries to make the output signals as independent as possible by also including higher-order statistics.
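The role of higher-order statistics can be sketched numerically (a toy example, not from the letter): two signals can be perfectly decorrelated yet strongly dependent, so a second-order method sees nothing to separate while a higher-order moment exposes the dependence.

```python
import numpy as np

# Toy illustration (assumed example): decorrelation is weaker than
# independence. Here y is a deterministic function of x, yet the two
# signals are uncorrelated; a fourth-order cross-moment reveals the link.
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = x**2 - 1  # zero mean, and E[x*y] = E[x^3] - E[x] = 0

corr_xy = np.corrcoef(x, y)[0, 1]  # second-order statistic: near zero
cross_moment = np.mean(x**2 * y)   # higher-order statistic: E[x^2 y] = 2 in theory

print(f"corr(x, y) = {corr_xy:.3f}")
print(f"E[x^2 * y] = {cross_moment:.3f}")
```

A purely correlation-based algorithm would treat x and y as already separated; a contrast that includes higher-order information, as ICA algorithms do, detects the dependence.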

Since the introduction of independent component analysis by Hérault and Jutten (1986), various algorithms have been proposed to solve the BSS problem (Comon, 1994; Bell & Sejnowski, 1995; Hyvärinen & Oja, 1997; Theis, Jung, Puntonet, & Lang, 2002). Good textbook-level introductions to ICA are given in Hyvärinen, Karhunen, and Oja (2001) and Cichocki and Amari (2002).

Separability of linear BSS states that under weak conditions on the sources, the mixing matrix is determined uniquely by the mixtures except for permutation and scaling, as shown by Comon (1994) using the Darmois-Skitovitch theorem. We propose a direct proof based on the concept of

Neural Computation 16, 1827–1850 (2004) © 2004 Massachusetts Institute of Technology
