
Chapter 12. LNCS 3195:718-725, 2004

Postnonlinear overcomplete blind source separation using sparse sources

Fabian J. Theis 1,2 and Shun-ichi Amari 1

1 Brain Science Institute, RIKEN
2-1, Hirosawa, Wako-shi, Saitama, 351-0198, Japan
2 Institute of Biophysics, University of Regensburg
D-93040 Regensburg, Germany
fabian@theis.name, amari@brain.riken.go.jp

Abstract. We present an approach for blindly decomposing an observed random vector x into f(As), where f is a diagonal function, i.e. f = f_1 × ... × f_m with one-dimensional functions f_i, and A an m × n matrix. This postnonlinear model is allowed to be overcomplete, which means that fewer observations than sources (m < n) are given. In contrast to Independent Component Analysis (ICA), we do not assume the sources s to be independent but to be sparse in the sense that at each time instant they have at most m − 1 non-zero components (Sparse Component Analysis or SCA). Identifiability of the model is shown, and an algorithm for model and source recovery is proposed. It first detects the postnonlinearities in each component, and then identifies the now linearized model using previous results.
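To make the mixing model concrete, the following is a minimal Python sketch of data generated according to x = f(As). The dimensions (m = 2, n = 3), the Laplacian source amplitudes, and the componentwise tanh nonlinearity are illustrative assumptions made here, not choices taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: n = 3 sources, m = 2 observations,
# so the mixture is overcomplete (m < n).
n, m, T = 3, 2, 1000

# (m-1)-sparse sources: at each time instant at most m - 1 = 1 component is non-zero.
s = np.zeros((n, T))
active = rng.integers(0, n, size=T)           # index of the single active source per sample
s[active, np.arange(T)] = rng.laplace(size=T)  # assumed Laplacian amplitudes

# Linear overcomplete mixing matrix A (m x n), drawn at random for this sketch.
A = rng.standard_normal((m, n))

# Diagonal postnonlinearity f = f_1 x ... x f_m, here taken as tanh in each component.
def f(y):
    return np.tanh(y)

# Observed postnonlinear overcomplete mixture x = f(A s).
x = f(A @ s)
print(x.shape)  # (2, 1000)
```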

Blind source separation (BSS) based on ICA is a rapidly growing field (see for instance [1,2] and references therein), but most algorithms deal only with the case of at least as many observations as sources. However, there is an increasing interest in (linear) overcomplete ICA [3-5], where matrix identifiability is known [6], but source identifiability does not hold. In order to approximately recover the sources [7], additional assumptions have to be made, usually sparsity of the sources.

Recently, we have proposed a model based only upon the sparsity assumption (summarized in section 1) [8]. In this case, given sufficiently high sparsity, identifiability of both the matrix and the sources can be shown. Here, we extend these results to postnonlinear mixtures (section 2); these describe a model that often occurs in real situations, when the mixture is in principle linear but the sensors introduce an additional nonlinearity during the recording [9]. Section 3 presents an algorithm for identifying such models, and section 4 finishes with an illustrative simulation.

1 Linear overcomplete SCA

Definition 1. A vector v ∈ R^n is said to be k-sparse if v has at most k non-zero entries.
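As a reading aid, Definition 1 translates directly into a small Python check; the function name below is my own and not part of the paper.

```python
import numpy as np

def is_k_sparse(v, k):
    """Return True if the vector v has at most k non-zero entries (Definition 1)."""
    return np.count_nonzero(v) <= k

# Example: for m = 2 observations the sources are required to be (m - 1) = 1-sparse.
print(is_k_sparse(np.array([0.0, 1.3, 0.0]), 1))  # True: one non-zero entry
print(is_k_sparse(np.array([0.5, 1.3, 0.0]), 1))  # False: two non-zero entries
```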
