
124 Chapter 7. Proc. ISCAS 2005, pages 5878-5881

Blind signal separation into groups of dependent signals using joint block diagonalization

Fabian J. Theis
Institute of Biophysics, University of Regensburg
93040 Regensburg, Germany, Email: fabian@theis.name

Abstract— Multidimensional or group independent component analysis describes the task of transforming a multivariate observed sensor signal such that groups of the transformed signal components are mutually independent; dependencies within the groups, however, are still allowed. This generalization of independent component analysis (ICA) weakens the sometimes overly strict independence assumption of ICA. It has potential applications in various fields such as ECG, fMRI analysis, and convolutive ICA. Recently we were able to calculate the indeterminacies of group ICA, which finally puts its application to blind source separation (BSS) problems on a solid theoretical footing. In this paper we introduce and discuss various algorithms for separating signals into groups of dependent signals. The algorithms are based on joint block diagonalization of sets of matrices generated using several signal structures.

I. INTRODUCTION

In this work, we discuss multidimensional blind source separation (MBSS), i.e., the recovery of underlying sources s from an observed mixture x. As usual, s has to fulfill additional properties such as independence or diagonality of the autocovariances (if s possesses time structure). However, in contrast to ordinary BSS, MBSS is more general, as some source signals are allowed to possess common statistics. One possible solution for MBSS is multidimensional independent component analysis (MICA); in section IV we will discuss other such conditions. The idea of MICA is that we do not require full independence of the transform y := Wx but only mutual independence of certain tuples yi1, ..., yi2. If the size of all tuples is restricted to one, this reduces to ordinary ICA. In general, of course, the tuples could have different sizes, but for the sake of simplicity we assume that they all have the same length k.

Multidimensional ICA was first introduced by Cardoso [1] using geometrical motivations. Hyvärinen and Hoyer then presented a special case of multidimensional ICA which they called independent subspace analysis [2]; there the dependence within a k-tuple is explicitly modelled, enabling the authors to propose better algorithms without having to resort to problematic multidimensional density estimation.

II. JOINT BLOCK DIAGONALIZATION

Joint diagonalization has become an important tool in ICA-based BSS (used for example in JADE [3]) and in BSS relying on second-order time-decorrelation (for example in SOBI [4]). The task of (real) joint diagonalization is, given a set of commuting symmetric n × n matrices Mi, to find an orthogonal matrix E such that E⊤MiE is diagonal for all i. In the following we will use a generalization of this technique as an algorithm to solve MBSS problems. Instead of fully diagonalizing Mi, in joint block diagonalization (JBD) we want to determine E such that E⊤MiE is block-diagonal (after fixing the block structure).

Introducing some notation, let us define for r, s = 1, ..., n the (r, s) sub-k-matrix of W = (wij), denoted by W^(k)_rs, to be the k × k submatrix of W ending at position (rk, sk). Denote by Gl(n) the group of invertible n × n matrices. A matrix W ∈ Gl(nk) is said to be a k-scaling matrix if W^(k)_rs = 0 for r ≠ s, and W is called a k-permutation matrix if for each r = 1, ..., n there exists precisely one s such that W^(k)_rs equals the k × k unit matrix.

Hence, fixing the block size to k, JBD tries to find E such that E⊤MiE is a k-scaling matrix. In practice, due to estimation errors, such an E will not exist, so we speak of approximate JBD and imply minimizing some error measure of non-block-diagonality.

Various algorithms to actually perform JBD have been proposed; see [5] and references therein. In the following we will simply perform joint diagonalization (using for example the Jacobi-like algorithm from [6]) and then permute the columns of E to achieve block-diagonality — in experiments this turns out to be an efficient solution to JBD [5].

III. MULTIDIMENSIONAL ICA (MICA)

Let k, n ∈ N. We call an nk-dimensional random vector y k-independent if the k-dimensional random vectors (y1, ..., yk)⊤, ..., (ynk−k+1, ..., ynk)⊤ are mutually independent.
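This definition can be illustrated at second order (a toy sketch; the data construction is ours, not from the paper): a k-independent vector has block-diagonal covariance, since dependence may appear within a tuple while the cross-tuple covariance blocks vanish. Note that vanishing cross-covariance is only a necessary condition for k-independence.

```python
import numpy as np

# Toy illustration of k-independence (illustrative construction): n = 2
# tuples of size k = 2. Components within a tuple share a latent factor,
# so they are dependent; the tuples themselves are independent.
rng = np.random.default_rng(0)
n, k, T = 2, 2, 10000
z = rng.standard_normal((T, n))                      # one factor per tuple
s = np.repeat(z, k, axis=1) + 0.1 * rng.standard_normal((T, n * k))
C = np.cov(s.T)
# Cross-tuple covariance blocks vanish (necessary for k-independence),
# while within-tuple correlations remain strong.
assert np.all(np.abs(C[:k, k:]) < 0.05)
assert C[0, 1] > 0.5
```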

A matrix W ∈ Gl(nk) is called a k-multidimensional ICA of an nk-dimensional random vector x if Wx is k-independent. If k = 1, this is the same as ordinary ICA.

Using MICA we want to solve the (noiseless) linear MBSS problem x = As, where the nk-dimensional random vector x is given, and A ∈ Gl(nk) and s are unknown. In the case of MICA, s is assumed to be k-independent.
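A minimal numerical sketch of this model (toy data; the mixing matrix and group structure are illustrative): mix a 2-independent source with a random invertible A and check that the oracle unmixing W = A⁻¹ recovers the sources, and with them the block structure.

```python
import numpy as np

# Toy instance of the noiseless linear MBSS model x = A s (illustrative
# data): s is 2-independent with n = 2 tuples of size k = 2.
rng = np.random.default_rng(1)
n, k, T = 2, 2, 5000
z = rng.standard_normal((T, n))
s = np.repeat(z, k, axis=1) + 0.1 * rng.standard_normal((T, n * k))
A = rng.standard_normal((n * k, n * k))              # unknown mixing matrix
x = s @ A.T                                          # observed mixture
# With the oracle unmixing matrix W = A^{-1}, y = W x recovers s exactly,
# so y is in particular k-independent again.
W = np.linalg.inv(A)
y = x @ W.T
assert np.allclose(y, s)
```

In practice A is unknown, which is exactly why the JBD machinery of Section II is needed to estimate W from the data alone.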

A. Indeterminacies

Obvious indeterminacies are, similar to ordinary ICA, invertible transforms in Gl(k) within each tuple, as well as the fact that the order of the independent k-tuples is not fixed. Indeed, if A is an MBSS solution, then so is ALP with a k-scaling matrix L and a k-permutation matrix P, because independence is invariant under such transforms.
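This indeterminacy can be checked numerically (a sketch; block sizes and matrices are illustrative): the product LP of a k-scaling matrix L and a k-permutation matrix P has exactly one nonzero k × k block per block-row, so (LP)⁻¹s is s with its tuples reordered and linearly transformed inside each tuple, which leaves k-independence intact.

```python
import numpy as np

# Numerical sketch of the MICA indeterminacy (all matrices illustrative):
# n = 2 tuples of size k = 2.
rng = np.random.default_rng(2)
n, k = 2, 2
# k-scaling matrix L: invertible k x k blocks on the diagonal, zero elsewhere.
L = np.zeros((n * k, n * k))
for r in range(n):
    L[r*k:(r+1)*k, r*k:(r+1)*k] = rng.standard_normal((k, k)) + 2 * np.eye(k)
# k-permutation matrix P: swaps the two k-tuples.
P = np.zeros((n * k, n * k))
P[0:k, k:2*k] = np.eye(k)
P[k:2*k, 0:k] = np.eye(k)
# (LP)^{-1} again has exactly one nonzero k x k block per block-row, so it
# only reorders tuples and transforms within them: k-independence survives.
LP = L @ P
LP_inv = np.linalg.inv(LP)
for r in range(n):
    nonzero = [np.abs(LP_inv[r*k:(r+1)*k, c*k:(c+1)*k]).max() > 1e-8
               for c in range(n)]
    assert sum(nonzero) == 1
```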
