Mathematics in Independent Component Analysis

144 Chapter 9. Neurocomputing (in press), 2007

So both source separation models can be interpreted as matrix factorization problems; in the temporal case, restrictions such as diagonal autocorrelations are determined by the second factor, in the spatial case by the first one. In order to achieve a spatiotemporal model, we require these conditions of both factors at the same time. In other words, instead of recovering a single source data set that fulfills the source conditions spatiotemporally, we try to find two source matrices, a spatial and a temporal one, and impose the conditions on each matrix separately. So the spatiotemporal BSS model can be derived from equation (2) as the factorization problem

$$X = {}^{s}S^{\top}\,{}^{t}S \qquad (3)$$

with spatial source matrix ${}^{s}S$ and temporal source matrix ${}^{t}S$, both of which have (multidimensional) autocorrelations that are as diagonal as possible. Diagonality of the autocorrelations is invariant under scaling and permutation, so the above model contains these indeterminacies; indeed, the spatial and temporal sources can interchange scaling ($L$) and permutation ($P$) matrices, ${}^{s}S^{\top}\,{}^{t}S = (L^{-1}P^{-1}\,{}^{s}S)^{\top}(LP\,{}^{t}S)$, and the model assumptions still hold. The spatiotemporal BSS problem as defined in equation (3) was implicitly proposed in [9], equation (5), in combination with a dimension reduction scheme. Here, we first operate on the general model and derive the cost function based on autodecorrelation, and only later combine this with a dimension reduction method.
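The factorization model (3) and its scaling/permutation indeterminacy can be sketched in a few lines of numpy. The dimensions, variable names, and the generic invertible exchange matrix $M = LP$ below are our own illustrative choices, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, T = 3, 5, 200                 # sources, spatial dim, temporal dim
sS = rng.standard_normal((n, m))    # spatial source matrix  (rows = sources)
tS = rng.standard_normal((n, T))    # temporal source matrix (rows = sources)

# Spatiotemporal model of equation (3): X is the (m x T) data matrix.
X = sS.T @ tS

# Indeterminacy: pushing an invertible M = L P from one factor to the
# other leaves X unchanged, since (M^{-T} sS)^T (M tS) = sS^T M^{-1} M tS.
L = np.diag(rng.uniform(0.5, 2.0, n))    # scaling
P = np.eye(n)[rng.permutation(n)]        # permutation
M = L @ P
X2 = (np.linalg.inv(M).T @ sS).T @ (M @ tS)

assert np.allclose(X, X2)
```

The rescaled and permuted factors still have autocorrelations exactly as diagonal as before, which is why the model cannot distinguish them from the originals.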

5 Algorithmic spatiotemporal BSS<br />

Stone et al. [9] first proposed the model from equation (3), where a joint energy function based on mutual entropy and Infomax is employed. Apart from the many parameters used in the algorithm, the involved gradient descent optimization is susceptible to noise, local minima and inappropriate initializations, so in the following we propose a novel, more robust algebraic approach. It is based on the joint diagonalization of source conditions posed not only temporally but also spatially at the same time.
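The contrast with gradient descent can be seen in a toy setting: when two symmetric condition matrices share an exact common diagonalizer, that diagonalizer is recovered in closed form from a single eigendecomposition, with no step sizes, initialization, or local minima. This is only a two-matrix sketch of the algebraic idea, not the algorithm developed below; the matrices `A`, `C1`, `C2` are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Synthetic setting: two symmetric matrices C1, C2 sharing an exact
# common diagonalizer A (standing in for two condition matrices).
A = rng.standard_normal((n, n))
D1 = np.diag(rng.uniform(1.0, 2.0, n))
D2 = np.diag(rng.uniform(1.0, 2.0, n))
C1 = A @ D1 @ A.T
C2 = A @ D2 @ A.T

# Algebraic solution: the eigenvectors of C2^{-1} C1 are the columns of
# A^{-T} up to scaling and permutation, since
#   C2^{-1} C1 = A^{-T} (D2^{-1} D1) A^{T}.
_, V = np.linalg.eig(np.linalg.inv(C2) @ C1)
V = np.real(V)

# V^T Ci V is then diagonal for both matrices simultaneously.
for C in (C1, C2):
    D = V.T @ C @ V
    assert np.allclose(D, np.diag(np.diag(D)), atol=1e-6)
```

With more than two matrices, or matrices that are only approximately jointly diagonalizable, the closed form is replaced by approximate joint diagonalization, but the approach remains algebraic rather than a generic gradient descent.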

5.1 Spatiotemporal BSS using joint diagonalization

Shifting to matrix notation, we interpret $\bar{R}_k(X) := \bar{R}_k({}^{t}x(t))$ as a symmetrized temporal autocorrelation matrix, whereas $\bar{R}_k(X^{\top}) := \bar{R}_k({}^{s}x(r))$ denotes the corresponding spatial, possibly multidimensional, symmetrized autocorrelation matrix. Here $k$ indexes the one- or multidimensional lags $\tau$. Ap-
