
Mathematics in Independent Component Analysis


1.4. Sparseness

Theorem 1.4.3 (SCA conditions). Assume that m ≤ n ≤ T and the matrix X ∈ R^(m×T) satisfies the following conditions:

(i) the columns of X lie in the union H of (n choose m−1) different hyperplanes, each column lies in only one such hyperplane, and each hyperplane contains at least m columns of X such that each m − 1 of them are linearly independent;

(ii) for each i ∈ {1, ..., n} there exist p = (n−1 choose m−2) different hyperplanes H_i,1, ..., H_i,p in H such that their intersection L_i = H_i,1 ∩ ⋯ ∩ H_i,p is a one-dimensional subspace;

(iii) any m different L_i span the whole R^m.

Then the matrix X is uniquely representable (up to permutation and scaling) as an SCA satisfying the conditions of theorem 1.4.1.
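The binomial counts in conditions (i) and (ii) simply enumerate which m − 1 sources may be simultaneously active: each such choice spans one hyperplane, and fixing source i as active leaves m − 2 free slots among the remaining n − 1 sources. A quick sanity check of the counting, with toy dimensions m = 3, n = 5 assumed purely for illustration:

```python
from math import comb

m, n = 3, 5                         # toy dimensions, assumed for illustration
n_hyperplanes = comb(n, m - 1)      # condition (i): choose the m-1 active sources
p = comb(n - 1, m - 2)              # condition (ii): source i is active, m-2 slots remain
print(n_hyperplanes, p)             # → 10 4
```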

Algorithms for SCA<br />

In Georgiev et al. (2004, 2005c), we also proposed an algorithm based on random sampling for reconstructing the mixing matrix and the sources; however, it could not easily be applied in noisy settings and high dimensions due to the involved combinatorial searches. Therefore, we derived a novel, robust algorithm for SCA in Theis et al. (2007a), see chapter 11. The key idea was that if the sources are sufficiently sparse, the mixtures cluster along hyperplanes in the mixture space. Based on this property, the mixing matrix could be reconstructed; furthermore, the property turned out to be robust against noise and outliers.
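The clustering of mixtures along hyperplanes can be made concrete with a minimal numpy sketch (toy sizes m = 3, n = 4 are assumed here): if at most m − 1 sources are active at each instant, every mixture sample lies in one of the (n choose m−1) hyperplanes spanned by m − 1 columns of the mixing matrix.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
m, n, T = 3, 4, 200                         # toy sizes (assumed for illustration)

A = rng.standard_normal((m, n))             # mixing matrix
S = np.zeros((n, T))
for t in range(T):                          # at most m-1 = 2 sources active per sample
    active = rng.choice(n, size=m - 1, replace=False)
    S[active, t] = rng.standard_normal(m - 1)
X = A @ S                                   # mixtures

# every x(t) lies in one of the C(n, m-1) = 6 hyperplanes spanned by
# m-1 columns of A; for m = 3 the hyperplane normal is a cross product
normals = [np.cross(A[:, i], A[:, j]) for i, j in combinations(range(n), 2)]
dists = np.abs(np.array(normals) @ X)       # |<normal, x(t)>| per hyperplane, per sample

# each sample has (numerically) zero distance to at least one hyperplane
print(dists.min(axis=0).max())
```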

The proposed algorithm employed a generalization of the Hough transform to detect the hyperplanes in the mixture space, see figure 1.12. This leads to an algorithmically robust matrix and source identification. The Hough-based hyperplane estimation does not depend on the source dimension n, only on the mixture dimension m. With respect to applications, this implies that n can be quite large and the hyperplanes will still be found, provided the grid resolution used in the Hough transform is sufficiently high. Increasing the grid resolution (in polynomial time) also improves the accuracy for higher source dimensions n.
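A brute-force sketch of the accumulator idea follows; this is not the implementation of Theis et al. (2007a), merely an assumed toy setup with a simple spherical grid of candidate hyperplane normals. Each mixture sample votes for every candidate hyperplane it (nearly) lies in, and accumulator peaks identify the hyperplanes:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, T = 3, 4, 400                         # toy sizes (assumed)

# mixtures clustered on hyperplanes through the origin: (m-1)-sparse sources
A = rng.standard_normal((m, n))
S = np.zeros((n, T))
for t in range(T):
    idx = rng.choice(n, size=m - 1, replace=False)
    S[idx, t] = rng.standard_normal(m - 1)
X = A @ S
X = X / np.linalg.norm(X, axis=0)           # only directions matter

# accumulator over a (theta, phi) grid of candidate unit normals (half-sphere)
res = 90                                    # grid resolution, tunable
theta, phi = np.meshgrid(np.linspace(0, np.pi, res),
                         np.linspace(0, np.pi, res), indexing="ij")
N = np.stack([np.sin(theta) * np.cos(phi),
              np.sin(theta) * np.sin(phi),
              np.cos(theta)], axis=-1).reshape(-1, 3)

# each sample votes for every candidate normal it is nearly orthogonal to
votes = (np.abs(N @ X) < 0.05).sum(axis=1)
best = N[votes.argmax()]                    # strongest detected hyperplane normal

# best should be (nearly) orthogonal to the m-1 = 2 columns of A
# spanning the detected hyperplane
cos = np.abs(best @ (A / np.linalg.norm(A, axis=0)))
print(np.sort(cos))
```

The grid size `res` plays the role of the resolution discussed above: refining it shrinks the angular error of the recovered normals at polynomial cost in the number of grid cells.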

For applications of the proposed SCA algorithms in signal processing and biomedical data analysis, we refer to section 1.6.3 and Georgiev et al. (2006, 2005a,b), Theis et al. (2007a). More elaborate source reconstruction methods, applicable once the mixing matrix A is known, were discussed in Theis et al. (2004a).

Postnonl<strong>in</strong>ear generalization<br />

In Theis and Amari (2004), see chapter 12, we considered the generalization of SCA to postnonlinear mixtures, see section 1.2.2. As before, the data x(t) = f(As(t)) is assumed to be linearly mixed followed by a componentwise nonlinearity, see equation (1.3). However, the (m × n)-matrix A is now allowed to be 'wide', i.e. the more complicated overcomplete situation with m < n is treated. By using the sparseness of s(t), we were still able to recover the system:
