Mathematics in Independent Component Analysis

160 Chapter 10. IEEE TNN 16(4):992-996, 2005

996 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 16, NO. 4, JULY 2005

the excellent performance of the other two of our algorithms in the underdetermined BSS problem, for the separation of artificially created signals with a sufficient level of sparseness.

REFERENCES

[1] F. Abrard, Y. Deville, and P. White, “From blind source separation to blind source cancellation in the underdetermined case: A new approach based on time-frequency analysis,” in Proc. 3rd Int. Conf. Independent Component Analysis and Signal Separation (ICA’2001), San Diego, CA, Dec. 9–13, 2001, pp. 734–739.
[2] A. Cichocki and S. Amari, Adaptive Blind Signal and Image Processing. New York: Wiley, 2002.
[3] S. Chen, D. Donoho, and M. Saunders, “Atomic decomposition by basis pursuit,” SIAM J. Sci. Comput., vol. 20, no. 1, pp. 33–61, 1998.
[4] D. Donoho and M. Elad, “Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ¹ minimization,” Proc. Nat. Acad. Sci., vol. 100, no. 5, pp. 2197–2202, 2003.
[5] A. Hyvärinen, J. Karhunen, and E. Oja, Independent Component Analysis. New York: Wiley, 2001.
[6] A. Jourjine, S. Rickard, and O. Yilmaz, “Blind separation of disjoint orthogonal signals: Demixing N sources from 2 mixtures,” in Proc. 2000 IEEE Conf. Acoustics, Speech, and Signal Processing (ICASSP’00), vol. 5, Istanbul, Turkey, Jun. 2000, pp. 2985–2988.
[7] T.-W. Lee, M. S. Lewicki, M. Girolami, and T. J. Sejnowski, “Blind source separation of more sources than mixtures using overcomplete representations,” IEEE Signal Process. Lett., vol. 6, no. 4, pp. 87–90, 1999.
[8] D. D. Lee and H. S. Seung, “Learning the parts of objects by nonnegative matrix factorization,” Nature, vol. 401, pp. 788–791, 1999.
[9] F. J. Theis, E. W. Lang, and C. G. Puntonet, “A geometric algorithm for overcomplete linear ICA,” Neurocomput., vol. 56, pp. 381–398, 2004.
[10] K. Waheed and F. Salem, “Algebraic overcomplete independent component analysis,” in Proc. Int. Conf. Independent Component Analysis (ICA’03), Nara, Japan, 2003, pp. 1077–1082.
[11] M. Zibulevsky and B. A. Pearlmutter, “Blind source separation by sparse decomposition in a signal dictionary,” Neural Comput., vol. 13, no. 4, pp. 863–882, 2001.

Equivalence Between RAM-Based Neural Networks and Probabilistic Automata

Marcilio C. P. de Souto, Teresa B. Ludermir, and Wilson R. de Oliveira

Abstract—In this letter, the computational power of a class of random access memory (RAM)-based neural networks, called general single-layer sequential weightless neural networks (GSSWNNs), is analyzed. The theoretical results presented, besides helping the understanding of the temporal behavior of these networks, could also provide useful insights for the development of new learning algorithms.

Index Terms—Automata theory, computability, random access memory (RAM) node, probabilistic automata, RAM-based neural networks, weightless neural networks (WNNs).

Manuscript received July 6, 2003; revised October 26, 2004.
M. C. P. de Souto is with the Department of Informatics and Applied Mathematics, Federal University of Rio Grande do Norte, Campus Universitario, Natal-RN 59072-970, Brazil (e-mail: marcilio@dimap.ufrn.br).
T. B. Ludermir is with the Center of Informatics, Federal University of Pernambuco, Recife 50732-970, Brazil.
W. R. de Oliveira is with the Department of Physics and Mathematics, Rural Federal University of Pernambuco, Recife 52171-900, Brazil.
Digital Object Identifier 10.1109/TNN.2005.849838

I. INTRODUCTION

The neuron model used in the great majority of work involving neural networks is related to variations of the McCulloch–Pitts neuron, which will be called the weighted neuron. A typical weighted neuron can be described by a linear weighted sum of the inputs, followed by some nonlinear transfer function [1], [2]. In this letter, however, the neural computing models studied are based on artificial neurons which often have binary inputs and outputs, and no adjustable weights between nodes. Neuron functions are stored in lookup tables, which can be implemented using commercially available random access memories (RAMs). These systems, and the nodes that they are composed of, will be described, respectively, as weightless neural networks (WNNs) and weightless nodes [1], [2]. They differ from other models, such as the weighted neural networks, whose training is accomplished by means of adjustments of weights. In the literature, the terms “RAM-based” and “N-tuple based” have been used to refer to WNNs.

In this letter, the computability (computational power) of a class of WNNs, called general single-layer sequential weightless neural networks (GSSWNNs), is investigated. Such a class is an important representative of the research on temporal pattern processing in WNNs [3]–[8]. As one of the contributions, an algorithm (constructive proof) to map any probabilistic automaton (PA) into a GSSWNN is presented. In fact, the proposed method not only allows the construction of any PA, but also increases the class of functions that can be computed by such networks. For instance, at a theoretical level, these networks are not restricted to finite-state languages (regular languages) and can now deal with some context-free languages. Practical motivations for investigating probabilistic automata and GSSWNNs are found in their possible application to, among other things, syntactic pattern recognition, multimodal search, and learning control [9].
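The letter gives no code, but the lookup-table idea can be made concrete with a minimal sketch of a single weightless (RAM) node. The class and method names below (RAMNode, train, recall) are illustrative only and not taken from the paper: the binary input tuple is read as an address into a table of stored outputs, and “training” simply overwrites the entry at that address rather than adjusting any weights.

```python
class RAMNode:
    """Minimal weightless (RAM-based) node: a lookup table addressed by a binary input tuple."""

    def __init__(self, n_inputs):
        self.n_inputs = n_inputs
        # One stored output per possible binary address, initialised to 0.
        self.table = [0] * (2 ** n_inputs)

    def _address(self, bits):
        # Interpret the binary input tuple as an integer address into the table.
        assert len(bits) == self.n_inputs and all(b in (0, 1) for b in bits)
        return int("".join(str(b) for b in bits), 2)

    def train(self, bits, output):
        # No weight adjustment: just write the desired output at the addressed location.
        self.table[self._address(bits)] = output

    def recall(self, bits):
        return self.table[self._address(bits)]


if __name__ == "__main__":
    node = RAMNode(3)
    node.train((1, 0, 1), 1)        # store output 1 for input pattern 101
    print(node.recall((1, 0, 1)))   # -> 1
    print(node.recall((0, 1, 1)))   # -> 0 (untrained address)
```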

II. DEFINITIONS

A. Probabilistic Automata

Probabilistic automata are a generalization of ordinary deterministic finite-state automata (DFA) for which an input symbol could take the automaton into any of its states with a certain probability [9].

Definition 2.1: A PA is a 5-tuple $A_P = (\Sigma, Q, H, s_I, F)$, where
• $\Sigma = \{\sigma_1, \sigma_2, \ldots, \sigma_{|\Sigma|}\}$ is a finite set of ordered symbols called the input alphabet;
• $Q = \{s_1, s_2, \ldots, s_n\}$ is a finite set of states;
• $H$ is a mapping of $\Sigma$ into the set of $n \times n$ stochastic state transition matrices (where $n$ is the number of states in $Q$). The interpretation of $H(a_k)$, $a_k \in \Sigma$, can be stated as follows: $H(a_k) = [p_{ij}(a_k)]$, where $p_{ij}(a_k) \ge 0$ is the probability of entering state $s_j$ from state $s_i$ under input $a_k$, and $\sum_{j=1}^{n} p_{ij}(a_k) = 1$ for all $i = 1, \ldots, n$. The domain of $H$ can be extended from $\Sigma$ to $\Sigma^*$ by defining the following:
1) $H(\epsilon) = I_n$, where $\epsilon$ is the empty string and $I_n$ is an $n \times n$ identity matrix;
2) $H(a_{k_1} a_{k_2} \cdots a_{k_m}) = H(a_{k_1}) H(a_{k_2}) \cdots H(a_{k_m})$, where $m \ge 2$ and $a_{k_i} \in \Sigma$, $i = 1, \ldots, m$;
• $s_I \in Q$ is the initial state in which the machine is found before the first symbol of the input string is processed;
• $F$ is the set of final states ($F \subseteq Q$).

The language accepted by a PA $A_P$ is $T(A_P) = \{(\omega, p(\omega)) \mid \omega \in \Sigma^*,\; p(\omega) = \pi_0 H(\omega) \pi_F > 0\}$, where: 1) $\pi_0$ is an $n$-dimensional row vector in which the $i$th component is equal to one if $s_i = s_I$, and 0 otherwise; and 2) $\pi_F$ is an $n$-dimensional column vector in which the $i$th component is equal to 1 if $s_i \in F$, and 0 otherwise.

The language accepted by $A_P$ with cut-point (threshold) $\lambda$, such that $0 \le \lambda < 1$, is $L(A_P, \lambda) = \{\omega \mid \omega \in \Sigma^* \text{ and } \pi_0 H(\omega) \pi_F > \lambda\}$.

Probabilistic automata recognize exactly the class of weighted regular languages (WRLs) [9]. Such a class of languages includes properly
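As a worked illustration of Definition 2.1 (a sketch with made-up transition probabilities, not an example from the letter), the snippet below builds a two-state PA over $\Sigma = \{a, b\}$, extends $H$ from symbols to strings by matrix multiplication, and evaluates $p(\omega) = \pi_0 H(\omega) \pi_F$ against a cut-point $\lambda$.

```python
import numpy as np

# Two-state PA over the alphabet {a, b}; the numbers are illustrative only.
# H maps each symbol to an n x n stochastic matrix (each row sums to 1).
H = {
    "a": np.array([[0.2, 0.8],
                   [0.6, 0.4]]),
    "b": np.array([[0.9, 0.1],
                   [0.3, 0.7]]),
}
n = 2
pi_0 = np.array([1.0, 0.0])   # row vector: initial state s_1
pi_F = np.array([0.0, 1.0])   # column vector: F = {s_2}

def H_string(word):
    """Extend H to strings: H(empty) = I_n, H(a_1 ... a_m) = H(a_1) ... H(a_m)."""
    M = np.eye(n)
    for symbol in word:
        M = M @ H[symbol]
    return M

def acceptance_probability(word):
    """p(word) = pi_0 H(word) pi_F."""
    return float(pi_0 @ H_string(word) @ pi_F)

lam = 0.5   # cut-point (threshold), 0 <= lam < 1
for w in ["", "a", "ab", "abb"]:
    p = acceptance_probability(w)
    print(f"p({w!r}) = {p:.3f}, in L(A_P, {lam})? {p > lam}")
```

Varying the cut-point $\lambda$ changes which strings are accepted, while $T(A_P)$ itself records the full weight $p(\omega)$ of every string.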

