Artificial Intelligence and Soft Computing: Behavioral ... - Arteimi.info

Trapping at local minima: Back-propagation adjusts the weights to reach the minima (vide fig. 14.8) of the error function (of weights). However, the network can be trapped at local minima. This problem can be solved by adding a momentum term to the training rule or by applying statistical training methods over the back-propagation algorithm [26].

[Figure: error function of the weights plotted as a surface over Weight 1 and Weight 2, showing two valleys, local minimum 1 and local minimum 2.]

Fig. 14.8: Valleys in the error function cause the back-propagation algorithm to be trapped at local minima.
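The trapping behaviour sketched in fig. 14.8 can be reproduced on a simple one-dimensional error surface; the function and all parameter values below are illustrative choices, not taken from the text:

```python
# Plain gradient descent trapped at a local minimum (illustrative sketch).
# E(w) = (w^2 - 1)^2 + 0.3*w has a global minimum near w = -1.035
# and a shallow local minimum near w = 0.96.

def grad(w):
    # dE/dw = 4*w**3 - 4*w + 0.3
    return 4 * w**3 - 4 * w + 0.3

w = 1.5          # initial weight, on the slope above the shallow valley
eta = 0.05       # learning rate
for _ in range(500):
    w -= eta * grad(w)

# Descent settles in the shallow local minimum near w = 0.96 and
# never reaches the deeper minimum near w = -1.035.
print(round(w, 3))
```

Because the update always follows the local downhill direction, the weight cannot climb the ridge separating the two valleys, which is exactly the trapping problem described above.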

Adding momentum to the training rule: To eliminate the problem of trapping at local minima, a momentum term has recently been added to the right-hand side of the adaptation rule [17]. Formally,

Wp,q,k(n+1) = Wp,q,k(n) + η δq,k Outp,j + α ∆Wp,q,k(n−1).

The last term on the right-hand side corresponds to momentum. The addition of the momentum term forces the system to continue moving in the same direction on the error surface, rather than being trapped at local minima. A question may naturally arise: why is this term called momentum? The answer comes from the analogy of a rolling ball with high momentum passing over a narrow hole.
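The effect of the momentum term can be seen on a simple one-dimensional error surface; the surface, η, and α below are illustrative choices, not values from the text:

```python
# Gradient descent with a momentum term escaping a shallow local minimum
# (illustrative sketch). E(w) = (w^2 - 1)^2 + 0.3*w has a shallow local
# minimum near w = 0.96 and a deeper global minimum near w = -1.035.

def grad(w):
    # dE/dw = 4*w**3 - 4*w + 0.3
    return 4 * w**3 - 4 * w + 0.3

w = 1.5           # initial weight, on the slope above the shallow valley
eta = 0.05        # learning rate (eta)
alpha = 0.9       # momentum coefficient (alpha)
delta_w = 0.0     # previous weight change, Delta w(n-1)

for _ in range(500):
    # Delta w(n) = -eta * dE/dw + alpha * Delta w(n-1)
    delta_w = -eta * grad(w) + alpha * delta_w
    w += delta_w

# Like the rolling ball, the accumulated weight change carries the system
# through the shallow valley and on to the deeper minimum near w = -1.035.
print(round(w, 3))
```

With α = 0, the update reduces to plain gradient descent, which stops in the shallow valley near w = 0.96; the α ∆w(n−1) term is what supplies the "momentum" to roll past it.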
