Neural Networks - Algorithms, Applications, and ... - Csbdu.in

Figure 3.3 serves as the reference for most of the discussion. The BPN is a layered, feedforward network that is fully interconnected by layers. Thus, there are no feedback connections and no connections that bypass one layer to go directly to a later layer. Although only three layers are used in the discussion, more than one hidden layer is permissible.

A neural network is called a mapping network if it is able to compute some functional relationship between its input and its output. For example, if the input to a network is the value of an angle, and the output is the cosine of that angle, the network performs the mapping θ → cos(θ). For such a simple function, we do not need a neural network; however, we might want to perform a complicated mapping where we do not know how to describe the functional relationship in advance, but we do know of examples of the correct mapping.

Figure 3.3  The three-layer BPN architecture follows closely the general network description given in Chapter 1. The bias weights, θ^h_j and θ^o_k, and the bias units are optional. The bias units provide a fictitious input value of 1 on a connection to the bias weight. We can then treat the bias weight (or simply, bias) like any other weight: it contributes to the net-input value to the unit, and it participates in the learning process like any other weight.
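To make the mapping example and the role of the bias concrete, here is a minimal sketch in Python/NumPy (not the book's code) of a three-layer feedforward network trained on the mapping θ → cos(θ). The layer sizes, tanh hidden units, learning rate, and number of epochs are illustrative assumptions; the bias is implemented exactly as the caption describes, as an ordinary weight driven by a fixed input of 1.

    # Minimal sketch of a three-layer feedforward (BPN-style) network.
    # All specific choices (sizes, tanh hidden units, learning rate) are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Network dimensions: 1 input, 10 hidden units, 1 output (assumed for the example).
    n_in, n_hid, n_out = 1, 10, 1

    # Weight matrices; the extra row in each is the bias weight, driven by a
    # fixed input of 1, so it is trained exactly like any other weight.
    W_h = rng.normal(scale=0.5, size=(n_in + 1, n_hid))   # input -> hidden (incl. bias row)
    W_o = rng.normal(scale=0.5, size=(n_hid + 1, n_out))  # hidden -> output (incl. bias row)

    def add_bias(x):
        # Append the fictitious bias input of 1 to each pattern.
        return np.hstack([x, np.ones((x.shape[0], 1))])

    def forward(x):
        h = np.tanh(add_bias(x) @ W_h)   # hidden activations
        y = add_bias(h) @ W_o            # linear output unit
        return h, y

    # Training examples of the correct mapping: angles and their cosines.
    theta = rng.uniform(-np.pi, np.pi, size=(200, 1))
    target = np.cos(theta)

    eta = 0.1  # learning rate (assumed)
    for epoch in range(5000):
        h, y = forward(theta)
        err = y - target
        # Output-layer gradient (linear units), then backpropagate through tanh.
        grad_o = add_bias(h).T @ err / len(theta)
        delta_h = (err @ W_o[:-1].T) * (1.0 - h**2)
        grad_h = add_bias(theta).T @ delta_h / len(theta)
        W_o -= eta * grad_o
        W_h -= eta * grad_h

    _, y = forward(theta)
    print("mean squared error:", float(np.mean((y - target) ** 2)))

The essential point is only that the rows of W_h and W_o corresponding to the appended input of 1 are the bias weights, and they are updated by exactly the same rule as every other weight in the network.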
