Neural Networks - Algorithms, Applications, and ... - Csbdu.in
8.2 ART1

In this case, the equilibrium activities are X = (-0.25, -0.25, -0.087, -0.25, 0.051)', with only one positive activity. The new output pattern is (0,0,0,0,1)', which exactly matches the input pattern, so no reset occurs. Even though unit 2 on F2 had previously encoded an input pattern, it gets recoded now to match the new input pattern, which is a subset of the original pattern. The new weight matrices appear as follows. For F1,

    0  0  0.756  0.756  0.756  0.756
    0  0  0.756  0.756  0.756  0.756
    0  0  0.756  0.756  0.756  0.756
    1  0  0.756  0.756  0.756  0.756
    0  1  0.756  0.756  0.756  0.756

For F2,

    0      0      0      1      0
    0      0      0      0      1
    0.329  0.329  0.329  0.329  0.329
    0.329  0.329  0.329  0.329  0.329
    0.329  0.329  0.329  0.329  0.329
    0.329  0.329  0.329  0.329  0.329

If we return to the superset vector, (0,0,1,0,1)', the initial forward propagation to F2 yields activities of (0.000, 1.000, 0.657, 0.657, 0.657, 0.657)', so unit 2 wins again. Going back to F1, V = (0,0,0,0,1)', and the equilibrium activities are (-0.25, -0.25, -0.071, -0.25, 0.051)'. The new outputs are (0,0,0,0,1)'. This time, we get a reset signal, since |S|/|I| = 0.5 < ρ. Thus, unit 2 on F2 is removed from competition, and the matching cycle is repeated with the original input vector restored on F1.

Propagating forward a second time results in activities on F2 of (0.000, 0.000, 0.657, 0.657, 0.657, 0.657)', where we have forced unit 2's activity to zero as a result of sustained inhibition from the orienting subsystem. We choose unit 3 as the winner this time, and it codes the input vector.
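The matching cycle just traced (forward propagation to F2, winner-take-all, the vigilance test |S|/|I| against ρ, and reset with sustained inhibition) can be sketched in code. This is a minimal illustration, not the full ART1 dynamics of this chapter: the F1 equilibrium computation is collapsed into a simple AND of the input with the winner's top-down template (the 2/3 rule), and the function name, weight values, and vigilance setting below are illustrative assumptions.

```python
import numpy as np

def art1_present(pattern, bottom_up, top_down, rho):
    """One matching cycle of a simplified binary ART1 network.

    pattern   : binary input vector I (length n)
    bottom_up : n x m bottom-up weight matrix (F1 -> F2)
    top_down  : m x n binary top-down template matrix (F2 -> F1)
    rho       : vigilance parameter in (0, 1]
    Returns the index of the F2 unit that comes to code the pattern.
    """
    I = np.asarray(pattern, dtype=float)
    inhibited = set()                       # F2 units removed by reset
    while True:
        # Forward propagation: net input to each F2 unit.
        T = bottom_up.T @ I
        T[list(inhibited)] = -np.inf        # sustained inhibition from reset
        j = int(np.argmax(T))               # winner-take-all on F2
        # 2/3 rule (simplified): F1 keeps only inputs matching the template.
        S = np.logical_and(I > 0, top_down[j] > 0).astype(float)
        # Vigilance test: |S|/|I| >= rho means an acceptable match.
        if S.sum() / I.sum() >= rho:
            return j                        # resonance: unit j codes I
        inhibited.add(j)                    # reset: exclude j, search on
```

With hypothetical weights in which unit 0 has encoded the subset pattern (0,0,0,0,1)' and the remaining units are uncommitted (all-ones templates), presenting the subset gives direct access to unit 0, while presenting the superset (0,0,1,0,1)' makes unit 0 win first, fail the vigilance test (|S|/|I| = 0.5 < ρ), and trigger a reset, after which an uncommitted unit is recruited. The search always terminates because an uncommitted, all-ones template matches any input perfectly.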
On subsequent presentations of the subset and superset vectors, each will access the appropriate F2 unit directly, without the need for a search. This result can be verified by direct calculation with the example presented in this section.

Exercise 8.6: Verify the statement made in the previous paragraph that presentations of (0,0,1,0,1)' and (0,0,0,0,1)' result in immediate access to the corresponding nodes on F2, without reset, by performing the appropriate calculations.

Exercise 8.7: What does the weight matrix on F2 look like after unit 3 encodes the superset vector given in the example in this section?

Exercise 8.8: What do you expect will happen if we apply (0,1,1,0,1)' to the example network? Note that this pattern is a superset of one already encoded. Verify your hypothesis by direct calculation.
