Neural Networks - Algorithms, Applications, and ... - Csbdu.in
Introduction to ANS Technology

The neurotransmitters diffuse across the junction and join to the postsynaptic membrane at certain receptor sites. The chemical action at the receptor sites results in changes in the permeability of the postsynaptic membrane to certain ionic species. An influx of positive species into the cell will tend to depolarize the resting potential; this effect is excitatory. If negative ions enter, a hyperpolarization effect occurs; this effect is inhibitory. Both effects are local effects that spread a short distance into the cell body and are summed at the axon hillock. If the sum is greater than a certain threshold, an action potential is generated.

1.1.3 Neural Circuits and Computation

Figure 1.8 illustrates several basic neural circuits that are found in the central nervous system. Figures 1.8(a) and (b) illustrate the principles of divergence and convergence in neural circuitry. Each neuron sends impulses to many other neurons (divergence), and receives impulses from many neurons (convergence). This simple idea appears to be the foundation for all activity in the central nervous system, and forms the basis for most neural-network models that we shall discuss in later chapters.

Notice the feedback paths in the circuits of Figure 1.8(b), (c), and (d). Since synaptic connections can be either excitatory or inhibitory, these circuits facilitate control systems having either positive or negative feedback.
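The local summation at the axon hillock described above can be sketched in a few lines of code. This is an illustrative simplification, not a model from the text: the input magnitudes and the threshold value here are arbitrary assumptions.

```python
def fires(excitatory, inhibitory, threshold):
    """Return True if the net local effect at the axon hillock
    exceeds the firing threshold.

    Excitatory inputs depolarize (push the sum up); inhibitory
    inputs hyperpolarize (push it down). An action potential is
    generated only if the summed effect exceeds the threshold.
    """
    net = sum(excitatory) - sum(inhibitory)
    return net > threshold

# Illustrative values only: net effect 0.6 + 0.5 - 0.3 = 0.8 > 0.5
print(fires([0.6, 0.5], [0.3], threshold=0.5))  # True
```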
Of course, these simple circuits do not adequately portray the vast complexity of neuroanatomy. Now that we have an idea of how individual neurons operate and of how they are put together, we can pose a fundamental question: How do these relatively simple concepts combine to give the brain its enormous abilities? The first significant attempt to answer this question was made in 1943, through the seminal work by McCulloch and Pitts [24]. This work is important for many reasons, not the least of which is that the investigators were the first people to treat the brain as a computational organism.

The McCulloch-Pitts theory is founded on five assumptions:

1. The activity of a neuron is an all-or-none process.
2. A certain fixed number of synapses (> 1) must be excited within a period of latent addition for a neuron to be excited.
3. The only significant delay within the nervous system is synaptic delay.
4. The activity of any inhibitory synapse absolutely prevents excitation of the neuron at that time.
5. The structure of the interconnection network does not change with time.

Assumption 1 identifies the neurons as being binary: they are either on or off. We can therefore define a predicate, N_i(t), which denotes the assertion that the ith neuron fires at time t. The notation ¬N_i(t) denotes the assertion that the ith neuron did not fire at time t. Using this notation, we can describe
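The five assumptions above can be captured in a short sketch of a single McCulloch-Pitts cell. The function name and the example threshold are my own illustrative choices, not from the text.

```python
def mcculloch_pitts(excitatory_inputs, inhibitory_inputs, theta):
    """Sketch of a McCulloch-Pitts cell (assumptions 1, 2, and 4).

    Output is all-or-none: 1 (fires) or 0 (does not fire).
    Any active inhibitory synapse absolutely prevents firing.
    Otherwise the cell fires iff at least `theta` excitatory
    synapses are active within the period of latent addition.
    Inputs are binary (0 or 1), matching assumption 1.
    """
    if any(inhibitory_inputs):          # assumption 4: absolute inhibition
        return 0
    if sum(excitatory_inputs) >= theta:  # assumption 2: fixed excitation count
        return 1
    return 0

# Two active excitatory synapses, no inhibition, theta = 2: the cell fires.
print(mcculloch_pitts([1, 1, 0], [0, 0], theta=2))  # 1
```

In a full network simulation, assumption 3 (synaptic delay as the only delay) would make the output of this function at time t the input to downstream cells at time t + 1, which is exactly the predicate N_i(t) described above.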
