Bernal S D_2010.pdf - University of Plymouth


3.4. EXISTING MODELS

representations (suggested by experimental evidence), which are enforced during the learning stage, and improve recognition performance.

A four-layer network with a 64x64 pixel input image simulates object recognition in the visual system. The network managed to correctly recognize, segment and reconstruct occluded versions of the trained images, although no invariance to position and size transformations is achieved. The study illustrates some interesting properties, such as the possibility of simulating imagination by running the network generatively (i.e. with top-down rather than bottom-up input); and expectation-driven segmentation, whereby the top-down input (e.g. prior expectations) improves recognition in cluttered scenes. However, the model fails to provide mechanisms for position and scale invariance during recognition. Furthermore, despite being based on a generative model, the resulting dynamic network derived from the simplified model is far from the original belief propagation scheme.
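The idea of "imagining" by running a network generatively can be illustrated with a toy example. The sketch below is not the model described above: it is a minimal linear two-layer network with hypothetical weights, where the same weight matrix is used bottom-up for recognition and top-down for generation. Clamping a hidden cause and propagating top-down, with no bottom-up input, reconstructs the input pattern that cause encodes.

```python
# Minimal sketch (illustrative weights, not the model discussed above):
# a linear generative network with a 3-pixel "image" and 2 hidden causes.
W = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]  # W[i][j]: weight from hidden cause j to pixel i

def generate(h):
    """Top-down pass: hidden cause vector -> image (x = W h)."""
    return [sum(W[i][j] * h[j] for j in range(2)) for i in range(3)]

def infer(x):
    """Bottom-up pass: image -> hidden code (h = W^T x)."""
    return [sum(W[i][j] * x[i] for i in range(3)) for j in range(2)]

# "Imagination": clamp the first hidden cause and run top-down only.
imagined = generate([1.0, 0.0])
print(imagined)  # the image pattern encoded by the first cause
```

In the same spirit, feeding a prior expectation as an extra top-down bias during the bottom-up pass is what allows expectation-driven segmentation in cluttered scenes.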

3.4.2.4 Comparison and conclusions

This subsection has outlined some of the attempts to model visual perception in the brain using the generative modelling approach, and in particular those employing algorithms similar to belief propagation. Table 3.2 lists the models, comparing the type of network, inference algorithm and results obtained in each case.

The complexity that emerges from the large-scale and intricate cortical connectivity means exact inference methods are intractable, making it necessary to use approximate solutions such as loopy belief propagation (George and Hawkins 2009), sampling methods (Hinton et al. 2006, Lee and Mumford 2003, Lewicki and Sejnowski 1997) or variational methods (Murray and Kreutz-Delgado 2007, Rao and Ballard 1999, Friston 2010). Sampling methods typically maintain the probabilistic nature and structure of Bayesian networks, while variational approximation methods yield a hierarchical dynamic network which deals with the optimization problem (minimizing the difference between the approximate and the true posterior distributions). Nevertheless, in both cases the resulting dynamics lead to local, recursive message-passing schemes reminiscent of belief propagation.
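The local, recursive character of these message-passing schemes can be made concrete with a small example. The sketch below runs standard sum-product loopy belief propagation on a hypothetical three-node loop of binary variables (the node and pairwise potentials are invented for illustration, not taken from any of the models above): each node repeatedly sends its neighbours a normalized message built only from its own potential and the messages it has received, and approximate marginals (beliefs) are read off at the end.

```python
# Loopy belief propagation (sum-product) on a 3-node loop of binary
# variables. Potentials are illustrative; the pairwise potential
# favours neighbouring nodes taking equal states.
nodes = [0, 1, 2]
edges = [(0, 1), (1, 2), (2, 0)]
unary = {0: [0.7, 0.3], 1: [0.5, 0.5], 2: [0.2, 0.8]}
pair = [[0.9, 0.1],
        [0.1, 0.9]]  # pair[xi][xj]: compatibility of neighbouring states

def neighbours(i):
    return [j for j in nodes if (i, j) in edges or (j, i) in edges]

# messages[(i, j)][x_j]: message from node i to node j, initialised uniform
messages = {(i, j): [1.0, 1.0] for i in nodes for j in neighbours(i)}

for _ in range(50):  # iterate the local update until (approximate) convergence
    new = {}
    for (i, j) in messages:
        msg = []
        for xj in (0, 1):
            total = 0.0
            for xi in (0, 1):
                prod = unary[i][xi] * pair[xi][xj]
                for k in neighbours(i):
                    if k != j:          # product over incoming messages, except j
                        prod *= messages[(k, i)][xi]
                total += prod
            msg.append(total)
        z = sum(msg)
        new[(i, j)] = [v / z for v in msg]  # normalise for numerical stability
    messages = new

def belief(i):
    """Approximate marginal: local potential times all incoming messages."""
    b = [unary[i][x] for x in (0, 1)]
    for k in neighbours(i):
        b = [b[x] * messages[(k, i)][x] for x in (0, 1)]
    z = sum(b)
    return [v / z for v in b]

print([belief(i) for i in nodes])
```

Because the graph contains a loop, the beliefs are only approximations to the true marginals; on tree-structured networks the same updates are exact, which is precisely the distinction drawn above between exact and loopy inference.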

Exact inference is only possible when the generative model avoids physiological constraints,

