Bernal S D_2010.pdf - University of Plymouth


3.4. EXISTING MODELS

is on models that use Bayesian belief propagation or similar inference algorithms in the context of hierarchical generative models. In particular, Section 3.4.1 describes several implementations of belief propagation using spiking neurons; Section 3.4.2 describes implementations of belief propagation at a higher level of abstraction, specifically those attempting to model object perception in the visual system; and Section 3.4.3 compares different speculative mappings of the algorithm over the cortical laminar circuitry.

3.4.1 Biological models with spiking neurons

There have been several proposals for how spiking neurons can implement belief propagation in graphical models such as Bayesian networks. Three of these models are described in this subsection.

3.4.1.1 Single layer hidden Markov model

The first one, by Rao (2004, 2005, 2006), describes a single-layered recurrent network that is able to perform a simple visual motion detection task. The input to the model is a 1-dimensional 30-pixel image with a moving pixel. The model contains 30 neurons, each one coding one of the 30 different states of a hidden Markov model. The states code a specific spatial location (15 locations with 2-pixel intervals) and the direction of motion (leftward or rightward). The firing rate of each neuron encodes the log of the posterior probability (belief) of being in a specific state, such that the neuron with the highest firing rate indicates the state of the world. To model the likelihood function, equivalent to the bottom-up messages, the input image was filtered by a set of feedforward weights (Gaussian functions), which represent the conditional probability function. The prior, or top-down message, was approximated by multiplying the posterior probability at the previous time-step by a set of recurrent weights which represent the transition probabilities between states.
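The update just described can be sketched as a forward-filtering step over the 30 states. The sketch below makes illustrative assumptions that are not taken from Rao's papers: the Gaussian receptive-field width, deterministic transitions with wrap-around at the edges, and the particular state ordering are all chosen only to keep the example minimal.

```python
# Minimal sketch of the forward-filtering computation described above.
# Assumptions (not from Rao's papers): Gaussian width, deterministic
# transitions with wrap-around, and the state ordering are illustrative.
import numpy as np

N_LOC, N_DIR = 15, 2          # 15 locations x 2 directions = 30 states
N_STATES = N_LOC * N_DIR
IMG = 30                      # 1-dimensional 30-pixel image

# Feedforward weights: one Gaussian receptive field per state,
# centred on the state's preferred location (2-pixel spacing).
centres = np.repeat(np.arange(N_LOC) * 2, N_DIR)
pix = np.arange(IMG)
W = np.exp(-(pix[None, :] - centres[:, None]) ** 2 / (2 * 2.0 ** 2))

# Recurrent weights: transition matrix; rightward states shift one
# location to the right, leftward states one to the left (wrapping).
T = np.zeros((N_STATES, N_STATES))
for loc in range(N_LOC):
    for d in range(N_DIR):                 # d=0 rightward, d=1 leftward
        s = loc * N_DIR + d
        nxt = ((loc + (1 if d == 0 else -1)) % N_LOC) * N_DIR + d
        T[nxt, s] = 1.0

def step(log_belief, image):
    """One update: firing rates encode the log posterior."""
    log_lik = np.log(W @ image + 1e-9)     # bottom-up message
    prior = T @ np.exp(log_belief)         # recurrent (top-down) message
    log_post = log_lik + np.log(prior + 1e-9)
    return log_post - np.max(log_post)     # normalise for stability

# Usage: a pixel moving rightward; the max-firing neuron tracks it.
log_b = np.full(N_STATES, -np.log(N_STATES))
for t in range(6):
    image = np.zeros(IMG)
    image[2 * t] = 1.0
    log_b = step(log_b, image)
state = np.argmax(log_b)
print("location:", (state // N_DIR) * 2,
      "direction:", "right" if state % N_DIR == 0 else "left")
# → location: 10 direction: right
```

After a few observations the neuron coding the rightward state at the pixel's current location has the highest "firing rate" (log posterior), which is the behaviour the paragraph above attributes to the model.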

The model was later extended by adding a second layer of Bayesian decision-making neurons that calculated a log-posterior ratio to perform the random-dot motion detection task. A similar implementation using a simple two-level hierarchical network with two interconnected pathways for features and locations, modelling the ventral and dorsal paths, was used to simulate
