
4.5. FEEDBACK PROCESSING

4.5.3.2 Dynamics of loopy belief propagation

Loopy belief propagation also requires all λ and π messages to be initialized to a flat distribution, so that, even during the first iteration, all nodes of the network propagate upward and downward messages. Except for the λ messages from the dummy nodes, which will contain evidence from the image, and the π messages from the root nodes, which will propagate the prior, the rest of the messages will propagate flat distributions during the first time step. During the following time steps the dummy nodes' evidence will propagate to the top layers, merging with the downward prior information and being modulated at each layer by the inherent parameters of the network contained in the CPTs. The dynamics of loopy belief propagation in the proposed Bayesian network are illustrated in Figure 4.18.
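The behaviour of this flat initialization can be made concrete with a small sketch. The snippet below is not the thesis implementation: it assumes a single five-layer chain of discrete nodes (a dummy evidence node at the bottom, a root node at the top) with made-up state counts, CPTs, evidence and prior, and applies standard Pearl-style λ/π message updates synchronously. Under these assumptions the dummy-node evidence needs four steps (the chain's diameter) to reach the root, mirroring Figure 4.18.

# Minimal sketch (Python/NumPy), not the thesis code: a single 5-layer chain
# with hypothetical CPTs, evidence and prior, showing how the initially flat
# messages evolve under synchronous Pearl-style updates.
import numpy as np

K = 3                                    # states per node (assumed)
L = 5                                    # layers: dummy (0) ... root (L-1)
rng = np.random.default_rng(0)

# cpts[l][i, j] = P(layer-l node = j | layer-(l+1) parent = i); random for the sketch
cpts = [rng.dirichlet(np.ones(K), size=K) for _ in range(L - 1)]

flat = np.ones(K) / K
lam = [flat.copy() for _ in range(L)]    # upward lambda message arriving at each layer
pi  = [flat.copy() for _ in range(L)]    # downward pi message arriving at each layer

evidence = np.array([0.8, 0.15, 0.05])   # lambda message from the dummy node (assumed)
prior    = np.array([0.5, 0.3, 0.2])     # pi message (prior) at the root node (assumed)
lam[0], pi[L - 1] = evidence, prior

def step(lam, pi):
    """Synchronous update: every layer recomputes its incoming messages at once."""
    new_lam, new_pi = [m.copy() for m in lam], [m.copy() for m in pi]
    for l in range(1, L):                # lambda climbs one layer per step
        m = cpts[l - 1] @ lam[l - 1]
        new_lam[l] = m / m.sum()
    for l in range(L - 2, -1, -1):       # pi descends one layer per step
        m = pi[l + 1] @ cpts[l]
        new_pi[l] = m / m.sum()
    return new_lam, new_pi               # lam[0] and pi[L-1] are never overwritten

for t in range(L - 1):                   # diameter of the chain: 4 steps
    lam, pi = step(lam, pi)

belief_root = lam[L - 1] * pi[L - 1]     # evidence has now reached the root
print(belief_root / belief_root.sum())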

The computational cost of updating the beliefs of nodes at all layers at every time step is very high. An alternative approach to reduce this cost is to update only the belief of a given layer at each time step, as in tree-structured networks. For the majority of results presented in this thesis the model implemented an upward belief update as opposed to the complete belief update. This is illustrated in Figure 4.19 (and sketched in code after the caption of Figure 4.18 below). For purposes of clarity, each simulation step thus consists of five loopy belief propagation steps. This simplification of loopy belief propagation is justified in the sense that evidence arrives from the lower layer dummy nodes and thus only the belief of the nodes in the adjacent layer will provide meaningful information. All the computation required to calculate the π messages and beliefs in the upper layers during the first time-steps shown in Figure 4.18 is now saved. Further, it means evidence propagated from the dummy nodes will only be modulated at each layer by the initial flat top-down π messages, thus increasing the chance of a good recognition performance. The main disadvantage of this method is the asymmetry between bottom-up and top-down propagation of evidence, as a belief update or

Caption for Figure 4.18: Dynamics of loopy belief propagation in the proposed model. At t=0 all messages are initialized to a flat distribution (symbolized with an 'f' in the figure) except for the λ message from the dummy nodes and the top level π message, or prior distribution. At t=1, once the initial flat π messages are multiplied by the corresponding node CPTs they generate non-flat beliefs and subsequent non-flat π and λ messages (see Figure 3.7 for a numeric example). The non-flat feedforward λ messages, λ_dummy, will also modulate the belief at each node and the subsequent λ messages generated. However, nodes with an incoming flat λ message will also generate flat output λ messages. For this reason, it takes 4 time-steps (the diameter of the network) to propagate the lower level evidence, λ_dummy, to the top node. The bottom right image symbolically illustrates the existence of loops in the network and how this leads to the recursive nature and double-counting of messages in loopy belief propagation.
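For comparison with the complete schedule, the following sketch mimics the upward belief update described above: a single simulation step sweeps the layers bottom-up, refreshing one layer per propagation step (five steps for the assumed five layers) while the top-down π messages keep their initial flat values. It reuses the hypothetical cpts, evidence and prior arrays from the previous sketch and is again only an illustration under those assumptions, not the model's actual code.

def upward_simulation_step(evidence, prior, cpts):
    """One simulation step of the upward schedule: each propagation step updates
    only the next layer up, so the dummy-node evidence reaches the root within a
    single sweep rather than after 'diameter' synchronous steps."""
    L, K = len(cpts) + 1, evidence.shape[0]
    lam = [evidence] + [np.ones(K) / K for _ in range(L - 1)]
    pi = [np.ones(K) / K for _ in range(L - 1)] + [prior]   # top-down messages stay flat
    for l in range(1, L):                 # bottom-up sweep, one layer per step
        m = cpts[l - 1] @ lam[l - 1]
        lam[l] = m / m.sum()
        # beliefs and pi messages of the layers above l are not recomputed here,
        # which is the computation saved relative to the complete update
    beliefs = [lam[l] * pi[l] for l in range(L)]
    return [b / b.sum() for b in beliefs]

beliefs = upward_simulation_step(evidence, prior, cpts)
print(beliefs[-1])                        # root belief after one upward sweep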

