26.2: The sum–product algorithm

...messages to their respective neighbours from the start.

For all leaf variable nodes $n$:  $q_{n\to m}(x_n) = 1$           (26.13)
For all leaf factor nodes $m$:    $r_{m\to n}(x_n) = f_m(x_n)$.   (26.14)

We can then adopt the procedure used in Chapter 16's message-passing rule-set B (p. 242): a message is created in accordance with the rules (26.11, 26.12) only if all the messages on which it depends are present. For example, in figure 26.4, the message from $x_1$ to $f_1$ will be sent only when the message from $f_4$ to $x_1$ has been received; and the message from $x_2$ to $f_2$, $q_{2\to 2}$, can be sent only when the messages $r_{4\to 2}$ and $r_{5\to 2}$ have both been received.

Messages will thus flow through the tree, one in each direction along every edge, and after a number of steps equal to the diameter of the graph, every message will have been created.

The answers we require can then be read out. The marginal function of $x_n$ is obtained by multiplying all the incoming messages at that node,

    $Z_n(x_n) = \prod_{m \in \mathcal{M}(n)} r_{m\to n}(x_n)$.    (26.15)

[Figure 26.4. Our model factor graph for the function $P^*(\mathbf{x})$ (26.4): variable nodes $x_1, x_2, x_3$ and factor nodes $f_1, \ldots, f_5$.]

The normalizing constant $Z$ can be obtained by summing any marginal function, $Z = \sum_{x_n} Z_n(x_n)$, and the normalized marginals obtained from

    $P_n(x_n) = \frac{Z_n(x_n)}{Z}$.    (26.16)

⊲ Exercise 26.2.[2] Apply the sum–product algorithm to the function defined in equation (26.4) and figure 26.1. Check that the normalized marginals are consistent with what you know about the repetition code $R_3$.

Exercise 26.3.[3] Prove that the sum–product algorithm correctly computes the marginal functions $Z_n(x_n)$ if the graph is tree-like.

Exercise 26.4.[3] Describe how to use the messages computed by the sum–product algorithm to obtain more complicated marginal functions in a tree-like graph, for example $Z_{1,2}(x_1, x_2)$, for two variables $x_1$ and $x_2$ that are connected to one common factor node.

Starting and finishing, method 2

Alternatively, the algorithm can be initialized by setting all the initial messages from variables to 1:

    for all $n, m$:  $q_{n\to m}(x_n) = 1$,    (26.17)

then proceeding with the factor message update rule (26.12), alternating with the variable message update rule (26.11). Compared with method 1, this lazy initialization method leads to a load of wasted computations, whose results are gradually flushed out by the correct answers computed by method 1.

After a number of iterations equal to the diameter of the factor graph, the algorithm will converge to a set of messages satisfying the sum–product relationships (26.11, 26.12).

Exercise 26.5.[2] Apply this second version of the sum–product algorithm to the function defined in equation (26.4) and figure 26.1.
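To make the message-passing concrete, here is a minimal Python sketch of the sum–product algorithm on a tree-like factor graph over binary variables, using the lazy initialization of method 2 (26.17), the factor and variable updates (26.12, 26.11), and the read-out of equations (26.15, 26.16). Everything in it is an illustrative assumption rather than the book's own code: the function sum_product, the dictionary format for factors, and the example tables for f1, ..., f5 (single-variable factors plus equality constraints tying neighbouring variables together, in the spirit of the repetition code R3) are hypothetical, and equation (26.4) itself is not reproduced here.

```python
import itertools
from collections import defaultdict

# A minimal sum-product sketch for a tree-shaped factor graph over binary
# variables.  Each factor is a table mapping value-tuples to non-negative
# reals; the graph structure is read off the factor scopes.

def sum_product(variables, factors, domain=(0, 1)):
    """Return (Z, normalized marginals P_n) for a tree-like factor graph.

    variables : list of variable names
    factors   : dict  name -> (scope, table), scope a tuple of variable
                names, table a dict from value-tuples to reals.
    """
    # Neighbour lists M(n): which factors touch each variable.
    var_nbrs = defaultdict(list)
    for fname, (scope, _) in factors.items():
        for v in scope:
            var_nbrs[v].append(fname)

    # Method 2 initialization (26.17): all variable-to-factor messages are 1.
    q = {(v, f): {x: 1.0 for x in domain}              # q_{n->m}(x_n)
         for v in variables for f in var_nbrs[v]}
    r = {(f, v): {x: 1.0 for x in domain}              # r_{m->n}(x_n)
         for f, (scope, _) in factors.items() for v in scope}

    # Sweep the two update rules; on a tree the diameter of the graph
    # (at most 2 * number of factors) bounds the sweeps needed.
    for _ in range(2 * len(factors) + 2):
        # Factor-to-variable update (26.12): sum the factor times incoming
        # q messages over all variables in its scope except the target.
        for fname, (scope, table) in factors.items():
            for v in scope:
                others = [u for u in scope if u != v]
                for x in domain:
                    total = 0.0
                    for assignment in itertools.product(domain, repeat=len(others)):
                        vals = dict(zip(others, assignment))
                        vals[v] = x
                        term = table[tuple(vals[u] for u in scope)]
                        for u in others:
                            term *= q[(u, fname)][vals[u]]
                        total += term
                    r[(fname, v)][x] = total
        # Variable-to-factor update (26.11): product of incoming r messages
        # from all the other neighbouring factors.
        for v in variables:
            for fname in var_nbrs[v]:
                for x in domain:
                    prod = 1.0
                    for other in var_nbrs[v]:
                        if other != fname:
                            prod *= r[(other, v)][x]
                    q[(v, fname)][x] = prod

    # Read out marginals (26.15) and normalize (26.16).
    marginals = {}
    Z = None
    for v in variables:
        Zn = {x: 1.0 for x in domain}
        for fname in var_nbrs[v]:
            for x in domain:
                Zn[x] *= r[(fname, v)][x]
        Z = sum(Zn.values())          # the same Z from every variable on a tree
        marginals[v] = {x: Zn[x] / Z for x in domain}
    return Z, marginals


if __name__ == "__main__":
    # Hypothetical factors in the spirit of figure 26.4: f1, f2, f3 attach to
    # single variables (e.g. channel likelihoods) and f4, f5 are equality
    # constraints tying neighbouring variables together.  The numbers are
    # illustrative only, not the actual definition of equation (26.4).
    eq = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): 0.0, (1, 0): 0.0}
    factors = {
        "f1": (("x1",), {(0,): 0.9, (1,): 0.1}),
        "f2": (("x2",), {(0,): 0.9, (1,): 0.1}),
        "f3": (("x3",), {(0,): 0.1, (1,): 0.9}),
        "f4": (("x1", "x2"), eq),
        "f5": (("x2", "x3"), eq),
    }
    Z, P = sum_product(["x1", "x2", "x3"], factors)
    print("Z =", Z)
    for v, dist in P.items():
        print(v, dist)
```

Running the script prints $Z$ and the three normalized marginals; the fact that every variable node yields the same $Z$ at convergence is one simple consistency check in the spirit of exercise 26.2.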
