Information Theory, Inference, and Learning ... - Inference Group
Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981. You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

... and any non-horizontal edge is associated with a one bit. (Thus in this representation we no longer need to label the edges in the trellis.) Figure 25.7b shows the trellis corresponding to the parity-check matrix of equation (25.20).

25.5 Solutions

Solution to exercise 25.4 (p.329). The posterior probability over codewords is shown in table 25.8. The most probable codeword is 0000000.

Table 25.8. The posterior probability over codewords for exercise 25.4.

  t         Likelihood   Posterior probability
  0000000   0.026        0.3006
  0001011   0.00041      0.0047
  0010111   0.0037       0.0423
  0011100   0.015        0.1691
  0100110   0.00041      0.0047
  0101101   0.00010      0.0012
  0110001   0.015        0.1691
  0111010   0.0037       0.0423
  1000101   0.00041      0.0047
  1001110   0.00010      0.0012
  1010010   0.015        0.1691
  1011001   0.0037       0.0423
  1100011   0.00010      0.0012
  1101000   0.00041      0.0047
  1110100   0.0037       0.0423
  1111111   0.000058     0.0007

The marginal posterior probabilities of all seven bits are:

  n   P(y_n | t_n = 1)   P(y_n | t_n = 0)   P(t_n = 1 | y)   P(t_n = 0 | y)
  1   0.2                0.8                0.266            0.734
  2   0.2                0.8                0.266            0.734
  3   0.9                0.1                0.677            0.323
  4   0.2                0.8                0.266            0.734
  5   0.2                0.8                0.266            0.734
  6   0.2                0.8                0.266            0.734
  7   0.2                0.8                0.266            0.734

So the bitwise decoding is 0010000, which is not actually a codeword.

Solution to exercise 25.9 (p.330). The MAP codeword is 101, and its likelihood is 1/8. The normalizing constant of the sum–product algorithm is Z = alpha_I = 3/16. The intermediate alpha_i are (from left to right) 1/2, 1/4, 5/16, 4/16; the intermediate beta_i are (from right to left) 1/2, 1/8, 9/32, 3/16. The bitwise decoding is: P(t_1 = 1 | y) = 3/4; P(t_2 = 1 | y) = 1/4; P(t_3 = 1 | y) = 5/6. The codewords' probabilities are 1/12, 2/12, 1/12, 8/12 for 000, 011, 110, 101.
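The numbers in exercise 25.4 can be reproduced by brute force: enumerate the sixteen codewords of table 25.8, multiply the per-bit channel likelihoods from the marginal table, normalize, and marginalize each bit. This is a sketch for checking the arithmetic, not the trellis-based method the chapter develops; the codeword list and the per-bit likelihoods are read straight off the tables above.

```python
# The 16 codewords of the (7,4) Hamming code, copied from table 25.8.
codewords = [
    "0000000", "0001011", "0010111", "0011100",
    "0100110", "0101101", "0110001", "0111010",
    "1000101", "1001110", "1010010", "1011001",
    "1100011", "1101000", "1110100", "1111111",
]

# Per-bit channel likelihoods from the marginal table: bit 3 was
# received looking like a 1; every other bit looks like a 0.
p_given_1 = [0.2, 0.2, 0.9, 0.2, 0.2, 0.2, 0.2]
p_given_0 = [0.8, 0.8, 0.1, 0.8, 0.8, 0.8, 0.8]

def likelihood(cw):
    """P(y | t = cw): product of the per-bit channel likelihoods."""
    L = 1.0
    for n, bit in enumerate(cw):
        L *= p_given_1[n] if bit == "1" else p_given_0[n]
    return L

Z = sum(likelihood(cw) for cw in codewords)
posterior = {cw: likelihood(cw) / Z for cw in codewords}

# Codeword (block) decoding: the codeword with maximum posterior.
map_codeword = max(posterior, key=posterior.get)

# Bitwise decoding: marginalize each bit, then threshold at 1/2.
marginals = [sum(p for cw, p in posterior.items() if cw[n] == "1")
             for n in range(7)]
bitwise = "".join("1" if m > 0.5 else "0" for m in marginals)
```

Running this gives `map_codeword = "0000000"` and `bitwise = "0010000"`, matching the solution, and `bitwise` is indeed not in the codeword list.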
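The exercise 25.9 figures can likewise be checked by summing over the four codewords directly (rather than running the sum–product algorithm on the trellis). The per-codeword likelihoods below are recovered from the stated posteriors and Z — likelihood = posterior × Z, e.g. (8/12)(3/16) = 1/8 for 101 — so this is an assumed reconstruction, exact in rational arithmetic:

```python
from fractions import Fraction

# Codewords of the length-3 code in exercise 25.9, with likelihoods
# reconstructed as posterior * Z from the solution's numbers.
likelihoods = {
    "000": Fraction(1, 64),
    "011": Fraction(1, 32),
    "110": Fraction(1, 64),
    "101": Fraction(1, 8),
}

# Normalizing constant: should equal the solution's Z = 3/16.
Z = sum(likelihoods.values())
posterior = {cw: L / Z for cw, L in likelihoods.items()}

# Bitwise marginals P(t_n = 1 | y), summing over codewords.
marginals = [sum(p for cw, p in posterior.items() if cw[n] == "1")
             for n in range(3)]
```

This recovers Z = 3/16, posteriors 1/12, 2/12, 1/12, 8/12, and bitwise marginals 3/4, 1/4, 5/6, in agreement with the solution.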
