Information Theory, Inference, and Learning ... - Inference Group
Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981
You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

1.2: Error-correcting codes for the binary symmetric channel                11

[Figure 1.15, panels (a)-(e′), showing received vectors written into the three-circle diagram, is not reproduced here.]
Figure 1.15. Pictorial representation of decoding of the Hamming (7, 4) code. The received vector is written into the diagram as shown in (a). In (b, c, d, e), the received vector is shown, assuming that the transmitted vector was as in figure 1.13b and the bits labelled by ⋆ were flipped. The violated parity checks are highlighted by dashed circles. One of the seven bits is the most probable suspect to account for each 'syndrome', i.e., each pattern of violated and satisfied parity checks. In examples (b), (c), and (d), the most probable suspect is the one bit that was flipped. In example (e), two bits have been flipped, s3 and t7. The most probable suspect is r2, marked by a circle in (e′), which shows the output of the decoding algorithm.

    Syndrome z    Unflip this bit
    000           none
    001           r7
    010           r6
    011           r4
    100           r5
    101           r1
    110           r2
    111           r3

Algorithm 1.16. Actions taken by the optimal decoder for the (7, 4) Hamming code, assuming a binary symmetric channel with small noise level f. The syndrome vector z lists whether each parity check is violated (1) or satisfied (0), going through the checks in the order of the bits r5, r6, and r7.

If you try flipping any one of the seven bits, you'll find that a different syndrome is obtained in each case – seven non-zero syndromes, one for each bit. There is only one other syndrome, the all-zero syndrome. So if the channel is a binary symmetric channel with a small noise level f, the optimal decoder unflips at most one bit, depending on the syndrome, as shown in algorithm 1.16.
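As an illustrative sketch (not code from the book), the table of algorithm 1.16 translates directly into a decoder. The parity checks below assume the encoding convention used in this chapter's running example, t5 = s1+s2+s3, t6 = s2+s3+s4, t7 = s1+s3+s4 (mod 2); with that assumption the sketch reproduces the syndrome table exactly.

```python
# Which received bits each parity check involves (bits numbered r1..r7).
# Each check recomputes one parity bit and compares it with the received one.
CHECKS = [(1, 2, 3, 5),   # check containing r5
          (2, 3, 4, 6),   # check containing r6
          (1, 3, 4, 7)]   # check containing r7

# Syndrome z -> the single bit to unflip (None for the all-zero syndrome),
# i.e. the table of algorithm 1.16.
UNFLIP = {(0, 0, 0): None, (0, 0, 1): 7, (0, 1, 0): 6, (0, 1, 1): 4,
          (1, 0, 0): 5, (1, 0, 1): 1, (1, 1, 0): 2, (1, 1, 1): 3}

def decode(r):
    """Optimal decoder for the BSC with small noise level f.

    r is a list of 7 bits, r[0] = r1, ..., r[6] = r7.
    Computes the syndrome, unflips at most one bit, and returns the
    decoded 7-bit vector; its first four bits are the guess for the source.
    """
    z = tuple(sum(r[i - 1] for i in check) % 2 for check in CHECKS)
    r = list(r)
    suspect = UNFLIP[z]
    if suspect is not None:
        r[suspect - 1] ^= 1
    return r
```

For example, with the codeword 1000101 and a single flip of r2, `decode([1,1,0,0,1,0,1])` recovers `[1,0,0,0,1,0,1]`; with the two flips of figure 1.15e (r3 and r7), `decode([1,0,1,0,1,0,0])` yields `[1,1,1,0,1,0,0]`, which differs from the transmitted vector in three places, as the text describes.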
Each syndrome could have been caused by other noise patterns too, but any other noise pattern that has the same syndrome must be less probable because it involves a larger number of noise events.

What happens if the noise actually flips more than one bit? Figure 1.15e shows the situation when two bits, r3 and r7, are received flipped. The syndrome, 110, makes us suspect the single bit r2; so our optimal decoding algorithm flips this bit, giving a decoded pattern with three errors as shown in figure 1.15e′. If we use the optimal decoding algorithm, any two-bit error pattern will lead to a decoded seven-bit vector that contains three errors.

General view of decoding for linear codes: syndrome decoding

We can also describe the decoding problem for a linear code in terms of matrices. The first four received bits, r1r2r3r4, purport to be the four source bits; and the received bits r5r6r7 purport to be the parities of the source bits, as defined by the generator matrix G. We evaluate the three parity-check bits for the received bits, r1r2r3r4, and see whether they match the three received bits, r5r6r7. The differences (modulo 2) between these two triplets are called the syndrome of the received vector. If the syndrome is zero – if all three parity checks are happy – then the received vector is a codeword, and the most probable decoding is
