…a source bit is flipped. Are parity bits or source bits more likely to be among these three flipped bits, or are all seven bits equally likely to be corrupted when the noise vector has weight two? The Hamming code is in fact completely symmetric in the protection it affords to the seven bits (assuming a binary symmetric channel). [This symmetry can be proved by showing that the role of a parity bit can be exchanged with a source bit and the resulting code is still a (7, 4) Hamming code; see below.] The probability that any one bit ends up corrupted is the same for all seven bits. So the probability of bit error (for the source bits) is simply three sevenths of the probability of block error.

\[
p_b \simeq \tfrac{3}{7}\, p_B \simeq 9 f^2 . \tag{1.48}
\]

Symmetry of the Hamming (7, 4) code

To prove that the (7, 4) code protects all bits equally, we start from the parity-check matrix

\[
H = \begin{bmatrix}
1 & 1 & 1 & 0 & 1 & 0 & 0 \\
0 & 1 & 1 & 1 & 0 & 1 & 0 \\
1 & 0 & 1 & 1 & 0 & 0 & 1
\end{bmatrix} . \tag{1.49}
\]

The symmetry among the seven transmitted bits will be easiest to see if we reorder the seven bits using the permutation (t_1 t_2 t_3 t_4 t_5 t_6 t_7) → (t_5 t_2 t_3 t_4 t_1 t_6 t_7). Then we can rewrite H thus:

\[
H = \begin{bmatrix}
1 & 1 & 1 & 0 & 1 & 0 & 0 \\
0 & 1 & 1 & 1 & 0 & 1 & 0 \\
0 & 0 & 1 & 1 & 1 & 0 & 1
\end{bmatrix} . \tag{1.50}
\]

Now, if we take any two parity constraints that t satisfies and add them together, we get another parity constraint. For example, row 1 asserts t_5 + t_2 + t_3 + t_1 = even, and row 2 asserts t_2 + t_3 + t_4 + t_6 = even, and the sum of these two constraints is

\[
t_5 + 2 t_2 + 2 t_3 + t_1 + t_4 + t_6 = \text{even}; \tag{1.51}
\]

we can drop the terms 2t_2 and 2t_3, since they are even whatever t_2 and t_3 are; thus we have derived the parity constraint t_5 + t_1 + t_4 + t_6 = even, which we can if we wish add into the parity-check matrix as a fourth row. [The set of vectors satisfying Ht = 0 will not be changed.] We thus define

\[
H' = \begin{bmatrix}
1 & 1 & 1 & 0 & 1 & 0 & 0 \\
0 & 1 & 1 & 1 & 0 & 1 & 0 \\
0 & 0 & 1 & 1 & 1 & 0 & 1 \\
1 & 0 & 0 & 1 & 1 & 1 & 0
\end{bmatrix} . \tag{1.52}
\]

The fourth row is the sum (modulo two) of the top two rows. Notice that the second, third, and fourth rows are all cyclic shifts of the top row. If, having added the fourth redundant constraint, we drop the first constraint, we obtain a new parity-check matrix H'',

\[
H'' = \begin{bmatrix}
0 & 1 & 1 & 1 & 0 & 1 & 0 \\
0 & 0 & 1 & 1 & 1 & 0 & 1 \\
1 & 0 & 0 & 1 & 1 & 1 & 0
\end{bmatrix} , \tag{1.53}
\]

which still satisfies H''t = 0 for all codewords, and which looks just like the starting H in (1.50), except that all the columns have shifted along one place to the right, with the rightmost column reappearing at the left (a cyclic permutation of the columns).
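
To make the row-manipulation argument concrete, here is a minimal sketch in Python (my own illustration, not part of the book) that enumerates all 2^7 binary vectors and checks that H of (1.50), H' of (1.52), and H'' of (1.53) single out exactly the same set of 16 codewords.

```python
# Sketch only: verify that adding a redundant row to H, then dropping the
# first row, leaves the set {t : Ht = 0 (mod 2)} unchanged.
from itertools import product

H_150 = [[1, 1, 1, 0, 1, 0, 0],
         [0, 1, 1, 1, 0, 1, 0],
         [0, 0, 1, 1, 1, 0, 1]]                      # H of (1.50)

# H' of (1.52): append the sum (mod 2) of rows 1 and 2 as a fourth row.
H_prime = H_150 + [[(a + b) % 2 for a, b in zip(H_150[0], H_150[1])]]

# H'' of (1.53): drop the first constraint from H'.
H_dprime = H_prime[1:]

def codewords(H):
    """Return the set of length-7 binary vectors t with H t = 0 (mod 2)."""
    return {t for t in product((0, 1), repeat=7)
            if all(sum(h * x for h, x in zip(row, t)) % 2 == 0 for row in H)}

C1, C2, C3 = codewords(H_150), codewords(H_prime), codewords(H_dprime)
print(len(C1), C1 == C2 == C3)   # expected output: 16 True
```

The check relies only on the facts used in the text: the extra row of H' lies in the row space of H over GF(2), and H'' still has rank 3, so all three matrices define the same code.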
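
The relation p_b ≃ (3/7) p_B ≃ 9f² of (1.48) can also be checked by simulation. The sketch below is my own illustration, not the book's: the encoder and syndrome decoder are built from the constraints of the parity-check matrix (1.49), and the noise level f = 0.02 and the number of trial blocks are illustrative choices.

```python
# Rough Monte Carlo estimate of p_B and p_b for the (7,4) Hamming code
# with single-error syndrome decoding on a binary symmetric channel.
import random

H = [[1, 1, 1, 0, 1, 0, 0],
     [0, 1, 1, 1, 0, 1, 0],
     [1, 0, 1, 1, 0, 0, 1]]          # parity-check matrix of (1.49)

def encode(s):
    """Map 4 source bits to a 7-bit codeword satisfying Ht = 0 (mod 2)."""
    t5 = (s[0] + s[1] + s[2]) % 2
    t6 = (s[1] + s[2] + s[3]) % 2
    t7 = (s[0] + s[2] + s[3]) % 2
    return s + [t5, t6, t7]

def decode(r):
    """Syndrome decoding: flip the single bit whose column of H matches z."""
    z = tuple(sum(h * x for h, x in zip(row, r)) % 2 for row in H)
    if any(z):
        cols = [tuple(row[j] for row in H) for j in range(7)]
        r = r[:]
        r[cols.index(z)] ^= 1
    return r

f, N = 0.02, 200_000                  # illustrative noise level and trial count
block_errs = bit_errs = 0
for _ in range(N):
    s = [random.randint(0, 1) for _ in range(4)]
    t = encode(s)
    r = [b ^ (random.random() < f) for b in t]   # binary symmetric channel
    d = decode(r)
    if d != t:
        block_errs += 1
    bit_errs += sum(a != b for a, b in zip(d[:4], s))

pB, pb = block_errs / N, bit_errs / (4 * N)
print(pB, pb, 3 / 7 * pB, 9 * f**2)   # pb should be close to (3/7) pB ≈ 9 f^2
```

For small f the estimated p_b should track both (3/7) p_B and 9f² closely; at larger noise levels the leading-order approximation 9f² becomes loose while the 3/7 ratio remains a good guide.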