Information Theory, Inference, and Learning ... - Inference Group

Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981 You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

[Figure 9.5. A non-confusable subset of inputs for the noisy typewriter.]

[Figure 9.6. Extended channels obtained from a binary symmetric channel with transition probability 0.15; the panels show N = 1, N = 2 and N = 4.]

How does this translate into the terms of the theorem? The following table explains.

The theorem, and how it applies to the noisy typewriter:

  Theorem: Associated with each discrete memoryless channel, there is a non-negative number C.
  Typewriter: The capacity C is log2 9.

  Theorem: For any ε > 0 and R < C, for large enough N,
  Typewriter: No matter what ε and R are, we set the blocklength N to 1.

  Theorem: there exists a block code of length N and rate ≥ R
  Typewriter: The block code is {B, E, ..., Z}. The value of K is given by 2^K = 9, so K = log2 9, and this code has rate log2 9, which is greater than the requested value of R.

  Theorem: and a decoding algorithm,
  Typewriter: The decoding algorithm maps the received letter to the nearest letter in the code;

  Theorem: such that the maximal probability of block error is < ε.
  Typewriter: the maximal probability of block error is zero, which is less than the given ε.

9.7 Intuitive preview of proof

Extended channels

To prove the theorem for any given channel, we consider the extended channel corresponding to N uses of the channel. The extended channel has |A_X|^N possible inputs x and |A_Y|^N possible outputs. Extended channels obtained from a binary symmetric channel and from a Z channel are shown in figures 9.6 and 9.7, with N = 2 and N = 4.
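The zero-error claim for the noisy typewriter can be checked directly. The sketch below (not from the book; the symbol list and helper function are my own illustration) models the channel as a ring of 27 symbols in which each input is received as itself or one of its two neighbours, each with probability 1/3, and verifies that the output sets of the 9 codewords {B, E, ..., Z} are disjoint, so the nearest-letter decoder never errs.

```python
import string
from math import log2

# Noisy typewriter: 27 symbols (A-Z and '-') arranged in a ring; each
# typed letter is received as itself or one of its two neighbours,
# each with probability 1/3.
symbols = list(string.ascii_uppercase) + ['-']
n = len(symbols)  # 27

def outputs(letter):
    """Set of symbols the channel can produce when `letter` is typed."""
    i = symbols.index(letter)
    return {symbols[(i - 1) % n], symbols[i], symbols[(i + 1) % n]}

# The non-confusable code: every third letter, 9 codewords in all.
code = symbols[1::3]
assert code == ['B', 'E', 'H', 'K', 'N', 'Q', 'T', 'W', 'Z']

# The output sets of distinct codewords are disjoint and together cover
# all 27 symbols, so nearest-letter decoding has zero block error.
out_sets = [outputs(c) for c in code]
all_outputs = set().union(*out_sets)
assert sum(len(s) for s in out_sets) == len(all_outputs) == 27

# Rate of this N = 1 code: K = log2 9 bits per channel use, equal to C.
print(f"rate = log2({len(code)}) = {log2(len(code)):.3f} bits")
```

Since the 9 output sets partition all 27 received symbols, the decoder is well defined everywhere and the maximal probability of block error is exactly zero, as the table states.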
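The extended channel's transition matrix can be built concretely, since N independent uses of a channel correspond to the N-fold Kronecker product of the single-use matrix. The following sketch (my own illustration, not the book's code) does this for the binary symmetric channel of figure 9.6 with flip probability f = 0.15, confirming that the extended channel has 2^N inputs and 2^N outputs:

```python
import numpy as np

# Single use of a binary symmetric channel with flip probability f:
# Q1[y, x] = P(y | x).  f = 0.15 as in figure 9.6.
f = 0.15
Q1 = np.array([[1 - f, f],
               [f, 1 - f]])

def extended(Q, N):
    """Transition matrix of the extended channel (N independent uses),
    built as the N-fold Kronecker product of the single-use matrix."""
    QN = np.ones((1, 1))
    for _ in range(N):
        QN = np.kron(QN, Q)
    return QN

for N in (1, 2, 4):
    QN = extended(Q1, N)
    print(N, QN.shape)  # shape is (2**N, 2**N)
    # Each input's output probabilities must sum to 1.
    assert np.allclose(QN.sum(axis=0), 1.0)
```

For N = 2, for example, the probability that the input 00 is received unchanged is (1 - f)^2 = 0.7225, the (0, 0) entry of the 4 x 4 matrix. The same construction applies to the Z channel of figure 9.7 by swapping in its single-use matrix.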
