Information Theory, Inference, and Learning ... - Inference Group

Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981
You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

About Chapter 13

In Chapters 8–11, we established Shannon's noisy-channel coding theorem for a general channel with any input and output alphabets. A great deal of attention in coding theory focuses on the special case of channels with binary inputs. Constraining ourselves to these channels simplifies matters, and leads us into an exceptionally rich world, which we will only taste in this book.

One of the aims of this chapter is to point out a contrast between Shannon's aim of achieving reliable communication over a noisy channel and the apparent aim of many in the world of coding theory. Many coding theorists take as their fundamental problem the task of packing as many spheres as possible, with radius as large as possible, into an N-dimensional space, with no spheres overlapping. Prizes are awarded to people who find packings that squeeze in an extra few spheres. While this is a fascinating mathematical topic, we shall see that the aim of maximizing the distance between codewords in a code has only a tenuous relationship to Shannon's aim of reliable communication.
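As a concrete illustration of the sphere-packing picture (a sketch not drawn from the text itself): in a binary code the sphere centres are the codewords, and spheres of Hamming radius t = ⌊(d − 1)/2⌋ around them do not overlap, where d is the code's minimum distance. The short Python sketch below computes d for the standard (7,4) Hamming code, using one conventional choice of generator matrix.

```python
from itertools import product

# One standard generator matrix for the binary (7,4) Hamming code,
# in systematic form [I_4 | P].
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 0, 1],
]

def encode(msg):
    """Encode a 4-bit message: codeword bit j = sum_i msg[i]*G[i][j] mod 2."""
    return tuple(sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G))

# All 2^4 = 16 codewords.
codewords = [encode(m) for m in product([0, 1], repeat=4)]

def hamming(a, b):
    """Hamming distance: number of coordinates in which a and b differ."""
    return sum(x != y for x, y in zip(a, b))

# Minimum distance over all distinct codeword pairs.
d_min = min(hamming(a, b) for a in codewords for b in codewords if a != b)
t = (d_min - 1) // 2  # radius of the non-overlapping spheres
print(d_min, t)  # 3 1
```

Here d = 3, so spheres of radius 1 around the 16 codewords are disjoint; each sphere contains 1 + 7 = 8 points, and 16 × 8 = 2^7 fills the whole space, which is why this code is called perfect. Shannon's theorem, by contrast, asks only that decoding errors be rare for typical noise, not that every sphere be disjoint.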
