[Figure 10.5. How expurgation works: (a) a random code ⇒ (b) the expurgated code. In a typical random code, a small fraction of the codewords are involved in collisions – pairs of codewords are sufficiently close to each other that the probability of error when either codeword is transmitted is not tiny. We obtain a new code from a random code by deleting all these confusable codewords. The resulting code (b) has slightly fewer codewords, so has a slightly lower rate, and its maximal probability of error is greatly reduced.]

2. Since the average probability of error over all codes is < 2δ, there must exist a code with mean probability of block error p_B(C) < 2δ.

3. To show that not only the average but also the maximal probability of error, p_BM, can be made small, we modify this code by throwing away the worst half of the codewords – the ones most likely to produce errors. Those that remain must all have conditional probability of error less than 4δ. We use these remaining codewords to define a new code. This new code has 2^(NR′ − 1) codewords, i.e., we have reduced the rate from R′ to R′ − 1/N (a negligible reduction, if N is large), and achieved p_BM < 4δ.

   This trick is called expurgation (figure 10.5). The resulting code may not be the best code of its rate and length, but it is still good enough to prove the noisy-channel coding theorem, which is what we are trying to do here.

In conclusion, we can 'construct' a code of rate R′ − 1/N, where R′ < C − 3β, with maximal probability of error < 4δ. We obtain the theorem as stated by setting R′ = (R + C)/2, δ = ε/4, β < (C − R′)/3, and N sufficiently large for the remaining conditions to hold. The theorem's first part is thus proved. ✷
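The step in item 3 is a Markov-inequality counting argument: if the code's mean conditional error probability is below 2δ, then fewer than half of its codewords can have conditional error probability of 4δ or more, so deleting the worst half leaves only codewords below 4δ. The following numerical sketch (not part of the text; the per-codeword error probabilities and the NumPy calls are my own illustrative choices) acts out that counting argument.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic per-codeword conditional error probabilities for a 'random code'.
    # Most codewords have tiny error probability; a few are involved in
    # collisions and are often confused.  (These numbers are invented purely
    # to illustrate the counting argument, not derived from any real channel.)
    M = 2**10                                    # number of codewords, 2^(N R')
    p_err = rng.beta(1, 200, size=M)             # mostly tiny values
    bad = rng.choice(M, size=20, replace=False)
    p_err[bad] = rng.uniform(0.3, 0.9, size=20)  # the confusable codewords

    delta = p_err.mean() / 2                     # define delta so the mean error is 2*delta

    # Expurgation: delete the worst half of the codewords.
    order = np.argsort(p_err)
    kept = order[: M // 2]                       # 2^(N R' - 1) codewords remain

    # If half or more of the codewords had error >= 4*delta, the mean would
    # already exceed 2*delta; hence every surviving codeword is below 4*delta.
    assert p_err[kept].max() < 4 * delta
    print(f"mean error = {2 * delta:.4f}   worst survivor = {p_err[kept].max():.4f}   4*delta = {4 * delta:.4f}")

Whatever nonnegative error probabilities are fed in, the assertion holds, because the bound on the survivors follows from the mean alone.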
[Figure 10.6. Portion of the R, p_b plane proved achievable in the first part of the theorem. We've proved that the maximal probability of block error p_BM can be made arbitrarily small, so the same goes for the bit error probability p_b, which must be smaller than p_BM.]

10.4 Communication (with errors) above capacity

We have proved, for any discrete memoryless channel, the achievability of a portion of the R, p_b plane shown in figure 10.6. We have shown that we can turn any noisy channel into an essentially noiseless binary channel with rate up to C bits per cycle. We now extend the right-hand boundary of the region of achievability at non-zero error probabilities. [This is called rate-distortion theory.]

We do this with a new trick. Since we know we can make the noisy channel into a perfect channel with a smaller rate, it is sufficient to consider communication with errors over a noiseless channel. How fast can we communicate over a noiseless channel, if we are allowed to make errors?

Consider a noiseless binary channel, and assume that we force communication at a rate greater than its capacity of 1 bit. For example, if we require the sender to attempt to communicate at R = 2 bits per cycle then he must effectively throw away half of the information. What is the best way to do this if the aim is to achieve the smallest possible probability of bit error? One simple strategy is to communicate a fraction 1/R of the source bits, and ignore the rest. The receiver guesses the missing fraction 1 − 1/R at random, and the average probability of bit error is then p_b = (1 − 1/R)/2.
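A quick simulation sketch of this guessing strategy (again not from the text; the rate R = 2, the block of 10^6 source bits, and the seed are arbitrary choices) confirms the arithmetic: half of the randomly guessed bits come out wrong, so p_b ≈ (1 − 1/R)/2, i.e. 0.25 for R = 2.

    import numpy as np

    rng = np.random.default_rng(1)

    R = 2.0          # attempted rate over a noiseless binary channel (capacity 1 bit)
    K = 1_000_000    # number of source bits in this trial (arbitrary)

    source = rng.integers(0, 2, size=K)

    # Transmit a fraction 1/R of the source bits verbatim; ignore the rest.
    n_sent = int(K / R)
    received = source.copy()
    # The receiver guesses the missing fraction (1 - 1/R) of the bits at random.
    received[n_sent:] = rng.integers(0, 2, size=K - n_sent)

    p_b = np.mean(received != source)
    print(f"simulated p_b = {p_b:.4f}   predicted (1 - 1/R)/2 = {(1 - 1/R) / 2:.4f}")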
