Information Theory, Inference, and Learning ... - Inference Group
Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981
You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

9.8: Further exercises    155

…given-typical-x set, $2^{NH(Y|X)}$. So the number of non-confusable inputs, if they are selected from the set of typical inputs $x \sim X^N$, is $\le 2^{NH(Y) - NH(Y|X)} = 2^{NI(X;Y)}$.

The maximum value of this bound is achieved if $X$ is the ensemble that maximizes $I(X;Y)$, in which case the number of non-confusable inputs is $\le 2^{NC}$. Thus asymptotically up to $C$ bits per cycle, and no more, can be communicated with vanishing error probability. ✷

This sketch has not rigorously proved that reliable communication really is possible – that's our task for the next chapter.

9.8 Further exercises

Exercise 9.15. [3, p.159] Refer back to the computation of the capacity of the Z channel with $f = 0.15$.

(a) Why is $p_1^*$ less than 0.5? One could argue that it is good to favour the 0 input, since it is transmitted without error – and also argue that it is good to favour the 1 input, since it often gives rise to the highly prized 1 output, which allows certain identification of the input! Try to make a convincing argument.

(b) In the case of general $f$, show that the optimal input distribution is
\[
p_1^* = \frac{1/(1-f)}{1 + 2^{H_2(f)/(1-f)}}. \tag{9.19}
\]

(c) What happens to $p_1^*$ if the noise level $f$ is very close to 1?

Exercise 9.16. [2, p.159] Sketch graphs of the capacity of the Z channel, the binary symmetric channel and the binary erasure channel as a function of $f$.

⊲ Exercise 9.17. [2] What is the capacity of the five-input, ten-output channel whose transition probability matrix is
\[
Q = \begin{bmatrix}
0.25 & 0 & 0 & 0 & 0.25 \\
0.25 & 0 & 0 & 0 & 0.25 \\
0.25 & 0.25 & 0 & 0 & 0 \\
0.25 & 0.25 & 0 & 0 & 0 \\
0 & 0.25 & 0.25 & 0 & 0 \\
0 & 0.25 & 0.25 & 0 & 0 \\
0 & 0 & 0.25 & 0.25 & 0 \\
0 & 0 & 0.25 & 0.25 & 0 \\
0 & 0 & 0 & 0.25 & 0.25 \\
0 & 0 & 0 & 0.25 & 0.25
\end{bmatrix}? \tag{9.20}
\]
(Rows are labelled by the ten outputs 0–9 and columns by the five inputs 0–4.)
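For Exercises 9.15 and 9.16, the three channel capacities have simple closed forms, and the optimal Z-channel input distribution is given in equation (9.19). The sketch below, with function names of my own choosing and a crude grid search standing in for the analytic maximization, checks (9.19) numerically at $f = 0.15$.

```python
import math

def H2(p):
    """Binary entropy in bits; H2(0) = H2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def z_channel_mutual_info(p1, f):
    """I(X;Y) for the Z channel: input 1 flips to 0 with probability f,
    input 0 is transmitted without error."""
    # P(y=1) = p1 * (1 - f), so H(Y) = H2(p1 * (1 - f));
    # only the x=1 input is noisy, so H(Y|X) = p1 * H2(f).
    return H2(p1 * (1 - f)) - p1 * H2(f)

def z_channel_optimal_p1(f):
    """Closed-form optimal input distribution, equation (9.19)."""
    return (1 / (1 - f)) / (1 + 2 ** (H2(f) / (1 - f)))

def bsc_capacity(f):
    """Binary symmetric channel: C = 1 - H2(f)."""
    return 1 - H2(f)

def bec_capacity(f):
    """Binary erasure channel: C = 1 - f."""
    return 1 - f

f = 0.15
p1_star = z_channel_optimal_p1(f)
# Grid search over p1 confirms the closed form maximizes I(X;Y).
grid = [i / 10000 for i in range(1, 10000)]
p1_numeric = max(grid, key=lambda p1: z_channel_mutual_info(p1, f))
print(f"p1* = {p1_star:.4f} (grid search: {p1_numeric:.4f}), "
      f"C(Z channel) = {z_channel_mutual_info(p1_star, f):.4f} bits")
```

Evaluating `z_channel_mutual_info` at `z_channel_optimal_p1(f)`, together with `bsc_capacity` and `bec_capacity`, over $f \in [0, 1]$ gives the three capacity curves asked for in Exercise 9.16.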
Exercise 9.18. [2, p.159] Consider a Gaussian channel with binary input $x \in \{-1, +1\}$ and real output alphabet $A_Y$, with transition probability density
\[
Q(y \mid x, \alpha, \sigma) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(y - x\alpha)^2}{2\sigma^2}}, \tag{9.21}
\]
where $\alpha$ is the signal amplitude.

(a) Compute the posterior probability of $x$ given $y$, assuming that the two inputs are equiprobable. Put your answer in the form
\[
P(x = 1 \mid y, \alpha, \sigma) = \frac{1}{1 + e^{-a(y)}}. \tag{9.22}
\]
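Dividing the two Gaussian likelihoods in (9.21), the $(y \pm \alpha)^2$ terms cancel except for a cross term, which suggests $a(y) = 2\alpha y / \sigma^2$ as a candidate for the form (9.22). The sketch below (function names are mine, not the book's) checks this candidate numerically against a direct application of Bayes' rule.

```python
import math

def gaussian_density(y, mean, sigma):
    """Transition density (9.21): a Gaussian with the given mean and sigma."""
    return (math.exp(-(y - mean) ** 2 / (2 * sigma ** 2))
            / math.sqrt(2 * math.pi * sigma ** 2))

def posterior_direct(y, alpha, sigma):
    """P(x = 1 | y) by Bayes' rule with equiprobable inputs x = +/-1."""
    p_plus = gaussian_density(y, alpha, sigma)
    p_minus = gaussian_density(y, -alpha, sigma)
    return p_plus / (p_plus + p_minus)

def posterior_sigmoid(y, alpha, sigma):
    """The same posterior in the form (9.22), taking a(y) = 2*alpha*y/sigma^2."""
    a = 2 * alpha * y / sigma ** 2
    return 1 / (1 + math.exp(-a))

# The two routes agree for any y; a few sample points:
for y in (-1.0, 0.0, 0.3, 2.0):
    print(y, posterior_direct(y, 1.0, 0.5), posterior_sigmoid(y, 1.0, 0.5))
```

Note that the posterior is exactly 1/2 at $y = 0$, and that a larger signal-to-noise ratio $\alpha/\sigma$ makes the sigmoid steeper, i.e. the input is identified more confidently from the same output.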
