Information Theory, Inference, and Learning ... - Inference Group
Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981
You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

5 — Symbol Codes

5.7 Summary

Kraft inequality. If a code is uniquely decodeable, its lengths must satisfy

    ∑_i 2^{-l_i} ≤ 1.    (5.25)

  For any lengths satisfying the Kraft inequality, there exists a prefix code with those lengths.

Optimal source codelengths for an ensemble are equal to the Shannon information contents

    l_i = log_2 (1/p_i),    (5.26)

  and conversely, any choice of codelengths defines implicit probabilities

    q_i = 2^{-l_i} / z,    (5.27)

  where z = ∑_i 2^{-l_i} normalizes the q_i to sum to one.

The relative entropy D_KL(p||q) measures how many bits per symbol are wasted by using a code whose implicit probabilities are q, when the ensemble's true probability distribution is p.

Source coding theorem for symbol codes. For an ensemble X, there exists a prefix code whose expected length satisfies

    H(X) ≤ L(C, X) < H(X) + 1.    (5.28)

The Huffman coding algorithm generates an optimal symbol code iteratively. At each iteration, the two least probable symbols are combined.

5.8 Exercises

⊲ Exercise 5.19. [2] Is the code {00, 11, 0101, 111, 1010, 100100, 0110} uniquely decodeable?

⊲ Exercise 5.20. [2] Is the ternary code {00, 012, 0110, 0112, 100, 201, 212, 22} uniquely decodeable?

Exercise 5.21. [3, p.106] Make Huffman codes for X², X³ and X⁴ where A_X = {0, 1} and P_X = {0.9, 0.1}. Compute their expected lengths and compare them with the entropies H(X²), H(X³) and H(X⁴).
  Repeat this exercise for X² and X⁴ where P_X = {0.6, 0.4}.

Exercise 5.22. [2, p.106] Find a probability distribution {p_1, p_2, p_3, p_4} such that there are two optimal codes that assign different lengths {l_i} to the four symbols.

Exercise 5.23. [3] (Continuation of exercise 5.22.) Assume that the four probabilities {p_1, p_2, p_3, p_4} are ordered such that p_1 ≥ p_2 ≥ p_3 ≥ p_4 ≥ 0. Let Q be the set of all probability vectors p such that there are two optimal codes with different lengths. Give a complete description of Q. Find three probability vectors q^(1), q^(2), q^(3), which are the convex hull of Q, i.e., such that any p ∈ Q can be written as

    p = μ_1 q^(1) + μ_2 q^(2) + μ_3 q^(3),    (5.29)

  where {μ_i} are positive.
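The summary above maps directly onto a few lines of code. The sketch below (not from the book; `huffman_lengths` is a name chosen here) builds Huffman codelengths for the extended ensemble X² of exercise 5.21 with P_X = {0.9, 0.1}, then checks the Kraft inequality (5.25) and the source coding theorem bound (5.28):

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Optimal codeword lengths via Huffman's algorithm: repeatedly merge
    the two least probable nodes; every symbol inside a merged subtree
    gains one bit of codeword length."""
    # heap entries: (probability, tie-breaker, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

# X^2 for A_X = {0, 1}, P_X = {0.9, 0.1}: outcomes 00, 01, 10, 11
probs = [0.81, 0.09, 0.09, 0.01]
lengths = huffman_lengths(probs)
L = sum(p * l for p, l in zip(probs, lengths))   # expected length L(C, X)
H = -sum(p * log2(p) for p in probs)             # entropy H(X^2)
assert sum(2 ** -l for l in lengths) <= 1        # Kraft inequality (5.25)
assert H <= L < H + 1                            # source coding theorem (5.28)
```

Here H(X²) ≈ 0.94 bits while L(C, X²) = 1.29 bits; the per-symbol overhead shrinks as larger blocks X³, X⁴ are coded, which is the point of exercise 5.21.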

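Exercises 5.19 and 5.20 ask whether given codes are uniquely decodeable. One way to decide such questions mechanically (a standard technique, though not stated in this excerpt) is the Sardinas–Patterson test: a code is uniquely decodeable iff no set of "dangling suffixes", generated iteratively from the codewords, ever contains a codeword. A minimal sketch, with the function name chosen here:

```python
def is_uniquely_decodeable(code):
    """Sardinas–Patterson test: the code is uniquely decodeable iff no
    iteratively generated set of dangling suffixes contains a codeword."""
    code = set(code)

    def suffixes(A, B):
        # residue b[len(a):] whenever a in A is a proper prefix of b in B
        return {b[len(a):] for a in A for b in B if a != b and b.startswith(a)}

    S = suffixes(code, code)     # S_1: dangling suffixes among codewords
    seen = set()
    while S:
        if S & code:
            return False         # a dangling suffix is itself a codeword
        if S <= seen:
            return True          # no new suffixes can appear: decodeable
        seen |= S
        S = suffixes(code, S) | suffixes(S, code)
    return True

assert is_uniquely_decodeable(["0", "10", "110", "111"])   # prefix code
assert is_uniquely_decodeable(["0", "01"])                 # not prefix, still UD
assert not is_uniquely_decodeable(["0", "1", "01"])        # "01" is ambiguous
```

Running this on the binary and ternary codes of exercises 5.19 and 5.20 settles both questions; doing the suffix bookkeeping by hand is the intended [2]-rated exercise.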