11 — Error-Correcting Codes and Real Channels

Infinite inputs

How much information can we convey in one use of a Gaussian channel? If we are allowed to put any real number x into the Gaussian channel, we could communicate an enormous string of N digits d_1 d_2 d_3 ... d_N by setting x = d_1 d_2 d_3 ... d_N 000...000. The amount of error-free information conveyed in just a single transmission could be made arbitrarily large by increasing N, and the communication could be made arbitrarily reliable by increasing the number of zeroes at the end of x. There is usually some power cost associated with large inputs, however, not to mention practical limits in the dynamic range acceptable to a receiver. It is therefore conventional to introduce a cost function v(x) for every input x, and constrain codes to have an average cost $\bar v$ less than or equal to some maximum value. A generalized channel coding theorem, including a cost function for the inputs, can be proved – see McEliece (1977). The result is a channel capacity $C(\bar v)$ that is a function of the permitted cost. For the Gaussian channel we will assume a cost
\[
v(x) = x^2 \tag{11.22}
\]
such that the 'average power' $\overline{x^2}$ of the input is constrained. We motivated this cost function above in the case of real electrical channels, in which the physical power consumption is indeed quadratic in x. The constraint $\overline{x^2} = \bar v$ makes it impossible to communicate infinite information in one use of the Gaussian channel.

Infinite precision

It is tempting to define joint, marginal, and conditional entropies for real variables simply by replacing summations by integrals, but this is not a well-defined operation. As we discretize an interval into smaller and smaller divisions, the entropy of the discrete distribution diverges (as the logarithm of the granularity) (figure 11.4); a small numerical illustration of this divergence appears at the end of the section. Also, it is not permissible to take the logarithm of a dimensional quantity such as a probability density P(x) (whose dimensions are [x]^{-1}).

[Figure 11.4. (a) A probability density P(x). Question: can we define the 'entropy' of this density? (b) We could evaluate the entropies of a sequence of probability distributions with decreasing grain-size g, but these entropies tend to $\int P(x) \log \frac{1}{P(x)\,g} \, dx$, which is not independent of g: the entropy goes up by one bit for every halving of g. $\int P(x) \log \frac{1}{P(x)} \, dx$ is an illegal integral.]

There is one information measure, however, that has a well-behaved limit, namely the mutual information – and this is the one that really matters, since it measures how much information one variable conveys about another. In the discrete case,
\[
I(X;Y) = \sum_{x,y} P(x,y) \log \frac{P(x,y)}{P(x)\,P(y)}. \tag{11.23}
\]
Now because the argument of the log is a ratio of two probabilities over the same space, it is OK to have P(x,y), P(x) and P(y) be probability densities and to replace the sum by an integral:
\[
I(X;Y) = \int \! dx \, dy \; P(x,y) \log \frac{P(x,y)}{P(x)\,P(y)} \tag{11.24}
\]
\[
\qquad\;\; = \int \! dx \, dy \; P(x)\,P(y \mid x) \log \frac{P(y \mid x)}{P(y)}. \tag{11.25}
\]
We can now ask these questions for the Gaussian channel: (a) what probability distribution P(x) maximizes the mutual information (subject to the constraint $\overline{x^2} = v$)? And (b) does the maximal mutual information still measure the maximum error-free communication rate of this real channel, as it did for the discrete channel?
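To make the 'one extra bit per halving of g' behaviour concrete, here is a minimal numerical sketch, not taken from the text: it assumes a unit-variance Gaussian density, a finite grid of half-width 8, and standard NumPy, and computes the entropy of the discretized distribution for a sequence of grain sizes.

    import numpy as np

    def discretized_entropy(g, width=8.0):
        """Entropy (in bits) of a unit-variance Gaussian density discretized with grain size g."""
        x = np.arange(-width, width, g)                   # bin centres
        p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi) * g    # bin probabilities, approx. P(x) * g
        p = p / p.sum()                                   # renormalize: the finite grid truncates the tails
        return -(p * np.log2(p)).sum()

    for g in [0.5, 0.25, 0.125, 0.0625]:
        print(f"g = {g:6.4f}   H = {discretized_entropy(g):.3f} bits")

Each halving of g adds very nearly one bit to H, consistent with the $\int P(x) \log \frac{1}{P(x)\,g}\, dx$ behaviour quoted in the caption of figure 11.4.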

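As a companion to question (a) and equation (11.25), the following Monte Carlo sketch estimates the mutual information of a Gaussian channel numerically. It is an illustration rather than the book's method: the Gaussian input, the power v = 1 and the noise variance σ² = 0.25 are arbitrary choices made here. Because the input is Gaussian, the marginal P(y) is also Gaussian with variance v + σ², so both densities in the log ratio of (11.25) are available in closed form, and the sample average can be checked against the standard Gaussian-input identity $\frac{1}{2}\log_2(1 + v/\sigma^2)$.

    import numpy as np

    rng = np.random.default_rng(0)
    v, sigma2 = 1.0, 0.25                            # illustrative input power and noise variance
    n = 1_000_000

    x = rng.normal(0.0, np.sqrt(v), n)               # Gaussian input (an illustrative choice)
    y = x + rng.normal(0.0, np.sqrt(sigma2), n)      # Gaussian channel: y = x + noise

    # log2 P(y|x) and log2 P(y); for this input, P(y) is Gaussian with variance v + sigma2
    log2_p_y_given_x = -0.5 * np.log2(2 * np.pi * sigma2) - (y - x) ** 2 / (2 * sigma2 * np.log(2))
    log2_p_y = -0.5 * np.log2(2 * np.pi * (v + sigma2)) - y ** 2 / (2 * (v + sigma2) * np.log(2))

    # Equation (11.25): average the log ratio over samples drawn from P(x) P(y|x)
    print("Monte Carlo estimate of I(X;Y):", (log2_p_y_given_x - log2_p_y).mean(), "bits")
    print("0.5 * log2(1 + v/sigma2)      :", 0.5 * np.log2(1 + v / sigma2), "bits")

The estimate (about 1.16 bits for these values) is finite and independent of any discretization, which is exactly the good behaviour of the mutual information described above.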