Information Theory, Inference, and Learning Algorithms

How could such a channel be used to communicate information? Consider transmitting a set of $N$ real numbers $\{x_n\}_{n=1}^{N}$ in a signal of duration $T$ made up of a weighted combination of orthonormal basis functions $\phi_n(t)$,

$$x(t) = \sum_{n=1}^{N} x_n \phi_n(t), \qquad (11.7)$$

where $\int_0^T \! dt\, \phi_n(t)\phi_m(t) = \delta_{nm}$. The receiver can then compute the scalars:

$$y_n \equiv \int_0^T \! dt\, \phi_n(t)\, y(t) = x_n + \int_0^T \! dt\, \phi_n(t)\, n(t) \qquad (11.8)$$
$$\equiv x_n + n_n \qquad (11.9)$$

for $n = 1 \ldots N$. If there were no noise, then $y_n$ would equal $x_n$. The white Gaussian noise $n(t)$ adds scalar noise $n_n$ to the estimate $y_n$. This noise is Gaussian:

$$n_n \sim \text{Normal}(0, N_0/2), \qquad (11.10)$$

where $N_0$ is the spectral density introduced above. Thus a continuous channel used in this way is equivalent to the Gaussian channel defined at equation (11.5). The power constraint $\int_0^T \! dt\, [x(t)]^2 \le PT$ defines a constraint on the signal amplitudes $x_n$,

$$\sum_n x_n^2 \le PT \quad \Rightarrow \quad \overline{x_n^2} \le \frac{PT}{N}. \qquad (11.11)$$

[Figure 11.1. Three basis functions, and a weighted combination of them, $x(t) = \sum_{n=1}^{N} x_n \phi_n(t)$, with $x_1 = 0.4$, $x_2 = -0.2$, and $x_3 = 0.1$.]

Before returning to the Gaussian channel, we define the bandwidth (measured in Hertz) of the continuous channel to be:

$$W = \frac{N^{\max}}{2T}, \qquad (11.12)$$

where $N^{\max}$ is the maximum number of orthonormal functions that can be produced in an interval of length $T$. This definition can be motivated by imagining creating a band-limited signal of duration $T$ from orthonormal cosine and sine curves of maximum frequency $W$. The number of orthonormal functions is $N^{\max} = 2WT$. This definition relates to the Nyquist sampling theorem: if the highest frequency present in a signal is $W$, then the signal can be fully determined from its values at a series of discrete sample points separated by the Nyquist interval $\Delta t = 1/2W$ seconds.

So the use of a real continuous channel with bandwidth $W$, noise spectral density $N_0$, and power $P$ is equivalent to $N/T = 2W$ uses per second of a Gaussian channel with noise level $\sigma^2 = N_0/2$ and subject to the signal power constraint $\overline{x_n^2} \le P/2W$.

Definition of $E_b/N_0$

Imagine that the Gaussian channel $y_n = x_n + n_n$ is used with an encoding system to transmit binary source bits at a rate of $R$ bits per channel use. How can we compare two encoding systems that have different rates of communication $R$ and that use different powers $\overline{x_n^2}$? Transmitting at a large rate $R$ is good; using small power is good too.

It is conventional to measure the rate-compensated signal-to-noise ratio by the ratio of the power per source bit, $E_b = \overline{x_n^2}/R$, to the noise spectral density $N_0$:

$$E_b/N_0 = \frac{\overline{x_n^2}}{2\sigma^2 R}. \qquad (11.13)$$

$E_b/N_0$ is one of the measures used to compare coding schemes for Gaussian channels. [$E_b/N_0$ is dimensionless, but it is usually reported in units of decibels; the value given is $10 \log_{10} E_b/N_0$.]
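To make equation (11.13) and its decibel convention concrete, here is a minimal sketch; the function name `eb_n0_dB` and its argument names are illustrative, not from the book.

```python
import numpy as np

def eb_n0_dB(x2_bar, sigma2, R):
    """Rate-compensated signal-to-noise ratio of equation (11.13), in dB.

    x2_bar -- average power per channel use, the mean of x_n^2
    sigma2 -- noise variance per channel use, sigma^2 = N0/2
    R      -- rate in source bits per channel use
    """
    eb_n0 = x2_bar / (2.0 * sigma2 * R)   # E_b/N_0, dimensionless
    return 10.0 * np.log10(eb_n0)         # conventionally reported in decibels

# A rate-1/2 code at unit power over noise of variance 0.5:
print(eb_n0_dB(x2_bar=1.0, sigma2=0.5, R=0.5))   # 10 log10(2) = 3.01 dB
```

Note that halving the rate at fixed power doubles the energy spent per source bit, raising $E_b/N_0$ by 3 dB: the measure charges a code for the extra channel uses it consumes, which is what makes codes of different rates comparable.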
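Finally, the claim underlying this whole construction, equations (11.8)-(11.10), that projecting the received waveform onto the orthonormal basis turns the continuous white noise $n(t)$ into scalar Gaussian noise of variance $N_0/2$, can be checked numerically. The following NumPy sketch assumes an orthonormal cosine basis $\phi_n(t) = \sqrt{2/T}\cos(\pi n t/T)$ and the weights of Figure 11.1; it is an illustration, not code from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, N0 = 1.0, 3, 0.2            # duration, basis size, noise spectral density
K = 10_000                        # time-discretisation steps
dt = T / K
t = (np.arange(K) + 0.5) * dt     # midpoint samples of [0, T]

# Orthonormal cosine basis on [0, T]: integral of phi_n phi_m dt = delta_nm.
phi = np.sqrt(2.0 / T) * np.cos(np.pi * np.arange(1, N + 1)[:, None] * t / T)

x = np.array([0.4, -0.2, 0.1])    # the weights of Figure 11.1
x_t = x @ phi                     # x(t) = sum_n x_n phi_n(t), equation (11.7)

# Discretised white noise of spectral density N0/2: variance (N0/2)/dt per sample.
n_t = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=K)
y_t = x_t + n_t                   # received waveform y(t) = x(t) + n(t)

# Receiver projections, equation (11.8): y_n = integral of phi_n(t) y(t) dt.
y = phi @ y_t * dt
print(y)                          # approximately x_n + Normal(0, N0/2)

# Check equation (11.10): over many noise draws, the scalar noise on the
# first basis function has variance close to N0/2 = 0.1.
n_n = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=(1_000, K)) @ phi[0] * dt
print(n_n.var(), N0 / 2)          # should agree to within Monte Carlo error
```

The variance check works because $\mathrm{Var}\big(\sum_k \phi_{n,k}\, n_k\, \Delta t\big) = \frac{N_0}{2\Delta t} \sum_k \phi_{n,k}^2\, \Delta t^2 = \frac{N_0}{2} \int_0^T \! dt\, \phi_n^2(t) = N_0/2$, exactly the scalar noise level of equation (11.10).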
