Information Theory, Inference, and Learning ... - Inference Group
Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981 You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

568   47 — Low-Density Parity-Check Codes

Algorithm 47.16. The Fourier transform over GF(4):

$F_0 = [f_0 + f_1] + [f_A + f_B]$
$F_1 = [f_0 - f_1] + [f_A - f_B]$
$F_A = [f_0 + f_1] - [f_A + f_B]$
$F_B = [f_0 - f_1] - [f_A - f_B]$

The Fourier transform $F$ of a function $f$ over GF(2) is given by $F_0 = f_0 + f_1$, $F_1 = f_0 - f_1$. Transforms over GF($2^k$) can be viewed as a sequence of binary transforms in each of $k$ dimensions. The inverse transform is identical to the Fourier transform, except that we also divide by $2^k$.

[Figure 47.17 plots empirical bit-error probability ($10^{-1}$ down to $10^{-6}$) against signal-to-noise ratio (−0.4 to 0.8 dB) for the curves labelled Irreg GF(8), Irreg GF(2), Reg GF(16), Turbo, Luby, Reg GF(2), and Galileo.]

Figure 47.17. Comparison of regular binary Gallager codes with irregular codes, codes over GF(q), and other outstanding codes of rate 1/4. From left (best performance) to right: Irregular low-density parity-check code over GF(8), blocklength 48 000 bits (Davey, 1999); JPL turbo code (JPL, 1996), blocklength 65 536; Regular low-density parity-check code over GF(16), blocklength 24 448 bits (Davey and MacKay, 1998); Irregular binary low-density parity-check code, blocklength 16 000 bits (Davey, 1999); Luby et al. (1998) irregular binary low-density parity-check code, blocklength 64 000 bits; JPL code for Galileo (in 1992, this was the best known code of rate 1/4); Regular binary low-density parity-check code, blocklength 40 000 bits (MacKay, 1999b). The Shannon limit is at about −0.79 dB.
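As an illustrative sketch (not from the book), the GF($2^k$) transform described in algorithm 47.16 can be computed as $k$ successive binary $(\pm)$ transforms. Mapping the GF(4) symbols $0, 1, A, B$ to array indices 0–3 (their bit patterns 00, 01, 10, 11) is an assumption of this sketch, as is the choice $z_m = 0$ in the check-node convolution it demonstrates at the end; the function names are hypothetical.

```python
def gf2k_fourier(f, k):
    """Fourier transform of a function f on GF(2^k), given as a list of
    length 2**k: one binary (sum/difference) transform in each of the
    k bit dimensions, as in the caption of algorithm 47.16."""
    F = list(f)
    for d in range(k):                      # one binary transform per dimension
        step = 1 << d
        for i in range(0, 1 << k, step << 1):
            for j in range(i, i + step):
                a, b = F[j], F[j + step]
                F[j], F[j + step] = a + b, a - b
    return F

def gf2k_inverse(F, k):
    """Inverse transform: identical to the forward transform,
    except that we also divide by 2**k."""
    return [x / (1 << k) for x in gf2k_fourier(F, k)]

# GF(4) example: f = (f_0, f_1, f_A, f_B).  The result matches the four
# lines of algorithm 47.16, e.g. F[0] = [f_0 + f_1] + [f_A + f_B].
f = [0.5, 0.2, 0.2, 0.1]
F = gf2k_fourier(f, 2)

# Check-node update as a convolution (cf. eq. 47.15, with z_m = 0):
# addition in GF(2^k) is bitwise XOR, so the direct sum over
# {x_1 + x_2 = a} equals the inverse transform of the elementwise
# product of the two Fourier transforms.
q1 = [0.4, 0.3, 0.2, 0.1]
q2 = [0.25, 0.25, 0.3, 0.2]
direct = [sum(q1[x1] * q2[x2]
              for x1 in range(4) for x2 in range(4) if x1 ^ x2 == a)
          for a in range(4)]
via_fourier = gf2k_inverse(
    [u * v for u, v in zip(gf2k_fourier(q1, 2), gf2k_fourier(q2, 2))], 2)
assert all(abs(d - v) < 1e-9 for d, v in zip(direct, via_fourier))
```

The direct sum costs $O(q^2)$ per pair of incoming messages, while the transform route costs $O(q \log q)$, which is the scaling quoted in the text.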
As of 2003, even better sparse-graph codes have been constructed.

GF(8), and GF(16) perform nearly one decibel better than comparable binary Gallager codes.

The computational cost for decoding in GF($q$) scales as $q \log q$, if the appropriate Fourier transform is used in the check nodes: the update rule for the check-to-variable message,

$$r^{a}_{mn} = \sum_{\mathbf{x}:\, x_n = a} \Bigl[\, \sum_{n' \in N(m)} H_{mn'} x_{n'} = z_m \Bigr] \prod_{j \in N(m) \setminus n} q^{x_j}_{mj}, \qquad (47.15)$$

is a convolution of the quantities $q^{a}_{mj}$, so the summation can be replaced by a product of the Fourier transforms of $q^{a}_{mj}$ for $j \in N(m) \setminus n$, followed by an inverse Fourier transform. The Fourier transform for GF(4) is shown in algorithm 47.16.

Make the graph irregular

The second way of improving Gallager codes, introduced by Luby et al. (2001b), is to make their graphs irregular. Instead of giving all variable nodes the same degree $j$, we can have some variable nodes with degree 2, some 3, some 4, and a few with degree 20. Check nodes can also be given unequal degrees – this helps improve performance on erasure channels, but it turns out that for the Gaussian channel, the best graphs have regular check degrees.

Figure 47.17 illustrates the benefits offered by these two methods for improving Gallager codes, focussing on codes of rate 1/4. Making the binary code irregular gives a win of about 0.4 dB; switching from GF(2) to GF(16) gives
