
Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981
You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

49 — Repeat–Accumulate Codes

[Figure 49.3, panels (a), (ii.b), (ii.c): (a) block error probability (total, detected, undetected) versus Eb/N0 (dB); (ii.b) histogram of decoding times at Eb/N0 = 0.749 dB; (ii.c) histogram of decoding times at Eb/N0 = 0.846 dB.]

49.4 Empirical distribution of decoding times<br />

[Figure 49.3, panels (iii.b), (iii.c): the same histograms on log–log axes, with power-law fits.]

It is interesting to study the number of iterations τ of the sum–product algorithm required to decode a sparse-graph code. Given one code and a set of channel conditions, the decoding time varies randomly from trial to trial. We find that the histogram of decoding times follows a power law, P(τ) ∝ τ^(−p), for large τ. The power p depends on the signal-to-noise ratio and becomes smaller (so that the distribution is more heavy-tailed) as the signal-to-noise ratio decreases. We have observed power laws in repeat–accumulate codes and in irregular and regular Gallager codes. Figures 49.3(ii) and (iii) show the distribution of decoding times of a repeat–accumulate code at two different signal-to-noise ratios. The power laws extend over several orders of magnitude.
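As a sketch of how such a power-law exponent might be measured from a batch of decoding times, the following Python fragment draws synthetic "decoding times" from a Pareto-type law P(τ) ∝ τ^(−p) and recovers p by maximum likelihood (the Hill estimator). The sampler and the helper names (sample_decoding_times, fit_power_law) are illustrative assumptions, not code from the book; real data would come from running the sum–product decoder many times.

```python
import math
import random

def sample_decoding_times(n, p, tau_min=10.0, seed=0):
    # Draw n synthetic "decoding times" from a continuous power law
    # P(tau) ∝ tau^(-p) for tau >= tau_min, via inverse-transform
    # sampling of a Pareto distribution with shape alpha = p - 1.
    rng = random.Random(seed)
    alpha = p - 1.0
    return [tau_min * rng.random() ** (-1.0 / alpha) for _ in range(n)]

def fit_power_law(times, tau_min=10.0):
    # Maximum-likelihood (Hill) estimate of the exponent p in
    # P(tau) ∝ tau^(-p), using only the tail tau >= tau_min.
    tail = [t for t in times if t >= tau_min]
    alpha_hat = len(tail) / sum(math.log(t / tau_min) for t in tail)
    return alpha_hat + 1.0

# With p = 6 (the exponent fitted in figure 49.3(iii.b)), the
# estimate should land close to 6 for a large sample.
times = sample_decoding_times(100000, p=6.0)
p_hat = fit_power_law(times)
```

Plotting the histogram of `times` on log–log axes, as in figure 49.3(iii), would show the straight line characteristic of a power law.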

Exercise 49.1. [5] Investigate these power laws. Does density evolution predict them? Can the design of a code be used to manipulate the power law in a useful way?

49.5 Generalized parity-check matrices

I find that it is helpful when relating sparse-graph codes to each other to use a common representation for them all. Forney (2001) introduced the idea of a normal graph, in which the only nodes are equality constraints (=) and parity checks (+), and all variable nodes have degree one or two; variable nodes with degree two can be represented on edges that connect a + node to an = node. The generalized parity-check matrix is a graphical way of representing normal graphs. In a parity-check matrix, the columns are transmitted bits, and the rows are linear constraints. In a generalized parity-check matrix, additional columns may be included, which represent state variables that are not transmitted. One way of thinking of these state variables is that they are punctured from the code before transmission. State variables are indicated by a horizontal line above the corresponding columns. The other pieces of diagrammatic notation for generalized parity-

Figure 49.3. Histograms of the number of iterations to find a valid decoding for a repeat–accumulate code with source block length K = 10 000 and transmitted block length N = 30 000. (a) Block error probability versus signal-to-noise ratio for the RA code. (ii.b) Histogram for x/σ = 0.89, Eb/N0 = 0.749 dB. (ii.c) x/σ = 0.90, Eb/N0 = 0.846 dB. (iii.b, iii.c) Fits of power laws to (ii.b) (1/τ^6) and (ii.c) (1/τ^9).
