number generators, seeded by the clock, to choose each random degree and each set of connections. Alternatively, the sender could pick a random key, κ_n, given which the degree and the connections are determined by a pseudo-random process, and send that key in the header of the packet. As long as the packet size l is much bigger than the key size (which need only be 32 bits or so), this key introduces only a small overhead cost.

50.2 The decoder

Decoding a sparse-graph code is especially easy in the case of an erasure channel. The decoder's task is to recover s from t = Gs, where G is the matrix associated with the graph. The simple way to attempt to solve this problem is by message-passing. We can think of the decoding algorithm as the sum-product algorithm if we wish, but all messages are either completely uncertain messages or completely certain messages. Uncertain messages assert that a message packet s_k could have any value, with equal probability; certain messages assert that s_k has a particular value, with probability one.

This simplicity of the messages allows a simple description of the decoding process. We'll call the encoded packets {t_n} check nodes.

1. Find a check node t_n that is connected to only one source packet s_k. (If there is no such check node, this decoding algorithm halts at this point, and fails to recover all the source packets.)

   (a) Set s_k = t_n.

   (b) Add s_k to all checks t_{n'} that are connected to s_k:

       t_{n'} := t_{n'} + s_k   for all n' such that G_{n'k} = 1.     (50.1)

   (c) Remove all the edges connected to the source packet s_k.

2. Repeat (1) until all {s_k} are determined.

This decoding process is illustrated in figure 50.1 for a toy case where each packet is just one bit. There are three source packets (shown by the upper circles) and four received packets (shown by the lower check symbols), which have the values t_1 t_2 t_3 t_4 = 1011 at the start of the algorithm.

At the first iteration, the only check node that is connected to a sole source bit is the first check node (panel a). We set that source bit s_1 accordingly (panel b), discard the check node, then add the value of s_1 (1) to the checks to which it is connected (panel c), disconnecting s_1 from the graph. At the start of the second iteration (panel c), the fourth check node is connected to a sole source bit, s_2. We set s_2 to t_4 (0, in panel d), and add s_2 to the two checks it is connected to (panel e). Finally, we find that two check nodes are both connected to s_3, and they agree about the value of s_3 (as we would hope!), which is restored in panel f.

[Figure 50.1. Example decoding for a digital fountain code with K = 3 source bits and N = 4 encoded bits; panels (a)-(f) show the graph at successive steps of the algorithm.]

50.3 Designing the degree distribution

The probability distribution ρ(d) of the degree is a critical part of the design: occasional encoded packets must have high degree (i.e., d similar to K) in order to ensure that there are not some source packets that are connected to no-one. Many packets must have low degree, so that the decoding process can get started, and keep going, and so that the total number of addition operations involved in the encoding and decoding is kept small.
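To make the key-based encoding of section 50.1 concrete, here is a minimal sketch in Python (an illustration, not code from the book; the function names, the use of Python's random module as the pseudo-random process, and the placeholder degree distribution rho are all assumptions of this sketch). Sender and receiver run the same routine on the 32-bit key κ_n carried in the packet header, so the key alone determines the packet's degree and connections.

```python
import random

def connections_from_key(key, K, rho):
    """Derive a packet's degree and connections from its 32-bit key.

    Encoder and decoder call this with the same key kappa_n, so only the
    key (plus the payload) needs to be sent.  rho is the degree
    distribution rho(d), given as probabilities for d = 1, ..., K; its
    design is the subject of section 50.3.
    """
    rng = random.Random(key)                # pseudo-random process seeded by kappa_n
    d = rng.choices(range(1, K + 1), weights=rho)[0]
    return rng.sample(range(K), d)          # d distinct source-packet indices

def encode_packet(source, rho):
    """Produce one encoded packet (kappa_n, t_n) from the K source packets.

    Each source packet is a bytes object of length l; t_n is the bitwise
    XOR (modulo-2 sum) of the chosen source packets.
    """
    K, l = len(source), len(source[0])
    key = random.getrandbits(32)            # kappa_n, sent in the packet header
    t = bytearray(l)
    for k in connections_from_key(key, K, rho):
        for i in range(l):
            t[i] ^= source[k][i]
    return key, bytes(t)
```

On receipt of (κ_n, t_n), the receiver calls connections_from_key with the same key to rebuild that packet's edges in the graph G before decoding.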

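The decoding procedure of section 50.2 is equally direct to write down. The sketch below (again an illustration with invented names, not the book's code) handles the bit-level toy case of figure 50.1: each received check is a value together with the set of source indices it is connected to (indices counted from 0), and each pass finds a degree-one check, sets the corresponding source bit as in step (a), applies the update of equation (50.1) as in step (b), and removes the used edges as in step (c).

```python
def decode_fountain(received, K):
    """Erasure decoder for a digital fountain code (sketch of section 50.2).

    received: list of (t_n, connections) pairs, one per received packet,
              where t_n is the modulo-2 sum of the connected source bits and
              connections is the set of source indices k with G_{n,k} = 1.
    K:        number of source packets.

    Returns the recovered source bits, or None if the algorithm halts with
    some source packets still undetermined.
    """
    # Mutable working copies: each check keeps its current value and its
    # remaining edges to not-yet-determined source packets.
    checks = [[t, set(conns)] for t, conns in received]
    s = [None] * K                       # every source packet starts out 'uncertain'

    while any(bit is None for bit in s):
        # 1. Find a check node connected to only one source packet s_k.
        degree_one = [c for c in checks if len(c[1]) == 1]
        if not degree_one:
            return None                  # no such check node: decoding fails
        value, conns = degree_one[0]
        k = next(iter(conns))
        s[k] = value                     # (a) set s_k = t_n
        for c in checks:
            if k in c[1]:
                c[0] ^= value            # (b) add s_k to connected checks, eq. (50.1)
                c[1].remove(k)           # (c) remove the edges connected to s_k
    return s
```

For instance, on a graph chosen to be consistent with the walkthrough above (though not necessarily the exact graph of figure 50.1), say t_1 = s_1, t_2 = s_2 + s_3, t_3 = s_1 + s_2 + s_3, t_4 = s_1 + s_2 with received values t_1 t_2 t_3 t_4 = 1011, the call decode_fountain([(1, {0}), (0, {1, 2}), (1, {0, 1, 2}), (1, {0, 1})], 3) recovers s_1 s_2 s_3 = 100 in the same order as the text: s_1 from the first check, s_2 from the fourth, and s_3 from the two remaining checks, which agree. For real packets rather than single bits, the values would be byte strings and the addition a bytewise XOR, as in the encoder sketch above.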