[Figure 48.6. The trellis for a k = 4 code painted with the likelihood function when the received vector is equal to a codeword with just one bit flipped. There are three line styles, depending on the value of the likelihood: thick solid lines show the edges in the trellis that match the corresponding two bits of the received string exactly; thick dotted lines show edges that match one bit but mismatch the other; and thin dotted lines show the edges that mismatch both bits.]

In general a linear-feedback shift-register with k bits of memory has an impulse response that is periodic with a period that is at most 2^k − 1, corresponding to the filter visiting every non-zero state in its state space.

Incidentally, cheap pseudorandom number generators and cheap cryptographic products make use of exactly these periodic sequences, though with larger values of k than 7; the random number seed or cryptographic key selects the initial state of the memory. There is thus a close connection between certain cryptanalysis problems and the decoding of convolutional codes.

48.3 Decoding convolutional codes

The receiver receives a bit stream, and wishes to infer the state sequence and thence the source stream. The posterior probability of each bit can be found by the sum–product algorithm (also known as the forward–backward or BCJR algorithm), which was introduced in section 25.3. The most probable state sequence can be found using the min–sum algorithm of section 25.3 (also known as the Viterbi algorithm). The nature of this task is illustrated in figure 48.6, which shows the cost associated with each edge in the trellis for the case of a sixteen-state code; the channel is assumed to be a binary symmetric channel and the received vector is equal to a codeword except that one bit has been flipped. There are three line styles, depending on the value of the likelihood: thick solid lines show the edges in the trellis that match the corresponding two bits of the received string exactly; thick dotted lines show edges that match one bit but mismatch the other; and thin dotted lines show the edges that mismatch both bits. The min–sum algorithm seeks the path through the trellis that uses as many solid lines as possible; more precisely, it minimizes the cost of the path, where the cost is zero for a solid line, one for a thick dotted line, and two for a thin dotted line.

⊲ Exercise 48.2.[1, p.581] Can you spot the most probable path and the flipped bit?
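As a concrete check of the periodicity claim made above about linear-feedback shift-registers, here is a minimal sketch (not from the book) of a Fibonacci-style register in Python. The function name lfsr_period and the tap positions are illustrative choices; with taps corresponding to a primitive degree-k recurrence the register visits every non-zero state before repeating, while a poor choice of taps gives a shorter period, which is why 2^k − 1 is only an upper bound.

# Minimal sketch (illustrative, not the book's code): a Fibonacci-style
# linear-feedback shift register with k bits of memory. With taps chosen
# from a primitive feedback polynomial it visits all 2^k - 1 non-zero
# states, so its output sequence has the maximal period 2^k - 1.

def lfsr_period(k, taps, seed=1):
    """Count how many shifts the register takes to return to its seed state.

    `taps` lists the (0-based) bit positions XORed to form the feedback bit.
    """
    state = seed
    mask = (1 << k) - 1
    steps = 0
    while True:
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        # shift left, drop the top bit, feed the new bit in at position 0
        state = ((state << 1) | feedback) & mask
        steps += 1
        if state == seed:
            return steps

# Example: k = 4 with taps (3, 2) implements a primitive degree-4 recurrence,
# giving the maximal period 2^4 - 1 = 15.
print(lfsr_period(4, taps=(3, 2)))   # -> 15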

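To make the min–sum decoding concrete, the following is a minimal Python sketch, not the book's implementation: it decodes a rate-1/2 code over a binary symmetric channel by charging each trellis edge the number of received bits it mismatches (0, 1 or 2, the three line styles of figure 48.6) and keeping, for each state, the cheapest path found so far. The table-driven interface (next_state, output_pair) and the assumption that the encoder starts in the all-zero state are illustrative choices, not taken from the text.

# Minimal sketch of min-sum (Viterbi) decoding on a trellis, assuming a
# rate-1/2 feed-forward convolutional code and a binary symmetric channel.
# The cost of an edge is the number of received bits it mismatches
# (0, 1 or 2), matching the three line styles described for figure 48.6.

def viterbi_decode(received_pairs, next_state, output_pair, num_states):
    """received_pairs: one (bit, bit) tuple per trellis section.
    next_state[s][b]:  successor state when source bit b enters state s.
    output_pair[s][b]: the two transmitted bits on that edge.
    Returns the minimum-cost (most probable) source bit sequence."""
    INF = float("inf")
    cost = [0] + [INF] * (num_states - 1)   # encoder starts in the all-zero state
    history = []                            # per section: best (prev state, bit) into each state

    for r0, r1 in received_pairs:
        new_cost = [INF] * num_states
        pointers = [None] * num_states
        for s in range(num_states):
            if cost[s] == INF:
                continue                    # state not yet reachable
            for b in (0, 1):
                s_next = next_state[s][b]
                t0, t1 = output_pair[s][b]
                c = cost[s] + (t0 != r0) + (t1 != r1)   # Hamming cost of this edge
                if c < new_cost[s_next]:
                    new_cost[s_next] = c
                    pointers[s_next] = (s, b)
        cost = new_cost
        history.append(pointers)

    # Trace back from the cheapest final state to recover the source bits.
    s = min(range(num_states), key=lambda i: cost[i])
    source_bits = []
    for pointers in reversed(history):
        s, b = pointers[s]
        source_bits.append(b)
    source_bits.reverse()
    return source_bits

For a sixteen-state code such as the one in figure 48.6 the two tables would each have sixteen rows; re-encoding the returned source bits and comparing with the received vector then exposes the single flipped bit.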