Lecture notes on turbo codes.

Concatenated Coding

• Parallel Concatenated Convolutional Codes (PCCC), or "Turbo" codes.
First paper: "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo-Codes (1)" by Claude Berrou, Alain Glavieux and Punya Thitimajshima, Ecole Nationale Superieure des Telecommunications (ENST) de Bretagne, France; International Communications Conference (ICC), Geneva, Switzerland, May 1993, pp. 1064-1070.
• Turbo codes use systematic codes – nonsystematic ones have equivalent forms.
• Systematic codes with feedforward encoders produce less powerful codes. Convolutional codes in their systematic feedback form are equivalent to the nonsystematic form in distance and nearest-neighbor path properties. However, at low signal-to-noise ratio (SNR), the bit error rate (BER) of the feedback form is slightly better.

[Figure: feedback (FB-CC) and feedforward (FF-CC) convolutional encoders for R = 1/2, v = 2; input u_k, outputs x_k,1 and x_k,2, with x_k,1 = u_k in the systematic case.]
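As a concrete sketch of the systematic feedback form, here is a minimal rate-1/2 recursive systematic convolutional (RSC) encoder. The taps used (feedback 1+D+D², feedforward 1+D²) are an illustrative memory-2 choice, not necessarily the exact code of the figure.

```python
def rsc_encode(bits, m=2, fb=(1, 1), ff=(0, 1)):
    """Rate-1/2 recursive systematic convolutional (RSC) encoder sketch.

    State s = [s1, ..., sm].  Register input a = u XOR (feedback taps of s);
    parity c = a XOR (feedforward taps of s); the systematic output is u
    itself.  With the default taps this is the feedback polynomial
    1 + D + D^2 and feedforward polynomial 1 + D^2 (illustrative example).
    Returns a list of (systematic, parity) bit pairs.
    """
    s = [0] * m
    out = []
    for u in bits:
        a = u
        for t, si in zip(fb, s):      # feedback: a = u XOR sum(fb_i * s_i)
            a ^= t & si
        c = a
        for t, si in zip(ff, s):      # parity: c = a XOR sum(ff_i * s_i)
            c ^= t & si
        out.append((u, c))
        s = [a] + s[:-1]              # shift the register
    return out
```

Feeding in a single 1 followed by zeros shows the recursive (infinite impulse response) behaviour: the parity stream keeps toggling instead of dying out after m steps, which is exactly what distinguishes the feedback form.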

[Figure: turbo encoder – input d_k, delay line (L_1) and interleaving; systematic output X_k; parity output Y_k taken from Y_1k of C_1 and Y_2k of C_2, where C_1 and C_2 are each a (2, 1, 2) recursive systematic convolutional code.]

We can now decode the information sequence, or move to the second decoder, which will decode the same information sequence but in interleaved form. Now we'll make use of the L_e(i) generated by the first decoder. This can be given as a priori information to the second decoder (L_e(i) becomes L_I2(i) after interleaving). So effectively we interleave L_sys(i) and L_e(i) and feed them to the second decoder. p_ap(i_t = −1) and p_ap(i_t = +1) can be used, associated with γ, in the calculation of α and β for the second decoder. Thus the output of the second decoder will be of the form

L_app(i) = L_sys2(i) + L_I2(i) + L_e2(i)

This finishes the first (initial) full decoding step – step 0. L_e2(i) is the extrinsic information generated by the second decoder. Now, in a similar manner as before, we can feed L_e2(i) back to the first decoder (in γ, to calculate α and β), after deinterleaving, as new a priori information L_I1(i).

Decoding step 1
First decoder: L_app(i) = L_sys(i) + L_I1(i) + L_e1(i)
Second decoder (after interleaving L_e1(i)): L_app(i) = L_sys2(i) + L_I2(i) + L_e2(i)

Then from decoder two, feed back L_e2(i) as before to the first decoder, after deinterleaving – and the process repeats. This is known as iterative decoding or turbo decoding.

• Remember: it is always the extrinsic information that we pass to the next decoder as new a priori information. We don't feed back a priori or intrinsic information. Why? Because that was already generated by the earlier decoder, so there is no point in feeding back the same information.
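The iteration above can be sketched as the following loop. The `siso` argument is a stand-in for a real constituent MAP/BCJR decoder (the toy function in the test is not one); the point of the sketch is the structure: only extrinsic LLRs are exchanged, interleaved on the way to decoder two and deinterleaved on the way back.

```python
def turbo_decode(L_sys, L_par1, L_par2, perm, siso, n_iter=8):
    """Skeleton of the iterative (turbo) decoding loop described above.

    siso(systematic_llrs, parity_llrs, apriori_llrs) -> extrinsic_llrs
    is a placeholder for a constituent soft-in/soft-out decoder.
    perm is the interleaver permutation; L_par2 is assumed to be in
    interleaved order, matching the encoder structure.
    """
    n = len(L_sys)
    inv = [0] * n
    for i, p in enumerate(perm):
        inv[p] = i                              # inverse permutation
    interleave = lambda x: [x[p] for p in perm]
    deinterleave = lambda x: [x[i] for i in inv]

    L_I1 = [0.0] * n                            # a priori for decoder 1 starts at zero
    for _ in range(n_iter):
        L_e1 = siso(L_sys, L_par1, L_I1)                          # first decoder
        L_e2 = siso(interleave(L_sys), L_par2, interleave(L_e1))  # second decoder
        L_I1 = deinterleave(L_e2)               # feed back only the extrinsic part

    # a posteriori LLR of the first decoder: L_app = L_sys + L_I1 + L_e1
    L_e1 = siso(L_sys, L_par1, L_I1)
    L_app = [s + a + e for s, a, e in zip(L_sys, L_I1, L_e1)]
    return [1 if l > 0 else 0 for l in L_app]
```

Note that the decision is taken on the full a posteriori LLR (systematic + a priori + extrinsic), while the inter-decoder exchange carries only the extrinsic term, as stressed in the bullet above.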

[Figure: iterative turbo decoder – inputs x_k and y_k; 16-state decoder DEC1, interleaving, 16-state decoder DEC2; deinterleaving in the feedback path and before the output; DEMUX/INSERTION; decoded output d̂_k.]

The p_I(i_t = ±1) needed to calculate γ can also be obtained from L_I(i), which in turn is obtained from L_e(i) of the earlier decoding stage.

Reference: "Iterative Decoding of Binary Block and Convolutional Codes" by Joachim Hagenauer, Elke Offer, and Lutz Papke, IEEE Trans. Inform. Theory, Vol. 42, No. 2, March 1996, pp. 429-445.

Text book chapters:
Bossert: Chapter 8 on Convolutional Codes and Appendix A on SCC, PCC
Fundamentals of Convolutional Coding: Chapter 7 – Iterative Decoding

PCCC

Reference: "A Soft-Input Soft-Output Maximum A Posteriori (MAP) Module to Decode Parallel and Serial Concatenated Codes", S. Benedetto, D. Divsalar, G. Montorsi, and F. Pollara, TDA Progress Report 42-127, November 1996 (from the JPL web site).

Block diagram, encoder:
[Figure: ENCODER 1 (rate 1/2), systematic and parity outputs to the channel; interleaver π; ENCODER 2 (rate 1/2), whose systematic output is not transmitted, parity to the channel.]

Decoder:
[Figure: SISO modules 1 and 2. Each receives π(c;I) from the demodulator; π(c;O) is not used. The π(u;O) of SISO 1 is interleaved (π) into π(u;I) of SISO 2, and π(u;O) of SISO 2 is deinterleaved (π⁻¹) back into π(u;I) of SISO 1; the decision is taken after deinterleaving.]

π(.) represents log probabilities, i.e., we work in the log domain.
SISO – soft-input, soft-output module: implements the MAP algorithm.

The general trellis encoder: input u → TRELLIS ENCODER → output c.

SCCC – Serial Concatenated Convolutional Codes

Encoder:
[Figure: OUTER ENCODER (rate 1/2) → interleaver π → INNER ENCODER (rate 2/3) → to channel.]

Decoder:
[Figure: π(c;I) from the demodulator enters the INNER SISO (its π(c;O) is not used); the inner π(u;O), after deinterleaving π⁻¹, becomes π(c;I) of the OUTER SISO; the outer π(c;O), after interleaving, is fed back as π(u;I) of the inner SISO; the outer a priori input π(u;I) is set to 0; the DECISION is taken on the outer π(u;O).]

Differences between PCCC and SCCC:
PCCC: the updated (extrinsic) probabilities of the code symbols are never used by the decoding algorithm.
SCCC: both the updated (extrinsic) probabilities of the input symbols and those of the code symbols are used in the decoding algorithm.

The decoding algorithm is the additive SISO algorithm (A-SISO), working in the log domain.

[Figure: SISO module with inputs π(c;I), π(u;I) and outputs π(c;O), π(u;O); an edge e of a trellis section, with starting state s^S(e), ending state s^E(e), input symbol u(e) and output symbol c(e).]

The following functions are associated with each edge e:
• the starting state s^S(e)
• the ending state s^E(e)
• the input symbol u(e)
• the output symbol c(e)

The relationship between these functions depends on the particular encoder. As an example, in the case of systematic encoders, (s^E(e), c(e)) also identifies the edge, since u(e) is uniquely determined by c(e). Here it is only assumed that the pair (s^S(e), u(e)) uniquely identifies the ending state s^E(e) – this assumption is always verified, as it is equivalent to saying that, given the initial trellis state, there is a one-to-one correspondence between input sequences and state sequences.

The Additive SISO Algorithm (A-SISO)

α_k(s) = log Σ_{e: s^E(e)=s} exp{ α_{k−1}[s^S(e)] + π_k[u(e);I] + π_k[c(e);I] },   k = 1, 2, ..., n

β_k(s) = log Σ_{e: s^S(e)=s} exp{ β_{k+1}[s^E(e)] + π_{k+1}[u(e);I] + π_{k+1}[c(e);I] },   k = n−1, ..., 0

These are forward and backward recursions. At time k, the output (extrinsic) probability distributions are computed as (approximately)

π_k(c;O) = log Σ_{e: c(e)=c} exp{ α_{k−1}[s^S(e)] + π_k[u(e);I] + β_k[s^E(e)] }

π_k(u;O) = log Σ_{e: u(e)=u} exp{ α_{k−1}[s^S(e)] + π_k[c(e);I] + β_k[s^E(e)] }

with initial values α_0(s) = 0 for s = S_0, otherwise α_0(s) = −∞,

and similarly β_n(s) = 0 for s = S_n, otherwise β_n(s) = −∞.

In the max-log approximation we replace the log-sum-exp by the maximum. Thus we have the following set of equations:

α_k(s) = max_{e: s^E(e)=s} { α_{k−1}[s^S(e)] + π_k[u(e);I] + π_k[c(e);I] },   k = 1, 2, ..., n

β_k(s) = max_{e: s^S(e)=s} { β_{k+1}[s^E(e)] + π_{k+1}[u(e);I] + π_{k+1}[c(e);I] },   k = n−1, ..., 0

π_k(c;O) = max_{e: c(e)=c} { α_{k−1}[s^S(e)] + π_k[u(e);I] + β_k[s^E(e)] }

π_k(u;O) = max_{e: u(e)=u} { α_{k−1}[s^S(e)] + π_k[c(e);I] + β_k[s^E(e)] }

Generally for serial concatenated codes:
OUTER CODE – nonrecursive (non-feedback), nonsystematic
INNER CODE – recursive systematic (feedback, systematic)

It is seen that the performance of SCCC is usually better than that of PCCC.

[Figure: BER versus E_b/N_0 for SCCC and PCCC, showing the error floor effect in PCCC.]
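The exact combine and its max-log replacement, plus the forward recursion, can be sketched as follows. The edge-list trellis and the metrics in the test are toy examples (a time-invariant 2-state section), and only the α recursion is shown; the β recursion and the outputs π_k(c;O), π_k(u;O) follow the same max-over-edges pattern.

```python
import math

NEG_INF = float("-inf")

def logsumexp(xs):
    """Exact A-SISO combine: log(sum(exp(x) for x in xs)), stabilized."""
    m = max(xs)
    if m == NEG_INF:
        return NEG_INF
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_maxlog(edges, n_states, pi_u, pi_c, n, s0=0):
    """Max-log forward recursion alpha_k(s) over an edge-list trellis.

    edges: list of (sS, sE, u, c) tuples describing one trellis section,
    assumed time-invariant here for simplicity.
    pi_u[k][u], pi_c[k][c]: input metrics pi_k[u;I] and pi_k[c;I].
    Returns alpha[k][s] for k = 0..n, with alpha_0(s) = 0 for s = s0
    and -inf otherwise, as in the initial conditions above.
    """
    alpha = [[NEG_INF] * n_states for _ in range(n + 1)]
    alpha[0][s0] = 0.0
    for k in range(1, n + 1):
        for sS, sE, u, c in edges:
            cand = alpha[k - 1][sS] + pi_u[k - 1][u] + pi_c[k - 1][c]
            if cand > alpha[k][sE]:       # max over edges ending in sE
                alpha[k][sE] = cand
    return alpha
```

Since exp of the largest metric dominates the sum, logsumexp(xs) lies between max(xs) and max(xs) + log(len(xs)), which is why the max-log substitution costs only a small loss.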

The summations involved in the algorithm are calculated over trellis edges rather than over pairs of states. This makes the algorithm general and capable of dealing with parallel edges – suitable for TCM.

The A-SISO algorithm at bit level

Consider a rate 1/2 convolutional encoder with input bit U_k and output bits C_1,k and C_2,k at time k, taking values {0,1}. Therefore on the trellis edges at time k we have u_k(e), c_1,k(e), c_2,k(e). Drop k for simplicity.

Define the reliability (log-likelihood ratio, LLR) of a bit Z taking values {0,1} at time k as

λ_k[Z;.] ≡ log( P_k[Z=1;.] / P_k[Z=0;.] ) = π_k[Z=1;.] − π_k[Z=0;.]

α_k(s) = max_{e: s^E(e)=s} { α_{k−1}[s^S(e)] + u(e)λ_k[U;I] + c_1(e)λ_k[C_1;I] + c_2(e)λ_k[C_2;I] } + h_αk

β_k(s) = max_{e: s^S(e)=s} { β_{k+1}[s^E(e)] + u(e)λ_{k+1}[U;I] + c_1(e)λ_{k+1}[C_1;I] + c_2(e)λ_{k+1}[C_2;I] } + h_βk

with initial values α_0(s) = 0 if s = S_0 and α_0(s) = −∞ otherwise, and β_n(s) = 0 if s = S_n and β_n(s) = −∞ otherwise. h_αk and h_βk are normalization constants.

For the inner decoder, which is connected to the AWGN channel, we have

λ_k[C_1;I] = (2A/σ²) r_1,k   and   λ_k[C_2;I] = (2A/σ²) r_2,k,

where

r_i,k = A(2c_i,k − 1) + n_i,k,   i = 1, 2

are the received samples at the output of the matched filter, with c_i,k ∈ {0,1} so that the transmitted symbol 2c_i,k − 1 ∈ {−1, +1}, and the n_i,k are zero-mean independent identically distributed (i.i.d.) Gaussian noise samples with variance σ².
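The channel-LLR relation above is a one-liner; the following sketch also forms a received sample r = A(2c − 1) + n for completeness. Amplitude and noise values in the test are illustrative, not taken from the notes.

```python
import random

def channel_llr(r, A=1.0, sigma2=0.5):
    """Input LLR of one matched-filter sample: lambda[C;I] = (2A/sigma^2) * r."""
    return (2.0 * A / sigma2) * r

def received_sample(c, A=1.0, sigma=0.7071):
    """r = A(2c - 1) + n, with bit c in {0,1} mapped to the BPSK symbol
    2c - 1 in {-1,+1} and n a zero-mean Gaussian sample, variance sigma^2."""
    return A * (2 * c - 1) + random.gauss(0.0, sigma)
```

Note the scaling: a more reliable channel (smaller σ²) stretches the same sample into a larger-magnitude LLR, which is exactly how channel quality enters the inner decoder's metrics.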

The extrinsic bit information for U, C_1 and C_2 can be obtained as

λ_k(U;O) = max_{e: u(e)=1} { α_{k−1}[s^S(e)] + c_1(e)λ_k[C_1;I] + c_2(e)λ_k[C_2;I] + β_k[s^E(e)] }
         − max_{e: u(e)=0} { α_{k−1}[s^S(e)] + c_1(e)λ_k[C_1;I] + c_2(e)λ_k[C_2;I] + β_k[s^E(e)] }

λ_k(C_1;O) = max_{e: c_1(e)=1} { α_{k−1}[s^S(e)] + u(e)λ_k[U;I] + c_2(e)λ_k[C_2;I] + β_k[s^E(e)] }
           − max_{e: c_1(e)=0} { α_{k−1}[s^S(e)] + u(e)λ_k[U;I] + c_2(e)λ_k[C_2;I] + β_k[s^E(e)] }

λ_k(C_2;O) = max_{e: c_2(e)=1} { α_{k−1}[s^S(e)] + u(e)λ_k[U;I] + c_1(e)λ_k[C_1;I] + β_k[s^E(e)] }
           − max_{e: c_2(e)=0} { α_{k−1}[s^S(e)] + u(e)λ_k[U;I] + c_1(e)λ_k[C_1;I] + β_k[s^E(e)] }

Parameters for PCCC

In general, for a convolutional code (CC), w_min is defined as the minimum information weight over the error events of the CC:
w_min = 1 for nonrecursive codes
w_min = 2 for recursive codes.
The error coefficient (in p_b(error)), i.e. the interleaving gain, for a PCCC with large interleaver length goes as N^(1−w_min), where N is the interleaver length.

Thus for recursive CCs the interleaving gain (or BER reduction) goes as 1/N. On the other hand, all nonrecursive CCs and block codes have w_min = 1, so such codes are not useful in parallel concatenated codes.

The next most important constituent-code parameter is z_min, the minimum parity-check weight among the code sequences with w = 2. For a large range of SNR, the behaviour of PCCC is determined by the "effective free distance"

d_free,eff = 2 + 2 z_min.

It is possible to achieve z_min = (n−1)(2^(v−1) + 2) with a rate 1/n recursive CC with memory v.

PCCCs exhibit an error floor effect (defined as the change of slope of the BER curve for turbo codes) at lower BER. Initially the coding gain increases with the number of iterations; however, after a number of iterations (10-12 in AWGN) the performance improvement is marginal.

Extensions to consider:
1. Turbo codes with multilevel modulation
2. Turbo TCM
3. Iterative equalization and decoding – turbo equalization.
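The design-parameter formulas above are easy to tabulate; a small sketch, directly transcribing z_min = (n−1)(2^(v−1) + 2), d_free,eff = 2 + 2 z_min, and the N^(1−w_min) interleaving-gain exponent:

```python
def z_min_bound(n, v):
    """Achievable z_min for a rate-1/n recursive CC with memory v."""
    return (n - 1) * (2 ** (v - 1) + 2)

def d_free_eff(n, v):
    """Effective free distance: d_free,eff = 2 + 2 * z_min."""
    return 2 + 2 * z_min_bound(n, v)

def interleaving_gain_exponent(recursive):
    """Exponent of N in the error coefficient, N**(1 - w_min):
    w_min = 2 for recursive codes, w_min = 1 for nonrecursive ones."""
    w_min = 2 if recursive else 1
    return 1 - w_min
```

For example, a rate-1/2 recursive constituent code with memory v = 4 (as in the original 16-state turbo code) gives z_min = 10 and d_free,eff = 22, while any nonrecursive constituent code gets exponent 0, i.e. no interleaving gain at all.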