# Hybrid LDPC codes and iterative decoding methods - i3s

## 1.4 Decoding of LDPC codes by Belief Propagation algorithm

decoding. The former will be studied in the last chapter, while the latter is the decoding algorithm that we use, unless otherwise specified.

A priori probabilities on the value of each codeword symbol are first computed from the channel outputs. For non-binary LDPC codes, these probabilities correspond to the probability that the symbol equals each element of $\{\alpha_0, \ldots, \alpha_{q-1}\}$.
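As a concrete illustration (a sketch under assumed channel parameters, not a construction from the text), the a priori probability vector of one GF(q) symbol can be obtained from the channel outputs of its binary image, here for BPSK over an AWGN channel with a hypothetical bit-to-symbol mapping:

```python
import numpy as np

# Hypothetical sketch: a priori probability vector of one GF(4) symbol,
# assuming BPSK over AWGN and the natural binary image of GF(4)
# (alpha_a <-> the bits of the integer a; this mapping is an assumption).

def prior_vector(y, sigma2, q=4):
    """Return (P(v = alpha_0 | y), ..., P(v = alpha_{q-1} | y)).

    y      : channel outputs for the log2(q) bits of the symbol (BPSK: 0 -> +1, 1 -> -1)
    sigma2 : noise variance of the AWGN channel
    """
    m = int(np.log2(q))
    probs = np.empty(q)
    for a in range(q):                                # enumerate field elements via their binary image
        bits = [(a >> k) & 1 for k in range(m)]
        x = np.array([1.0 - 2.0 * b for b in bits])   # BPSK mapping of the image bits
        # Gaussian likelihood of y given the image bits, bits assumed independent
        probs[a] = np.exp(-np.sum((y - x) ** 2) / (2.0 * sigma2))
    return probs / probs.sum()                        # normalize to a probability vector

# one noisy symbol: first bit close to +1 (bit 0), second close to -1 (bit 1)
p = prior_vector(np.array([0.9, -1.1]), sigma2=0.5)
```

The normalization at the end makes the q entries a proper probability vector, which is the form the messages of the BP algorithm described below take.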

Although maximum likelihood decoding of LDPC codes is possible, its complexity grows prohibitively as soon as sufficiently long binary codes are considered, and it is reasonable to expect that the complexity will not be lower for higher-order fields. That is why  proposed a sub-optimum decoding algorithm, finally revised by  and  for the case of factor graphs. This algorithm is known as the Sum-Product or BP algorithm, and it propagates along the edges messages carrying probabilities or LDRs.

Two messages are associated with each edge, one for each direction. The principle of BP is Bayes' rule applied locally and iteratively to estimate the a posteriori probabilities (APP) of each codeword symbol. It has been shown that over a cycle-free graph (tree case), the local factorization of Bayes' rule leads to exact computation of the APP of the variable nodes because

messages going into a node are independent of each other. However, it has been shown in  that linear codes having a cycle-free Tanner graph have either a minimum distance lower than or equal to 2 when the code rate R is greater than one half, or a minimum distance upper-bounded by $\frac{2}{R}$ otherwise. It is therefore impossible to consider such codes because their minimum distance cannot grow with the codeword length, which is a

desirable property. Hence, any finite-length LDPC code has cycles in its Tanner graph, and messages going into a node are therefore not independent. Thus, the APP are not computed exactly, and the algorithm is no longer optimal, in the sense that it no longer corresponds to ML decoding. However, BP decoding of LDPC codes relies on the cycle-free assumption, which is justified by the sparseness of the graph, a defining property of this class of codes.
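As a minimal illustration of this local Bayes'-rule principle (a sketch under the cycle-free assumption, not the full algorithm described in this section), the APP at a variable node reduces to a normalized componentwise product of its channel prior with the incoming message vectors:

```python
import numpy as np

# Illustrative sketch: under the cycle-free assumption, incoming messages are
# independent, so Bayes' rule reduces the APP at a variable node to a
# normalized componentwise product of probability vectors.

def app_update(prior, incoming):
    """prior    : length-q channel probability vector of the symbol
    incoming : list of length-q message vectors from the neighboring nodes"""
    app = np.asarray(prior, dtype=float).copy()
    for msg in incoming:
        app *= np.asarray(msg, dtype=float)   # independence -> product of probabilities
    return app / app.sum()                    # renormalize to a probability vector

# q = 4: two incoming messages reinforcing the prior's preference for alpha_1
app = app_update([0.1, 0.6, 0.2, 0.1],
                 [[0.25, 0.5, 0.15, 0.10],
                  [0.2, 0.4, 0.2, 0.2]])
```

On a graph with cycles the incoming vectors are correlated, so this product is only an approximation of the true APP, which is exactly the sub-optimality discussed above.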

Decoding principles apply to GF(q) codes, for q > 2, in the same way as to GF(2) codes. This section describes only the non-binary case. Since non-binary codeword symbols are considered as random variables in GF(q), messages on the edges of the graph are vectors of size q. The BP algorithm aims at computing the APP of each codeword symbol. For instance, for the symbol corresponding to variable node $v_i$, the algorithm handles the conditional probability vector $p_i = (P(v_i = \alpha_0 | y_i, S_i), \ldots, P(v_i = \alpha_{q-1} | y_i, S_i))$, where $P(v_i = \alpha_0 | y_i, S_i)$ is the probability that the sent codeword symbol $i$ equals $\alpha_0$, given that the channel output for the $i$-th symbol is $y_i$ and given $S_i$, the event that all parity-check equations connected to variable node $v_i$ are fulfilled. The computation of $p_i$ depends on the structure of the code factor graph through the events $S_i$ for all $i$. If the input messages $y_i$ are independent, the probabilities on the graph are computed exactly up to $\frac{g}{4}$ iterations, where $g$ is the length of the shortest cycle in the graph, also called the girth of the graph.
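The girth itself is easy to compute on a small example. The following sketch (an illustration, not a procedure from the text) finds the shortest cycle of a Tanner graph by breadth-first search from every node of the bipartite graph defined by a parity-check matrix H:

```python
import numpy as np
from collections import deque

def girth(H):
    """Length of the shortest cycle in the Tanner graph of parity-check matrix H."""
    m, n = H.shape
    # nodes 0..n-1 are variable nodes, nodes n..n+m-1 are check nodes
    adj = [[] for _ in range(n + m)]
    for c in range(m):
        for v in range(n):
            if H[c, v]:
                adj[v].append(n + c)
                adj[n + c].append(v)
    best = float("inf")
    for s in range(n + m):                     # BFS from every node
        dist = [-1] * (n + m)
        parent = [-1] * (n + m)
        dist[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if dist[w] == -1:              # first visit: extend the BFS tree
                    dist[w] = dist[u] + 1
                    parent[w] = u
                    queue.append(w)
                elif w != parent[u]:           # non-tree edge: closes a cycle
                    best = min(best, dist[u] + dist[w] + 1)
    return best

# two checks sharing two variables -> a length-4 cycle, the shortest possible
H = np.array([[1, 1, 0],
              [1, 1, 1]])
```

Since the Tanner graph is bipartite, every cycle has even length, so the smallest possible girth is 4, as in this example.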

To describe the BP decoding, $\{l^{(t)}_{p_i v}\}_{i \in \{1, \ldots, d_v\}}$ denotes the set of messages getting into a degree-$d_v$ variable node $v$ at the $t$-th iteration, and $\{r^{(t)}_{v p_i}\}_{i \in \{1, \ldots, d_v\}}$ the set of messages going out of this variable node. The index $pv$ denotes the direction of message propagation (permutation node → variable node), and $vp$ denotes the opposite direction. Messages getting into (resp. out of) a parity-check node $c$ are similarly denoted by $\{r^{(t)}_{p_i c}\}_{i \in \{1, \ldots, d_c\}}$ (resp.
