LIST OF FIGURES

3.4 A factor graph and its corresponding neural network. Each neuron corresponds

to an edge of the factor graph, hence there are 2 · N_edge · N_iter

neurons in the network. . . . . . . . . . . . . . . . . . . . . . . . . . . 99

3.5 Voronoi diagram (or Dirichlet tessellation): the partitioning of a plane

with n points into convex polygons such that each polygon contains exactly

one generating point and every point in a given polygon is closer to

its generating point than to any other. . . . . . . . . . . . . . . . . . . . 103

3.6 Evolution of the mutual information of variable-to-check messages along

iterations of BP decoding of various codes. Transmission on an AWGN channel

with E_b/N_0 = 2 dB. The upper dashed-dotted curve corresponds to the

EXIT function of a cycle-free (3,6) LDPC code. The steps correspond to

BP decoding of various finite-length (3,6) LDPC codes. . . . . . . . . . . 105

3.7 Flow chart of the optimization procedure using a genetic algorithm to find

the best weights minimizing the cost function at each iteration. N_iter is

the number of decoding iterations for which we look for the correcting

weights. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108

4.1 All possible subgraphs subtended by three erroneous variable nodes. . . . 121

4.2 Error configuration for Case 2. . . . . . . . . . . . . . . . . . . . . . . . 121

4.3 Error configuration for Case 4. . . . . . . . . . . . . . . . . . . . . . . . 124

4.4 Error configuration for Case 5. . . . . . . . . . . . . . . . . . . . . . . . 127

4.5 FER versus the crossover probability α for a regular column-weight-four

MacKay code. The code rate is 0.89 and the code length is n = 1998. . . 131