# Hybrid LDPC codes and iterative decoding methods - i3s


Chapter 3: Machine Learning Methods for Code and Decoder Design

The (h, f, g) combination defines the type of the neuron.

Summator Neuron The most common definition of a formal neuron corresponds to the particular case where the input function h is a dot product between the inputs and the weights.

[Figure: inputs x_1, ..., x_4 are combined by the input function h (dot product with weights w_1, ..., w_4), passed through a non-linear activation f (echelon, sigmoid, Gaussian, ...) which defines the type of neuron, and output through g (identity): y = f( sum_{i=1}^{4} w_i x_i ).]

Figure 3.2: An artificial neuron, which computes the weighted sum of the inputs and then applies the activation function f.
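As an illustration, here is a minimal sketch of such a summator neuron (the function names, example weights, and the choice of sigmoid for f are illustrative, not taken from the thesis):

```python
import math

def sigmoid(z):
    """One possible choice for the non-linear activation f."""
    return 1.0 / (1.0 + math.exp(-z))

def summator_neuron(x, w, f=sigmoid):
    """y = f(sum_i w_i * x_i): h is the dot product of inputs and weights."""
    h = sum(wi * xi for wi, xi in zip(w, x))
    return f(h)  # g is the identity, so the output is f(h) directly

y = summator_neuron([1.0, 0.5, -0.5, 2.0], [0.2, 0.4, 0.1, -0.3])
```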

Polynomial Neuron This kind of neuron [75] is depicted in Figure 3.3.

[Figure: inputs x_1, ..., x_4 with weights w_1, ..., w_4; h is a polynomial input function, f is any kind of non-linear activation (echelon, sigmoid, Gaussian, ...), and g is the identity. E.g., for an order-2 neuron: y = f( sum_{i,k} w_i w_k x_i x_k ).]

Figure 3.3: A polynomial neuron.
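A minimal sketch of the order-2 case (names and the sigmoid activation are illustrative assumptions, as above):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def polynomial_neuron(x, w, f=sigmoid):
    """Order-2 polynomial neuron: y = f(sum_{i,k} w_i w_k x_i x_k)."""
    n = len(x)
    # h sums w_i * w_k * x_i * x_k over every pair of indices (i, k)
    h = sum(w[i] * w[k] * x[i] * x[k] for i in range(n) for k in range(n))
    return f(h)
```

Note that when the sum ranges over all pairs (i, k), h factors as (sum_i w_i x_i)^2, i.e. the square of the summator neuron's input function.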

Modeling the decoder

Since the goal is to build a Tanner graph on which the BP decoder is as close to optimal as possible, we translate decoding on the Tanner graph of the code into the processing of an Artificial Neural Network (ANN). Let a message from variable node v to check node c at iteration t be described by a 2-dimensional probability vector x_vc^(t) = (x_vc^(t)(0), x_vc^(t)(1))^T, where x_vc^(t)(0) and x_vc^(t)(1) correspond to the conditional probabilities of the variable node v being equal to 0 or 1, respectively. The (Logarithmic) Density Ratio (LDR) m_vc^(t), associated with x_vc^(t), is defined as m_vc^(t) = log( x_vc^(t)(0) / x_vc^(t)(1) ). The same holds for a message p_cv^(t) from check node c to variable node v.
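The LDR computation is a one-liner; a small sketch (the function name is illustrative):

```python
import math

def ldr(x):
    """LDR of a 2-dimensional probability vector x = (x(0), x(1)):
    log of the probability of bit value 0 over that of bit value 1."""
    return math.log(x[0] / x[1])

m = ldr((0.8, 0.2))  # positive LDR: bit value 0 is more likely
```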
