# Hybrid LDPC codes and iterative decoding methods - i3s


## 2.7 Proofs of theorems in Chapter 2

$i \cdot k$ denotes the scalar product between the binary representations of the two elements $i$ and $k$.
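For $q = 2^p$, this scalar product is the parity of the bitwise AND of the two field indices, and $(-1)^{i\cdot k}$ is the corresponding sign. A minimal sketch (the helper names `gf_dot` and `sign` are ours, not the thesis'):

```python
def gf_dot(i: int, k: int) -> int:
    """Scalar product (mod 2) of the binary representations of i and k:
    the parity of the number of positions where both bits are 1."""
    return bin(i & k).count("1") % 2

def sign(i: int, k: int) -> int:
    """(-1)^(i.k), the sign appearing in the Fourier transform over GF(q)."""
    return -1 if gf_dot(i, k) else 1
```

For example, with $i = 5 = (101)_2$ and $k = 4 = (100)_2$ the scalar product is $1$, so the sign is $-1$.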

The mutual information $x_p$ of a symmetric probability vector $p$, under the all-zero codeword assumption, is defined by

$$x_p = 1 - E_p\left[\log_q\left(1 + \sum_{i=1}^{q-1} \frac{p_i}{p_0}\right)\right]$$

As in the binary case, we want to prove that

$$x_p = 1 - x_f$$

where $x_f$ is defined by

$$x_f = 1 - E_f\left[\log_q\left(1 + \sum_{i=1}^{q-1} \frac{f_i}{f_0}\right)\right]$$

We want to prove that

$$x_p = 1 - x_f,$$

that is to say,

$$E_f\left[\log_q\left(1 + \sum_{i=1}^{q-1} \frac{f_i}{f_0}\right)\right] = E_p\left[\log_q\left(1 + \sum_{i=1}^{q-1} \frac{p_i}{p_0}\right)\right].$$
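This equality can be sanity-checked numerically: it in fact holds realization by realization, since $1 + \sum_{i\geq 1} p_i/p_0 = 1/p_0$ and, with $f_0 = 1$, $1 + \sum_{i\geq 1} f_i/f_0 = \sum_i f_i = q\,p_0$. A minimal sketch for $q = 8$ (variable names are ours; the expectations are dropped because the identity is pointwise):

```python
import math
import random

q = 8  # field order, q = 2^p with p = 3
random.seed(0)

def dot(i, k):
    # scalar product of the binary representations = parity of i AND k
    return bin(i & k).count("1") % 2

# a random probability vector p (all entries positive, summing to 1)
raw = [random.random() + 0.01 for _ in range(q)]
total = sum(raw)
p = [r / total for r in raw]

# Fourier transform: f_i = sum_k p_k (-1)^(i.k); in particular f_0 = 1
f = [sum(p[k] * (-1) ** dot(i, k) for k in range(q)) for i in range(q)]

# per-realization versions of x_p and x_f (no expectation)
x_p = 1 - math.log(1 + sum(p[i] / p[0] for i in range(1, q)), q)
x_f = 1 - math.log(1 + sum(f[i] / f[0] for i in range(1, q)), q)

assert abs(f[0] - 1) < 1e-12        # f_0 = sum of p = 1
assert abs(x_p - (1 - x_f)) < 1e-9  # x_p = 1 - x_f, realization-wise
```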

Proof:

$$x_p = 1 - E_p\left[\log_q\left(1 + \sum_{i=1}^{q-1} \frac{p_i}{p_0}\right)\right] = E_p\left[1 - \log_q\left(\frac{1}{p_0}\right)\right] = E_p\left[\log_q(q\,p_0)\right],$$

using $\sum_{i=0}^{q-1} p_i = 1$, so that $1 + \sum_{i=1}^{q-1} p_i/p_0 = 1/p_0$. Moreover, $f_0 = 1$ implies

$$E_f\left[\log_q\left(1 + \sum_{i=1}^{q-1} \frac{f_i}{f_0}\right)\right] = E_f\left[\log_q\left(\sum_{i=0}^{q-1} f_i\right)\right],$$

so it suffices to show that

$$E_f\left[\log_q\left(\sum_{i=0}^{q-1} f_i\right)\right] = E_p\left[\log_q(q\,p_0)\right] \qquad (2.44)$$

Since $\sum_{i=0}^{q-1} f_i = \sum_{i=0}^{q-1} \sum_{k=0}^{q-1} p_k (-1)^{i\cdot k}$, it finally remains to prove that

$$\sum_{i=0}^{q-1} \sum_{k=1}^{q-1} p_k (-1)^{i\cdot k} = \sum_{k=1}^{q-1} p_k \sum_{i=0}^{q-1} (-1)^{i\cdot k} = 0 \qquad (2.45)$$

which is ensured by

$$\sum_{i=0}^{q-1} (-1)^{i\cdot k} = 0, \quad \forall k \in \{1, \dots, q-1\}$$
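This character-sum identity is easy to verify exhaustively for a small field; a quick check of our own for $q = 16$ (not part of the thesis' proof):

```python
q = 16  # field order, q = 2^p with p = 4

def dot(i, k):
    # scalar product of the binary representations = parity of i AND k
    return bin(i & k).count("1") % 2

# for every nonzero k, the signs (-1)^(i.k) sum to zero over i
for k in range(1, q):
    assert sum((-1) ** dot(i, k) for i in range(q)) == 0

# whereas k = 0 contributes the full mass q, which is why
# sum_i f_i reduces to q * p_0 for any probability vector p
assert sum((-1) ** dot(i, 0) for i in range(q)) == q
```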

We are going to demonstrate this last expression.

Let us say that $k$ has $m$ bits equal to 1 in its binary representation.