
13.2 Source Encoding

TABLE 13.1

Original Source           Reduced Sources
Messages   Probabilities  S1      S2      S3      S4
m1         0.30           0.30    0.30    0.43    0.57
m2         0.25           0.25    0.27    0.30    0.43
m3         0.15           0.18    0.25    0.27
m4         0.12           0.15    0.18
m5         0.08           0.12
m6         0.10

TABLE 13.2

Original Source                  Reduced Sources
Messages   Probabilities  Code   S1          S2          S3         S4
m1         0.30           00     0.30  00    0.30  00    0.43  1    0.57  0
m2         0.25           10     0.25  10    0.27  01    0.30  00   0.43  1
m3         0.15           010    0.18  11    0.25  10    0.27  01
m4         0.12           011    0.15  010   0.18  11
m5         0.08           111    0.12  011
m6         0.10           110

sequence. We now go back and assign the digits 0 and 1 to the second digit for the two messages that were aggregated in the previous step. We keep working backward in this way until the first column is reached. The code finally obtained (for the first column) can be shown to be optimum. The complete procedure is shown in Tables 13.1 and 13.2.
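This backward digit assignment is easy to mechanize. The sketch below builds such a code with a priority queue, repeatedly merging the two least probable entries and prepending a 0 or a 1 to every codeword in the two merged groups. It is a minimal illustration under the example's probabilities, not a program from the text; the names (huffman_code, source, and so on) are invented here. Which merged group receives the 0 is an arbitrary convention, so the printed codewords may differ from those in Table 13.2, but the codeword lengths, and hence the average length, must agree.

```python
# A minimal sketch of the Huffman reduction of Tables 13.1 and 13.2.
# All names are illustrative; only the probabilities come from the text.
import heapq

def huffman_code(probs):
    """Return a {message: codeword} dict for a {message: probability} dict."""
    # Heap entries: (probability, tiebreaker, partial codeword table).
    heap = [(p, i, {m: ""}) for i, (m, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        # One reduction step: combine the two least probable entries...
        p0, _, group0 = heapq.heappop(heap)
        p1, _, group1 = heapq.heappop(heap)
        # ...and, as in the backward assignment, grow each codeword by one digit.
        merged = {m: "0" + c for m, c in group0.items()}
        merged.update({m: "1" + c for m, c in group1.items()})
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

source = {"m1": 0.30, "m2": 0.25, "m3": 0.15,
          "m4": 0.12, "m5": 0.08, "m6": 0.10}
for m, c in sorted(huffman_code(source).items()):
    print(m, source[m], c)   # codeword lengths: 2, 2, 3, 3, 3, 3
```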

The optimum (Huffman) code obtained this way is also called a compact code. The average length of the compact code in the present case is given by

$$\bar{L} = \sum_{i=1}^{n} p_i L_i = 0.3(2) + 0.25(2) + 0.15(3) + 0.12(3) + 0.1(3) + 0.08(3) = 2.45 \text{ binary digits}$$
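As a quick check, the same number falls out of the code lengths in Table 13.2 directly (a sketch using the example's probabilities; the variable names are ours):

```python
# Average codeword length of the Table 13.2 code.
probs   = [0.30, 0.25, 0.15, 0.12, 0.08, 0.10]     # m1 ... m6
lengths = [2, 2, 3, 3, 3, 3]                       # digits in 00, 10, 010, 011, 111, 110
print(sum(p * n for p, n in zip(probs, lengths)))  # 2.45
```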

The entropy H(m) of the source is given by

$$H(\mathrm{m}) = \sum_{i=1}^{n} p_i \log_2 \frac{1}{p_i} = 2.422 \text{ bits}$$

Hence, the minimum possible average length (attainable only by coding infinitely long sequences of messages) is 2.422 binary digits per message. By using direct coding (the Huffman code), it is possible to attain an average length of 2.45 binary digits in the example given. This is a close approximation of the optimum performance attainable. Thus, little is gained by the more complex coding of blocks of several messages at a time in this case.
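The comparison can be reproduced numerically. The sketch below computes H(m) and the ratio H(m)/L̄; calling that ratio the code efficiency is standard usage, though this passage itself does not introduce the term.

```python
# Source entropy and its ratio to the attained average length.
from math import log2

probs = [0.30, 0.25, 0.15, 0.12, 0.08, 0.10]
H = sum(p * log2(1 / p) for p in probs)
L_avg = 2.45
print(round(H, 3))           # 2.422 bits per message
print(round(H / L_avg, 3))   # 0.989 -- within about 1% of the optimum
```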
