
B. P. Lathi, Zhi Ding - Modern Digital and Analog Communication Systems-Oxford University Press (2009)


INTRODUCTION TO INFORMATION THEORY

TABLE 13.4

Original Source                        Reduced Source
Messages   Probabilities   Code        Stage 1        Stage 2
m1m1       0.64            0           0.64  (0)      0.64  (0)
m1m2       0.16            11          0.20  (10)     0.36  (1)
m2m1       0.16            100         0.16  (11)
m2m2       0.04            101

TABLE 13.5

Messages   Probabilities   Code
m1m1m1     0.512           0
m1m1m2     0.128           100
m1m2m1     0.128           101
m2m1m1     0.128           110
m1m2m2     0.032           11100
m2m1m2     0.032           11101
m2m2m1     0.032           11110
m2m2m2     0.008           11111

In this case the average word length L' is

L' = 0.64(1) + 0.16(2) + 0.16(3) + 0.04(3)
   = 1.56
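As a sanity check, the codeword lengths and L' can be reproduced with a short Huffman routine (a sketch; `huffman_lengths` is my own helper, not from the text, and a tie between the two 0.16 messages may assign the length-2 word to either of them):

```python
import heapq

def huffman_lengths(probs):
    """Return Huffman codeword lengths for the given probabilities."""
    # Each heap entry: (probability, unique tie-breaker, symbol indices)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least probable groups
        p2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:                 # every symbol in the merged group
            lengths[s] += 1               # gains one more code bit
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

# Second-order extension of the source with P(m1) = 0.8, P(m2) = 0.2
probs = [0.64, 0.16, 0.16, 0.04]          # m1m1, m1m2, m2m1, m2m2
lengths = huffman_lengths(probs)
L2 = sum(p * l for p, l in zip(probs, lengths))
print(sorted(lengths), L2)                # lengths {1, 2, 3, 3}, L' = 1.56
```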

This is the word length for two messages of the original source. Hence L, the word length per message, is

L = L'/2 = 0.78

and the code efficiency is

η = 0.72/0.78 = 0.923
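The 0.72 in the numerator is the entropy of the original source, H = −0.8 log₂ 0.8 − 0.2 log₂ 0.2 ≈ 0.72 bit per message. A quick numerical check (the text rounds H to 0.72, which yields 0.923; the unrounded value gives a slightly higher figure):

```python
from math import log2

# Entropy of the original binary source: P(m1) = 0.8, P(m2) = 0.2
H = -(0.8 * log2(0.8) + 0.2 * log2(0.2))

L = 1.56 / 2      # average code length per original message, from above
eta = H / L       # code efficiency
print(H, eta)     # H ≈ 0.7219 bit, eta ≈ 0.926
```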

If we proceed with N = 3 (the third-order extension of the source), we have eight possible messages, and following the Huffman procedure, we find the code as shown in Table 13.5.

The word length L'' is

L'' = (0.512)1 + (0.128 + 0.128 + 0.128)3
      + (0.032 + 0.032 + 0.032)5 + (0.008)5
    = 2.184

Then,

L = L''/3 = 0.728
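The third-order figures can be verified the same way (a sketch: the block probabilities are products P(a)P(b)P(c) since the extension messages are independent blocks, and the codewords are taken from Table 13.5):

```python
from itertools import product

# Third-order extension of the source with P(m1) = 0.8, P(m2) = 0.2
P = {"m1": 0.8, "m2": 0.2}
code = {  # Huffman codewords from Table 13.5
    "m1m1m1": "0",     "m1m1m2": "100",   "m1m2m1": "101",   "m2m1m1": "110",
    "m1m2m2": "11100", "m2m1m2": "11101", "m2m2m1": "11110", "m2m2m2": "11111",
}
L3 = sum(P[a] * P[b] * P[c] * len(code[a + b + c])
         for a, b, c in product(P, repeat=3))
print(L3, L3 / 3)   # L'' = 2.184, i.e. L''/3 = 0.728 bit per original message
```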
