INTRODUCTION TO INFORMATION THEORY

The merit of any code is measured by its average length in comparison to H(m) (the average minimum length). We define the code efficiency η as

    η = H(m) / L

where L is the average length of the code. In our present example,

    η = 2.418 / 2.45 = 0.976

The redundancy γ is defined as

    γ = 1 - η = 0.024
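
As a quick numerical check of these definitions, here is a minimal sketch (not from the book) that computes H(m), L, η, and γ directly, assuming the six-message source of Example 13.1 (probabilities 0.3, 0.25, 0.15, 0.12, 0.1, 0.08) and binary codeword lengths 2, 2, 3, 3, 3, 3, which reproduce the L = 2.45 quoted above.

```python
from math import log2

# Minimal sketch: efficiency and redundancy of a binary Huffman code.
# Assumption: the six-message source of Example 13.1 and binary codeword
# lengths 2, 2, 3, 3, 3, 3, which give the average length L = 2.45.
p = [0.30, 0.25, 0.15, 0.12, 0.10, 0.08]   # message probabilities
lengths = [2, 2, 3, 3, 3, 3]               # binary codeword lengths

H = -sum(pi * log2(pi) for pi in p)              # entropy H(m), bits per message
L = sum(pi * li for pi, li in zip(p, lengths))   # average length, binary digits per message
eta = H / L                                      # code efficiency
gamma = 1 - eta                                  # redundancy

print(f"H(m) = {H:.3f} bits, L = {L:.2f}, eta = {eta:.3f}, gamma = {gamma:.3f}")
```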

Even though the Huffman code is a variable-length code, it is uniquely decodable. If we receive a sequence of Huffman-coded messages, it can be decoded in only one way, that is, without ambiguity. For instance, if the source in this exercise were to emit the message sequence m1 m5 m2 m1 m4 m3 m6 ..., it would be encoded as 001101000011010111.... The reader may verify that this message sequence can be decoded in only one way, viz., m1 m5 m2 m1 m4 m3 m6 ..., even if there is no demarcation between individual messages. This uniqueness is assured by the special property that no codeword is a prefix of another (longer) codeword.
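
To make the prefix property concrete, here is a minimal decoding sketch (not from the book). The codebook below is an assumption consistent with the encoded string quoted above; the actual assignments come from the Huffman table constructed earlier in the chapter.

```python
# Minimal sketch of prefix-code decoding. The codebook is an assumption
# consistent with the encoded string in the text, not the book's table.
codebook = {"m1": "00", "m2": "10", "m3": "010",
            "m4": "011", "m5": "110", "m6": "111"}
decode_map = {bits: msg for msg, bits in codebook.items()}

def decode(bit_string):
    """Scan left to right; because no codeword is a prefix of another,
    the first complete codeword match is the only possible one."""
    messages, buffer = [], ""
    for bit in bit_string:
        buffer += bit
        if buffer in decode_map:        # a complete codeword has been read
            messages.append(decode_map[buffer])
            buffer = ""
    return messages

print(decode("001101000011010111"))
# ['m1', 'm5', 'm2', 'm1', 'm4', 'm3', 'm6']
```

Because every codeword boundary is forced by the prefix-free property, the greedy left-to-right scan can never split the stream incorrectly, even with no markers between messages.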

A similar procedure is used to find a compact r-ary code. In this case we arrange the messages in descending order of probability, combine the last r messages into one message, and rearrange the new set (reduced set) in the order of descending probability. We repeat the procedure until the final set reduces to r messages. Each of these messages is now assigned one of the r numbers 0, 1, 2, ..., r - 1. We now regress in exactly the same way as in the binary case until each of the original messages has been assigned a code.

For an r-ary code, we will have exactly r messages left in the last reduced set if, and only if, the total number of original messages is r + k(r - 1), where k is an integer. This is because each reduction decreases the number of messages by r - 1. Hence, if there is a total of k reductions, the total number of original messages must be r + k(r - 1). In case the original messages do not satisfy this condition, we must add some dummy messages with zero probability of occurrence until this condition is fulfilled. For example, if r = 4 and the number of messages n is 6, then we must add one dummy message with zero probability of occurrence to make the total number of messages 7, that is, [4 + 1(4 - 1)], and proceed as usual. The procedure is illustrated in Example 13.1.
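
As a quick check on this counting argument, the following one-function sketch (the function name is my own choice) computes how many zero-probability dummy messages must be added for a given number of messages n and code radix r.

```python
def dummies_needed(n, r):
    """Number of zero-probability dummy messages to add so that the total
    message count has the form r + k(r - 1), i.e. is congruent to 1 mod (r - 1)."""
    return (1 - n) % (r - 1)

print(dummies_needed(6, 4))   # 1 dummy: 6 + 1 = 7 = 4 + 1*(4 - 1)
print(dummies_needed(7, 4))   # 0 dummies: 7 already has the required form
```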

Example 13.1 A memoryless source emits six messages with probabilities 0.3, 0.25, 0.15, 0.12, 0.1, and 0.08. Find the 4-ary (quaternary) Huffman code. Determine its average word length, the efficiency, and the redundancy.

In this case, we need to add one dummy message to satisfy the required condition of r + k(r - 1) messages and proceed as usual. The Huffman code is found in Table 13.3. The length L of this code is

    L = 0.3(1) + 0.25(1) + 0.15(1) + 0.12(2) + 0.1(2) + 0.08(2) + 0(2)
      = 1.3 4-ary digits
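
For readers who want to reproduce the reduction mechanically, here is a minimal sketch of the r-ary Huffman procedure described above (the heap-based bookkeeping and function name are my own choices, not the book's tabular construction), applied to the probabilities of this example.

```python
import heapq
from itertools import count

def rary_huffman_lengths(probs, r):
    """Codeword lengths of an r-ary Huffman code. Zero-probability dummy
    messages are added first so that every reduction combines exactly r
    entries and the final reduced set contains exactly r entries."""
    n = len(probs)
    pad = (1 - n) % (r - 1)                       # dummy messages needed
    lengths = [0] * (n + pad)
    tick = count()                                # tie-breaker for equal probabilities
    heap = [(p, next(tick), [i]) for i, p in enumerate(list(probs) + [0.0] * pad)]
    heapq.heapify(heap)
    while len(heap) > 1:
        merged_p, merged_leaves = 0.0, []
        for _ in range(r):                        # combine the r least probable entries
            p, _, leaves = heapq.heappop(heap)
            merged_p += p
            merged_leaves += leaves
        for i in merged_leaves:                   # each leaf under the merge gains one digit
            lengths[i] += 1
        heapq.heappush(heap, (merged_p, next(tick), merged_leaves))
    return lengths[:n]                            # discard the dummy messages

p = [0.30, 0.25, 0.15, 0.12, 0.10, 0.08]
lens = rary_huffman_lengths(p, r=4)
L = sum(pi * li for pi, li in zip(p, lens))
print(lens, round(L, 2))                          # [1, 1, 1, 2, 2, 2] and L = 1.3
```

The resulting word lengths 1, 1, 1, 2, 2, 2 give L = 1.3 4-ary digits, in agreement with the computation above.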
