
13.1 Measure of Information

Hence,*

1 r-ary unit = log2 r bits (13.5)

A Note on the Unit of Information: Although it is tempting to use the r-ary unit as a general unit of information, the binary unit bit (r = 2) is commonly used in the literature. There is, of course, no loss of generality in using r = 2. These units can always be converted into any other units by using Eq. (13.5). Henceforth, unless otherwise stated, we shall use the binary unit (bit) for information. The bases of the logarithmic functions will be omitted, but will be understood to be 2.
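As a quick numerical illustration of the conversion in Eq. (13.5), the sketch below (Python; the function names are illustrative, not from the text) converts an amount of information expressed in r-ary units into bits and back.

```python
import math

def r_ary_to_bits(amount, r):
    """Convert information measured in r-ary units to bits,
    per Eq. (13.5): 1 r-ary unit = log2(r) bits."""
    return amount * math.log2(r)

def bits_to_r_ary(bits, r):
    """Inverse conversion: bits to r-ary units."""
    return bits / math.log2(r)

# 5 decimal (10-ary) units, i.e., 5 hartleys, expressed in bits:
print(r_ary_to_bits(5, 10))   # 5 * log2(10) ≈ 16.61 bits
# 8 bits expressed in ternary (3-ary) units:
print(bits_to_r_ary(8, 3))    # 8 / log2(3) ≈ 5.05 ternary units
```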

Average Information per Message: Entropy of a Source

Consider a memoryless source m emitting messages m_1, m_2, ..., m_n with probabilities P_1, P_2, ..., P_n, respectively (P_1 + P_2 + ... + P_n = 1). A memoryless source implies that each message emitted is independent of the previous message(s). By the definition in Eq. (13.3) [or Eq. (13.4)], the information content of message m_i is I_i, given by

$$I_i = \log \frac{1}{P_i} \ \text{bits} \tag{13.6}$$

The probability of occurrence of m_i is P_i. Hence, the mean, or average, information per message emitted by the source is given by $\sum_{i=1}^{n} P_i I_i$ bits. The average information per message of a source m is called its entropy, denoted by H(m). Hence,

$$H(\mathrm{m}) = \sum_{i=1}^{n} P_i I_i \ \text{bits} = \sum_{i=1}^{n} P_i \log \frac{1}{P_i} \ \text{bits} \tag{13.7a}$$

$$\phantom{H(\mathrm{m})} = -\sum_{i=1}^{n} P_i \log P_i \ \text{bits} \tag{13.7b}$$
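A minimal sketch of Eq. (13.7b) in Python (the function name and the example probabilities are illustrative, not taken from the text):

```python
import math

def entropy(probs):
    """Entropy H(m) = -sum(P_i * log2(P_i)) in bits, per Eq. (13.7b).
    Messages with P_i = 0 contribute nothing, since p*log(1/p) -> 0 as p -> 0."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source with four messages of probabilities 1/2, 1/4, 1/8, 1/8:
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits per message
```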

The entropy of a source is a function of the message probabilities. It is interesting to find the message probability distribution that yields the maximum entropy. Because the entropy is a measure of uncertainty, the probability distribution that generates the maximum uncertainty will have the maximum entropy. On qualitative grounds, one expects entropy to be maximum when all the messages are equiprobable. We shall now show that this is indeed true.
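Before the formal argument, a quick numerical check of this claim (reusing the entropy function sketched above; the specific skewed distribution is just an example):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
uniform = [1.0 / n] * n             # equiprobable messages
skewed  = [0.7, 0.1, 0.1, 0.1]      # some other distribution over n = 4 messages
print(entropy(uniform))   # log2(4) = 2.00 bits, the maximum for n = 4
print(entropy(skewed))    # ≈ 1.36 bits, strictly less
```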

* In general,

1 r-ary unit = log_s r s-ary units

The 10-ary unit of information is called the hartley in honor of R. V. L. Hartley,² who was one of the pioneers (along with Nyquist³ and Carson) in the area of information transmission in the 1920s. The rigorous mathematical foundations of information theory, however, were established by C. E. Shannon¹ in 1948.

1 hartley = log₂ 10 = 3.32 bits

Sometimes the unit nat is used:

1 nat = log₂ e = 1.44 bits
