
13.1 Measure of Information

The reader will hardly notice the first headline unless he or she lives near the North or the South Pole. The reader will be very, very interested in the second. But what really catches the reader's attention is the third headline. This item will attract much more interest than the other two headlines. From the viewpoint of "common sense," the first headline conveys hardly any information; the second conveys a large amount of information; and the third conveys yet a larger amount of information. If we look at the probabilities of occurrence of these three events, we find that the probability of occurrence of the first event is unity (a certain event), that of the second is low (an event of small but finite probability), and that of the third is practically zero (an almost impossible event). If an event of low probability occurs, it causes greater surprise and, hence, conveys more information than the occurrence of an event of larger probability. Thus, information is connected with the element of surprise, which is a result of uncertainty, or unexpectedness. The more unexpected the event, the greater the surprise, and hence the more information. The probability of occurrence of an event is a measure of its unexpectedness and, hence, is related to the information content. Thus, from the point of view of common sense, the amount of information received from a message is directly related to its uncertainty and inversely related to the probability of its occurrence. If P is the probability of occurrence of a message and I is the information gained from the message, it is evident from the preceding discussion that when P → 1, I → 0, and when P → 0, I → ∞; in general, a smaller P gives a larger I. This suggests the following information measure:

I ∼ log_b (1/P)        (13.1)
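As a quick numerical check of Eq. (13.1), the short Python sketch below evaluates log(1/P) for the three headline-like events. The base-2 logarithm and the specific probabilities are illustrative assumptions, chosen only to show that a certain event yields zero information while a nearly impossible one yields a great deal.

```python
import math

def self_information(p, base=2):
    """Information of an event with probability p, per Eq. (13.1): I ~ log_b(1/p)."""
    return math.log(1.0 / p, base)

# Hypothetical probabilities for the three headline-like events (illustrative only).
events = {
    "certain event (first headline)": 1.0,
    "low-probability event (second headline)": 0.01,
    "nearly impossible event (third headline)": 1e-6,
}

for name, p in events.items():
    print(f"{name}: P = {p:g}, I = {self_information(p):.2f} binary units")
```

The certain event gives I = 0, the low-probability event about 6.6 binary units, and the nearly impossible event about 20, in agreement with the behavior P → 1, I → 0 and P → 0, I → ∞.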

Engineering Measure of Information

We now show that, from an engineering point of view, the information content of a message is consistent with the intuitive measure [Eq. (13.1)]. What do we mean by an engineering point of view? An engineer is responsible for the efficient transmission of messages. For this service the engineer would like to charge a customer an amount proportional to the information to be transmitted. But in reality the engineer will charge the customer in proportion to the time that the message occupies the channel bandwidth for transmission. In short, from an engineering point of view, the amount of information in a message is proportional to the (minimum) time required to transmit the message. We shall now show that this concept of information also leads to Eq. (13.1). This implies that a message with higher probability can be transmitted in a shorter time than that required for a message with lower probability. This fact may be verified by the example of the transmission of alphabetic symbols in the English language using Morse code. This code is made up of various combinations of two symbols (such as a dash and a dot in Morse code, or pulses of height A and -A volts). Each letter is represented by a certain combination of these symbols, called the codeword, which has a certain length. Obviously, for efficient transmission, shorter codewords are assigned to the letters e, t, a, and o, which occur more frequently. Longer codewords are assigned to the letters x, k, q, and z, which occur less frequently. Each letter may be considered to be a message. It is obvious that the letters that occur more frequently (with higher probability of occurrence) need a shorter time to transmit (shorter codewords) than those with smaller probability of occurrence. We shall now show that, on the average, the time required to transmit a symbol (or a message) with probability of occurrence P is indeed proportional to log (1/P).
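The sketch below makes this concrete. The letter probabilities are rough, made-up values that only mimic English letter frequencies (they are not taken from the text); the point is simply that an efficient code spends about log2(1/P) binary digits, and hence transmission time, on a letter of probability P.

```python
import math

# Rough, illustrative letter probabilities (not taken from the text).
letter_prob = {
    "e": 0.12, "t": 0.09, "a": 0.08, "o": 0.075,
    "x": 0.0015, "k": 0.0077, "q": 0.00095, "z": 0.00074,
}

for letter, p in sorted(letter_prob.items(), key=lambda kv: -kv[1]):
    # An efficient code uses roughly log2(1/P) binary digits for this letter.
    print(f"{letter}: P = {p:.5f}, log2(1/P) = {math.log2(1.0 / p):5.2f} binary digits")
```

With these numbers, frequent letters such as e and t come out near 3 to 4 binary digits, while rare letters such as q and z come out near 10, which is consistent with assigning them longer (slower-to-transmit) codewords.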

For the sake of simplicity, let us begin with the case of binary messages m1 and m2, which are equally likely to occur. We may use binary digits to encode these messages, representing m1 and m2 by the digits 0 and 1, respectively. Clearly, we must have a minimum of one binary digit (which can assume two values) to represent each of the two equally likely messages. Next, consider the case of the four equiprobable messages m1, m2, m3, and m4. If these messages are
