
Measuring Channel Capacity

The channel capacity $C_s$ is the maximum value of $H(x) - H(x|y)$; naturally, $C_s \le \max H(x)$ [because $H(x|y) \ge 0$]. But $H(x)$ is the average information per input symbol. Hence, $C_s$ is always less than (or equal to) the maximum average information per input symbol. If we use binary symbols at the input, the maximum value of $H(x)$ is 1 bit, occurring when $P(x_1) = P(x_2) = \tfrac{1}{2}$. Hence, for a binary channel, $C_s \le 1$ bit per binary digit. If we use $r$-ary symbols, the maximum value of $H_r(x)$ is 1 $r$-ary unit. Hence, $C_s \le 1$ $r$-ary unit per symbol.
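To illustrate the binary case numerically, here is a minimal Python sketch (not from the text) that sweeps the entropy of a binary source over a grid of probabilities and confirms that $H(x)$ peaks at 1 bit when $P(x_1) = P(x_2) = \tfrac{1}{2}$:

```python
import math

def binary_entropy(p):
    """Entropy of a binary source in bits: H(p) = -p log2 p - (1-p) log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # a certain symbol carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sweep P(x1) over (0, 1); the maximum entropy is 1 bit, attained at p = 0.5,
# which is why C_s <= 1 bit per binary digit.
best_p = max((p / 1000 for p in range(1, 1000)), key=binary_entropy)
print(best_p, binary_entropy(best_p))  # -> 0.5 1.0
```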

Verification of Error-Free Communication over a BSC

We have shown that over a noisy channel, $C_s$ bits of information can be transmitted per symbol. If we consider a binary channel, this means that for each binary digit (symbol) transmitted, the received information is $C_s$ bits ($C_s \le 1$). Thus, to transmit 1 bit of information, we need to transmit at least $1/C_s$ binary digits. This gives a code efficiency $C_s$ and redundancy $1 - C_s$. Here, the transmission of information means error-free transmission, since $I(x;\,y)$ was defined as the transmitted information minus the loss of information caused by channel noise.
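As a numerical illustration, the sketch below assumes the standard BSC result $C_s = 1 - H(P_e)$ (derived elsewhere in this chapter) and an illustrative error probability $P_e = 0.01$ to compute the digits needed per information bit, the code efficiency, and the redundancy:

```python
import math

def bsc_capacity(pe):
    """C_s = 1 + P_e log2 P_e + (1-P_e) log2(1-P_e), bits per binary digit."""
    if pe in (0.0, 1.0):
        return 1.0  # a deterministic channel loses no information
    return 1.0 + pe * math.log2(pe) + (1 - pe) * math.log2(1 - pe)

pe = 0.01                 # illustrative channel error probability
cs = bsc_capacity(pe)     # ~0.9192 bits per binary digit
print(f"digits per information bit >= {1 / cs:.4f}")          # 1/C_s
print(f"code efficiency = {cs:.4f}, redundancy = {1 - cs:.4f}")
```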

The problem with this derivation is that it is based on a certain speculative definition of information [Eq. (13.1)]. And based on this definition, we defined the information lost during transmission over the channel. We really have no direct proof that the information lost over the channel will oblige us in this way. Hence, the only way to ensure that this whole speculative structure is sound is to verify it. If we can show that $C_s$ bits of error-free information can be transmitted per symbol over a channel, the verification will be complete. A general case will be discussed later. Here we shall verify the results for a BSC.

Let us consider a binary source emitting messages at a rate of $\alpha$ digits per second. We accumulate these information digits over $T$ seconds to give a total of $\alpha T$ digits. Because $\alpha T$ digits form $2^{\alpha T}$ possible combinations, our problem is now to transmit one of these $2^{\alpha T}$ supermessages every $T$ seconds. These supermessages are transmitted by a code of word length $\beta T$ digits, with $\beta > \alpha$ to ensure redundancy. Because $\beta T$ digits can form $2^{\beta T}$ distinct patterns (vertices of a $\beta T$-dimensional hypercube), and we have only $2^{\alpha T}$ messages, we are utilizing only a $2^{-(\beta - \alpha)T}$ fraction of the vertices. The remaining vertices are deliberately unused, to combat noise. If we let $T \to \infty$, the fraction of vertices used approaches 0. Because there are $\beta T$ digits in each transmitted sequence, the number of digits received in error will be exactly $\beta T P_e$ when $T \to \infty$ (by the law of large numbers, the relative frequency of errors converges to the digit error probability $P_e$). We now construct Hamming spheres of radius $\beta T P_e$ around each of the $2^{\alpha T}$ vertices used for the messages. When any message is transmitted, the received sequence will lie in the Hamming sphere surrounding the vertex corresponding to that message. We use the following decision rule: if a received sequence falls inside or on a sphere surrounding message $m_i$, then the decision is "$m_i$ was transmitted." If $T \to \infty$, the decision will be without error if all the $2^{\alpha T}$ spheres are nonoverlapping.
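A quick Python sketch (with hypothetical rates $\alpha = 1$ and $\beta = 1.2$, chosen only for illustration) shows how rapidly the used fraction of vertices, $2^{-(\beta - \alpha)T}$, vanishes:

```python
# Fraction of hypercube vertices occupied by messages: 2**(-(beta - alpha) * T).
alpha, beta = 1.0, 1.2   # hypothetical source and code rates (digits per second)
for T in (10, 50, 100):
    print(f"T = {T:4d}: fraction used = {2.0 ** (-(beta - alpha) * T):.3e}")
# The fraction shrinks exponentially in T, freeing vertices to absorb noise.
```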

Of all the possible sequences of $\beta T$ digits, the number of sequences that differ from a given sequence in exactly $j$ digits is $\binom{\beta T}{j}$ (see Example 8.6). Hence $K$, the total number of sequences that differ from a given sequence in $\beta T P_e$ digits or fewer, is

$$K = \sum_{j=0}^{\beta T P_e} \binom{\beta T}{j} \qquad (13.26)$$
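Eq. (13.26) is easy to evaluate directly. The sketch below (with hypothetical values $\alpha = 1$, $\beta = 2$, $P_e = 0.1$, $T = 20$) computes $K$ and checks the nonoverlap condition that the $2^{\alpha T}$ spheres of $K$ vertices each must fit among the $2^{\beta T}$ available vertices:

```python
import math

def sphere_size(n, radius):
    """K of Eq. (13.26): number of n-digit sequences within Hamming distance `radius`."""
    return sum(math.comb(n, j) for j in range(radius + 1))

alpha, beta, pe, T = 1.0, 2.0, 0.1, 20   # hypothetical rates and error probability
n = int(beta * T)                        # code-word length: beta*T = 40 digits
K = sphere_size(n, int(n * pe))          # sphere radius: beta*T*P_e = 4 digits
# Nonoverlapping spheres are possible only if 2**(alpha*T) * K <= 2**(beta*T).
print(f"K = {K}, spheres fit: {2 ** int(alpha * T) * K <= 2 ** n}")
```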
