
Figure 13.5 Channel capacity vs. bandwidth for a channel with white Gaussian noise and fixed signal power (the capacity approaches the asymptote 1.44 S/N as B → ∞).

Figure 13.6 (a) Signal space representation of transmitted and received signals and noise signal. (b) Choice of signals for error-free communication.

Verification of Error-Free Communication over a Continuous Channel

Using the concepts of information theory, we have shown that it is possible to transmit error-free information at a rate of B log2(1 + S/N) bit/s over a channel band-limited to B Hz. The signal power is S, and the channel noise is white Gaussian with power N. This theorem can be verified in a way similar to that used for the verification of the channel capacity of the discrete case. This verification using signal space is so general that it is in reality an alternate proof of the capacity theorem.
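
As a quick numerical illustration of this rate formula, here is a minimal Python sketch; the bandwidth and signal-to-noise ratio used are hypothetical example values, not taken from the text.

```python
import math

def channel_capacity(B: float, S: float, N: float) -> float:
    """Shannon capacity C = B * log2(1 + S/N), in bit/s, of a channel
    band-limited to B Hz with signal power S and Gaussian noise power N."""
    return B * math.log2(1.0 + S / N)

# Hypothetical example: B = 3 kHz, S/N = 1000 (i.e., 30 dB)
C = channel_capacity(B=3000.0, S=1000.0, N=1.0)
print(f"C = {C:.0f} bit/s")   # about 29,900 bit/s
```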

Let us consider M-ary communication with M equiprobable messages m_1, m_2, ..., m_M transmitted by signals s_1(t), s_2(t), ..., s_M(t). All signals are time-limited with duration T and have an essential bandwidth B Hz. Their powers are less than or equal to S. The channel is band-limited to B, and the channel noise is white Gaussian with power N.
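
Because the M messages are equiprobable, each T-second signal carries log2 M bits, so the information rate of this scheme is (log2 M)/T bit/s. The sketch below illustrates only this standard relation; the message count and signal duration are hypothetical example numbers.

```python
import math

def mary_bit_rate(M: int, T: float) -> float:
    """Information rate of M-ary signaling with equiprobable messages:
    each T-second signal carries log2(M) bits, giving log2(M)/T bit/s."""
    return math.log2(M) / T

# Hypothetical example: M = 1024 equiprobable messages, one signal every T = 1 ms
print(mary_bit_rate(M=1024, T=1e-3), "bit/s")   # 10000.0 bit/s
```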

All the signals and noise waveforms have 2BT + 1 dimensions. In the limit we shall let T → ∞. Hence 2BT ≫ 1, and the number of dimensions will be taken as 2BT in our future discussion. Because the noise power is N, the energy of the noise waveform of T-second duration is NT. Given signal power S, the maximum signal energy is ST. Because signals and noise are independent, the maximum received energy is (S + N)T. Hence, all the received signals will lie in a 2BT-dimensional hypersphere of radius √((S + N)T) (Fig. 13.6a). A typical received signal s_i(t) + n(t) has an energy (S_i + N)T, and the point r representing this signal lies at a distance of √((S_i + N)T) from the origin (Fig. 13.6a). The signal vector s_i, the noise vector n, and the received vector r are shown in Fig. 13.6a. Because

|s_i| = √(S_i T),   |n| = √(NT),   |r| = √((S_i + N)T)     (13.65)
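
The magnitudes in Eq. (13.65) are easy to check numerically: with D = 2BT dimensions, a T-second noise waveform of power N has energy NT spread over the D coordinates (N/2B per coordinate on average), and because a high-dimensional noise vector is nearly orthogonal to the signal vector, |r| ≈ √((S_i + N)T). Below is a minimal NumPy sketch of this geometry, using hypothetical parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values
B = 1000.0   # essential bandwidth, Hz
T = 2.0      # signal duration, s
S_i = 4.0    # power of the transmitted signal s_i(t)
N = 1.0      # white Gaussian noise power

D = int(2 * B * T)                                # number of dimensions, 2BT

# Signal vector with energy S_i*T and noise vector with average energy N*T
s = rng.standard_normal(D)
s *= np.sqrt(S_i * T) / np.linalg.norm(s)         # scale so |s| = sqrt(S_i*T) exactly
n = rng.standard_normal(D) * np.sqrt(N * T / D)   # energy N*T spread over D coordinates
r = s + n                                         # received vector

print(np.linalg.norm(s), np.sqrt(S_i * T))        # |s| = sqrt(S_i*T)
print(np.linalg.norm(n), np.sqrt(N * T))          # |n| ≈ sqrt(N*T)
print(np.linalg.norm(r), np.sqrt((S_i + N) * T))  # |r| ≈ sqrt((S_i+N)*T), Eq. (13.65)
```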
