B. P. Lathi, Zhi Ding - Modern Digital and Analog Communication Systems-Oxford University Press (2009)

782 INTRODUCTION TO INFORMATION THEORY

Among all the real-valued random vectors that have zero mean and satisfy the covariance condition

$$
\mathbf{C}_x = \mathrm{Cov}(\mathbf{x}, \mathbf{x}) = E\{\mathbf{x}\mathbf{x}^T\}
$$

we have

$$
\max_{p_{\mathbf{x}}(\mathbf{x}):\, \mathrm{Cov}(\mathbf{x},\mathbf{x}) = \mathbf{C}_x} H(\mathbf{x}) = \frac{1}{2}\left[ N \log_2(2\pi e) + \log_2 \det(\mathbf{C}_x) \right] \qquad (13.94)
$$

This means that the Gaussian vector distribution has maximum entropy among all real random vectors with the same covariance matrix.
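The maximum-entropy property can be checked numerically. The sketch below evaluates Eq. (13.94) for a given covariance matrix and compares the scalar Gaussian entropy against a uniform density of the same variance (width $\sqrt{12}$ for unit variance); the function name and example matrices are illustrative, not from the text.

```python
import numpy as np

def gaussian_entropy_bits(C):
    """Differential entropy (in bits) of a zero-mean Gaussian vector
    with covariance C, per Eq. (13.94):
    H = (1/2) [N log2(2*pi*e) + log2 det(C)]."""
    C = np.atleast_2d(C)
    N = C.shape[0]
    return 0.5 * (N * np.log2(2 * np.pi * np.e)
                  + np.log2(np.linalg.det(C)))

# Scalar case, unit variance: H = (1/2) log2(2*pi*e) ≈ 2.05 bits.
h_gauss = gaussian_entropy_bits(np.array([[1.0]]))

# A uniform density with the same unit variance (width sqrt(12))
# has entropy log2(sqrt(12)) ≈ 1.79 bits -- strictly smaller,
# as the maximum-entropy property predicts.
h_unif = np.log2(np.sqrt(12.0))
```

For the identity covariance $\mathbf{C}_x = \mathbf{I}_N$, the determinant term vanishes and the entropy grows linearly in $N$, as Eq. (13.94) shows.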

Now consider a flat-fading MIMO channel with matrix gain $\mathbf{H}$. The $N \times M$ channel matrix $\mathbf{H}$ connects the $M \times 1$ input vector $\mathbf{x}$ and the $N \times 1$ output vector $\mathbf{y}$ such that

$$
\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{w} \qquad (13.95)
$$

where $\mathbf{w}$ is the $N \times 1$ additive white Gaussian noise vector with zero mean and covariance matrix $\mathbf{C}_w$. As shown in Fig. 13.13, a MIMO system consists of $M$ transmit antennas at the transmitter end and $N$ receive antennas at the receiver end. Each transmit antenna can transmit to all $N$ receive antennas. Given a fixed channel $\mathbf{H}$ of dimensions $N \times M$ (i.e., $M$ transmit antennas and $N$ receive antennas), the mutual information between the channel input and output vectors is

$$
\begin{aligned}
I(\mathbf{x},\, \mathbf{y}) &= H(\mathbf{y}) - H(\mathbf{y}\,|\,\mathbf{x}) &\text{(13.96a)}\\
&= H(\mathbf{y}) - H(\mathbf{H}\mathbf{x} + \mathbf{w}\,|\,\mathbf{x}) &\text{(13.96b)}
\end{aligned}
$$

Recall that under the condition that $\mathbf{x}$ is known, $\mathbf{H}\mathbf{x}$ is a deterministic constant, and differential entropy is unchanged by a constant shift. Hence, the conditional entropy of $\mathbf{y}$ given $\mathbf{x}$ is

$$
H(\mathbf{y}\,|\,\mathbf{x}) = H(\mathbf{H}\mathbf{x} + \mathbf{w}\,|\,\mathbf{x}) = H(\mathbf{w}) \qquad (13.97)
$$

and

$$
\begin{aligned}
I(\mathbf{x},\, \mathbf{y}) &= H(\mathbf{y}) - H(\mathbf{w}) &\text{(13.98a)}\\
&= H(\mathbf{y}) - \frac{1}{2}\left[ N \log_2(2\pi e) + \log_2 \det(\mathbf{C}_w) \right] &\text{(13.98b)}
\end{aligned}
$$
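If the input $\mathbf{x}$ is itself zero-mean Gaussian with covariance $\mathbf{C}_x$, then $\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{w}$ is Gaussian with covariance $\mathbf{H}\mathbf{C}_x\mathbf{H}^T + \mathbf{C}_w$, so Eq. (13.94) gives $H(\mathbf{y})$ in closed form and Eq. (13.98b) reduces to $I(\mathbf{x},\mathbf{y}) = \tfrac{1}{2}\log_2\!\left[\det(\mathbf{H}\mathbf{C}_x\mathbf{H}^T + \mathbf{C}_w)/\det(\mathbf{C}_w)\right]$. A minimal numeric sketch (function name and test matrices are assumptions for illustration):

```python
import numpy as np

def mimo_mutual_info_bits(H, Cx, Cw):
    """I(x, y) for the channel y = Hx + w with Gaussian input
    x ~ N(0, Cx) and noise w ~ N(0, Cw), combining Eqs. (13.94)
    and (13.98b):
    I = (1/2) log2( det(H Cx H^T + Cw) / det(Cw) )."""
    Cy = H @ Cx @ H.T + Cw
    return 0.5 * (np.log2(np.linalg.det(Cy))
                  - np.log2(np.linalg.det(Cw)))

# 2x2 identity channel, unit-power input, unit-power noise:
# Cy = 2I, so I = (1/2) log2(det(2I)/det(I)) = (1/2) log2 4,
# i.e., about 1 bit per channel use.
I_bits = mimo_mutual_info_bits(np.eye(2), np.eye(2), np.eye(2))
```

Doubling the input power in this example raises each eigenvalue of $\mathbf{C}_y$ and hence the determinant, increasing $I$ as expected from (13.98b).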

[Figure 13.13: MIMO system with M transmit antennas and N receive antennas.]
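The flat-fading model of Eq. (13.95) is also easy to simulate directly. The sketch below assumes illustrative dimensions ($M = 2$ transmit, $N = 3$ receive antennas) and a unit-covariance input; with enough channel uses, the empirical output covariance should approach $\mathbf{H}\mathbf{C}_x\mathbf{H}^T + \mathbf{C}_w$.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 2, 3            # transmit / receive antennas (assumed for illustration)
K = 100_000            # number of channel uses

H = rng.standard_normal((N, M))   # fixed flat-fading channel matrix
x = rng.standard_normal((M, K))   # zero-mean input, Cx = I
Cw = 0.1 * np.eye(N)              # noise covariance
w = rng.multivariate_normal(np.zeros(N), Cw, size=K).T

y = H @ x + w                     # Eq. (13.95): y = Hx + w

# The empirical output covariance approaches H Cx H^T + Cw.
Cy_emp = (y @ y.T) / K
Cy_theory = H @ H.T + Cw
```

Note that each of the $N$ outputs mixes contributions from all $M$ transmit antennas, exactly the all-to-all connectivity depicted in Fig. 13.13.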
