View File - University of Engineering and Technology, Taxila


FIGURE 3.20  Entropy of a binary source.

Thus, the entropy can be bounded as

    0 ≤ H(X) ≤ log_2(M)                                  (3.65)

The information transmission rate can be written as

    R_r = r_b H(X)                                       (3.66)

where r_b is the data transmission rate (bits/sec).

Conditional Probabilities.  Where information is mutually shared between the channel's input and output, the entropy measurement is conditional on the quality of information received either way. The entropy is then interpreted as a conditional measure of uncertainty. Consider a BSC as an example of a DMC, as seen in Fig. 3.21.

The DMC is defined by a set of transition probabilities. The probability of transmitting x_1 and receiving y_1 is p(x_1, y_1). By Bayes' theorem, this probability can be rewritten as

    p(x_1, y_1) = p(y_1 | x_1) p(x_1)                    (3.67a)

Alternatively,

    p(x_1, y_1) = p(x_1 | y_1) p(y_1)                    (3.67b)

where

    p(y_1 | x_1) = probability of receiving y_1 given that x_1 was transmitted
    p(x_1 | y_1) = probability of transmitting x_1 given that y_1 was received
    p(x_1)       = probability of transmitting x_1
    p(y_1)       = probability of receiving y_1

Copyright © 2002 by Marcel Dekker, Inc. All Rights Reserved.
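The relations above can be checked numerically. The sketch below (all numeric values — the source probability, the data rate r_b, and the BSC crossover probability — are illustrative assumptions, not from the text) computes the entropy of a binary source, verifies the bound of Eq. (3.65), evaluates the rate of Eq. (3.66), and confirms that the two factorizations of the joint probability in Eqs. (3.67a) and (3.67b) agree for a BSC:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits of a discrete memoryless source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Binary source (M = 2): Eq. (3.65) bounds H(X) between 0 and log2(2) = 1 bit,
# with the maximum reached at p = 0.5 (peak of the curve in Fig. 3.20).
for p in (0.1, 0.5, 0.9):
    H = entropy([p, 1 - p])
    assert 0 <= H <= math.log2(2)       # 0 <= H(X) <= log2(M)

# Eq. (3.66): information transmission rate R_r = r_b * H(X).
r_b = 1000.0                            # assumed data rate, bits/sec
R_r = r_b * entropy([0.5, 0.5])        # 1000 bits/sec for an equiprobable source

# BSC with assumed crossover probability eps and source prior p(x1).
eps = 0.1
p_x1 = 0.6
p_y1_given_x1 = 1 - eps                                 # p(y1 | x1)
p_y1 = p_x1 * (1 - eps) + (1 - p_x1) * eps              # total probability of y1

# Eq. (3.67a): p(x1, y1) = p(y1 | x1) p(x1)
joint = p_y1_given_x1 * p_x1
# Eq. (3.67b): the same joint via p(x1 | y1) p(y1); Bayes' theorem gives p(x1 | y1).
p_x1_given_y1 = joint / p_y1
assert abs(p_x1_given_y1 * p_y1 - joint) < 1e-12        # both factorizations agree
```

Running the sketch, the equiprobable binary source attains the upper bound H(X) = 1 bit, and the two factorizations of p(x_1, y_1) match to machine precision, as Bayes' theorem requires.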
