3. For two independent sources with states x_r and x_s, the probability of the joint event (x_r, x_s) equals the product of the individual state probabilities, p(x_r)p(x_s). Consequently, the information conveyed by the joint event of the two symbols is

\[
I_{rs} = \log_a \frac{1}{p(x_r)\,p(x_s)}
       = \log_a \frac{1}{p(x_r)} + \log_a \frac{1}{p(x_s)}
       = I_r + I_s \tag{3.61}
\]

This expression shows that the information conveyed by the joint event of the two symbols equals the sum of the information conveyed by each symbol.

Although the information associated with individual source states is important, what characterizes the source as a whole is the average information it conveys. Hence, the entropy of the DMC source can be written as

\[
H(X) = \sum_{i=1}^{M} p(x_i) \log_a \frac{1}{p(x_i)} \tag{3.62}
\]

Example 3.4: As an illustration of the meaning of "entropy," consider a binary source that generates the two symbols 0 and 1 with corresponding probabilities b and (1 - b). From (3.62), the binary source entropy is

\[
H(b) = b \log_2 \frac{1}{b} + (1 - b) \log_2 \frac{1}{1 - b} \tag{3.63}
\]

Using this expression, we can draw the entropy function as in Fig. 3.20. Figure 3.20 shows that at the two extremes (i.e., when b = 0 and b = 1) the source output is always certain, so the entropy is zero, H(b) = 0. However, when the two symbols are equally probable, that is, when b = 0.5, the source output is maximally uncertain and the entropy is at its maximum, H(b) = 1. This demonstrates that entropy can be interpreted as a measure of uncertainty.

It is easy to generalize: a DMC source in which all M symbols are equally likely to be transmitted has entropy

\[
H(X) = \log_2(M) \tag{3.64}
\]
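To make these definitions concrete, here is a minimal Python sketch (not from the original text; the function names and example probabilities are illustrative) that numerically checks Eqs. (3.61) through (3.64):

```python
import math

def self_information(p, base=2):
    """Self-information I = log_base(1/p) of a symbol with probability p."""
    return math.log(1.0 / p, base)

def entropy(probs, base=2):
    """Source entropy H(X) = sum_i p(x_i) log_base(1/p(x_i)), Eq. (3.62).
    Zero-probability symbols contribute nothing (p log(1/p) -> 0 as p -> 0)."""
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

# Eq. (3.61): for independent symbols, I_rs = I_r + I_s.
p_r, p_s = 0.25, 0.125  # illustrative probabilities, not from the text
assert math.isclose(self_information(p_r * p_s),
                    self_information(p_r) + self_information(p_s))

# Example 3.4 / Eq. (3.63): binary entropy H(b) = b log2(1/b) + (1-b) log2(1/(1-b)).
for b in (0.0, 0.5, 1.0):
    print(f"H({b}) = {entropy([b, 1 - b]):.3f}")  # -> 0.000, 1.000, 0.000

# Eq. (3.64): an equiprobable M-symbol source attains the maximum H(X) = log2(M).
M = 8
assert math.isclose(entropy([1 / M] * M), math.log2(M))
```

The b = 0.5 case reproduces the peak of the entropy curve in Fig. 3.20, and the final assertion confirms that equal symbol probabilities maximize the entropy.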
