Entropy and Mutual Information
Information Inequality

Let \(p(x)\) and \(q(x)\) be two probability mass functions defined for a random variable \(X\). Then

\[
D(p\|q) \ge 0.
\]

Proof. Let \(A\) be the support set of \(p(x)\). Then

\[
\begin{aligned}
-D(p\|q) &= -\sum_{x \in A} p(x) \log \frac{p(x)}{q(x)}
          = \sum_{x \in A} p(x) \log \frac{q(x)}{p(x)} \\
         &\le \log\!\left( \sum_{x \in A} p(x)\,\frac{q(x)}{p(x)} \right)
          = \log\!\left( \sum_{x \in A} q(x) \right) \\
         &\le \log\!\left( \sum_{x \in \mathcal{X}} q(x) \right)
          = \log 1 = 0,
\end{aligned}
\]

where the first inequality follows from Jensen's inequality applied to the concave function \(\log\).
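As a quick numerical check of this inequality, here is a minimal Python sketch (not part of the original slides) that computes \(D(p\|q)\) by summing over the support of \(p\), exactly as in the proof. The example distributions p and q are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D(p||q) = sum_{x in A} p(x) * log(p(x)/q(x)),
    where A is the support set of p, so terms with p(x) == 0 contribute 0.
    Assumes q(x) > 0 wherever p(x) > 0; otherwise D(p||q) is infinite."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    support = p > 0  # restrict the sum to the support set A of p
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

# Illustrative distributions (assumed for this example, not from the slides).
p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]

print(kl_divergence(p, q))  # ~0.173 > 0, consistent with D(p||q) >= 0
print(kl_divergence(p, p))  # 0.0, since D(p||p) = 0
```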