To the Graduate Council: I am submitting herewith a thesis written by ...
Chapter 4: Algorithm Overview

With the help of Figure 4.8 we would like to explain an issue with Shannon-type entropy measures. As the resolution of the data increases, the number of points in the density also increases and $\Delta$ tends towards zero. Using Riemann's definition of the integral, we can rewrite Equation 4.18 as

$$-\sum_{i=-\infty}^{\infty} \Delta f(x_i)\log\big(\Delta f(x_i)\big) = -\sum_{i} \Delta f(x_i)\log\big(f(x_i)\big) - \sum_{i} \Delta f(x_i)\log(\Delta) \qquad (4.22)$$

$$-\int_{-\infty}^{\infty} f(x)\log f(x)\,dx = \lim_{\Delta\to 0}\big(H_{\mathrm{Shannon}} + \log(\Delta)\big) \qquad (4.23)$$

We see that as the discretization approaches the continuous random variable, there is a jump in the amount of information measured: the discrete entropy $H_{\mathrm{Shannon}}$ diverges by $-\log(\Delta)$ as $\Delta \to 0$. We needed a measure that is normalized and improves with resolution. The measures that we have presented in Table 4.2 have an upper limit that is directly proportional to the number of characters in a symbol. Since we need the shape information quantized and independent of resolution, we studied different divergence measures, such as the KL divergence (Equation 4.24), the Jensen-Shannon divergence (Equation 4.25), and the Chi-squared divergence, before extending Shannon's definition for our CVM.

$$H_{KL} = \sum_{x} p(x)\log\frac{p(x)}{q(x)} \qquad (4.24)$$

$$H_{JS} = H_{\mathrm{Shannon}}\!\left(\frac{p+q}{2}\right) - \frac{H_{\mathrm{Shannon}}(p) + H_{\mathrm{Shannon}}(q)}{2} \qquad (4.25)$$

where $p$ is the density of the object of interest and $q$ is the density of the reference.

We have discussed the supporting theory for the proposed CVM algorithm. In the next chapter we discuss implementation decisions for the algorithm and present the experimental results of our algorithm on different datasets.

Figure 4.8: Resolution issue with Shannon-type measures.
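The resolution issue described above can be demonstrated numerically. The sketch below (not part of the thesis; the standard normal density and the integration range are illustrative choices) discretizes a Gaussian with bin width $\Delta$ and shows that the discrete Shannon entropy grows without bound as $\Delta$ shrinks, while $H_{\mathrm{Shannon}} + \log(\Delta)$ converges to the differential entropy $\tfrac{1}{2}\log(2\pi e) \approx 1.4189$:

```python
import numpy as np

h_true = 0.5 * np.log(2 * np.pi * np.e)  # differential entropy of N(0, 1)

for delta in [0.1, 0.01, 0.001]:
    x = np.arange(-10.0, 10.0, delta)            # grid with bin width delta
    f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # standard normal density f(x_i)
    p = f * delta                                # bin probabilities p_i = f(x_i) * delta
    h_shannon = -np.sum(p * np.log(p))           # discrete Shannon entropy
    # h_shannon keeps growing as delta -> 0, but h_shannon + log(delta)
    # approaches the differential entropy, as in Equation 4.23
    print(f"delta={delta}: H={h_shannon:.4f}, H+log(delta)={h_shannon + np.log(delta):.4f}")
```

Running this shows the discrete entropy diverging (roughly by $-\log \Delta$ per refinement) while the corrected quantity stays pinned near 1.4189, which is exactly why a resolution-independent measure is needed.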
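For concreteness, Equations 4.24 and 4.25 can be sketched directly on discrete densities. This is a minimal illustration, not the thesis implementation; the two-bin densities `p` and `q` are arbitrary examples standing in for the object and reference densities:

```python
import numpy as np

def shannon(p):
    """Discrete Shannon entropy: H(p) = -sum_x p(x) log p(x)."""
    p = p[p > 0]  # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """Equation 4.24: H_KL = sum_x p(x) log(p(x)/q(x))."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js_divergence(p, q):
    """Equation 4.25: H_JS = H((p+q)/2) - (H(p) + H(q)) / 2."""
    return shannon((p + q) / 2) - (shannon(p) + shannon(q)) / 2

# Illustrative densities: p for the object of interest, q for the reference
p = np.array([0.4, 0.6])
q = np.array([0.5, 0.5])

print(kl_divergence(p, q))                    # asymmetric: generally != KL(q, p)
print(js_divergence(p, q), js_divergence(q, p))  # symmetric, bounded by log 2
```

Unlike the KL divergence, the Jensen-Shannon divergence is symmetric and bounded above by $\log 2$ (in nats), which makes it better behaved as a normalized shape-comparison measure.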