Chapter 4: Algorithm Overview

4.2.4 Information Measure

Information theory is a relatively young branch of mathematics that began only in the 1940s. The term "information theory" still does not have a unique definition, but it broadly covers the study of systems that involve information processing, information storage, information retrieval and decision making.

The first studies in this direction were undertaken by Nyquist in 1924 and by Hartley in 1928 (Equation 4.19), who recognized the logarithmic nature of the measure of information. In 1948, Shannon published his seminal paper on the properties of information sources and of the communication channels used to transmit the outputs of these sources, and introduced the important definition of entropy as the measure of information (Equation 4.20).

H_{Hartley}(p_1, p_2, \dots, p_n) = \log \left| \{\, i : p_i > 0,\ 1 \le i \le n \,\} \right|    (4.19)

H_{Shannon} = -\sum_{i=1}^{n} p_i \log p_i    (4.20)

In the past fifty years the literature on information theory has grown quite voluminous, and apart from communication theory it has found deep applications in many social, physical and biological sciences, as well as in economics, statistics, accounting, linguistics, psychology, ecology, pattern recognition, computer science and fuzzy sets.

A key feature of Shannon's information theory is that the term "information" can often be given a mathematical meaning as a numerically measurable quantity, on the basis of a probabilistic model. This measure has a very concrete operational interpretation for communication engineers. We summarize the various definitions of entropy found in the literature in Table 4.2.

The list presented in Table 4.2 is not an exhaustive one, though it spans a few important definitions involving parameters and weights. Pap [Pap, 2002] reviews the history of information theory and discusses various measures of information, while Reza [Reza, 1994] approaches information theory from the coding aspect of communication theory. We would like to emphasize the difference between using a discrete random variable and a continuous random variable. The analogue of Shannon's entropy in the continuous case is called the differential entropy (Equation 4.21).

H_{differential} = -\int_{-\infty}^{\infty} p(x) \log p(x) \, dx    (4.21)
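To make the three measures above concrete, the following sketch computes the Hartley measure (Equation 4.19), the Shannon entropy (Equation 4.20) and a numerical approximation of the differential entropy (Equation 4.21). It is an illustration, not part of the thesis: the function names, the choice of the natural logarithm (nats) and the standard Gaussian test density are assumptions made here, since the text does not fix a log base or a reference distribution.

import numpy as np

def hartley_entropy(p):
    """Hartley measure (Eq. 4.19): log of the number of outcomes with non-zero probability."""
    p = np.asarray(p, dtype=float)
    return np.log(np.count_nonzero(p > 0))

def shannon_entropy(p):
    """Shannon entropy (Eq. 4.20): -sum_i p_i log p_i, with zero-probability terms dropped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def differential_entropy(pdf, x):
    """Differential entropy (Eq. 4.21) approximated by a Riemann sum over the grid x."""
    p = pdf(x)
    dx = x[1] - x[0]
    return -np.sum(p * np.log(p)) * dx

if __name__ == "__main__":
    # Fair four-outcome source: both discrete measures equal log 4 (about 1.386 nats).
    uniform = [0.25, 0.25, 0.25, 0.25]
    # Biased source: the Hartley measure is unchanged (still four possible outcomes),
    # while the Shannon entropy drops because the outcomes are no longer equally likely.
    biased = [0.7, 0.1, 0.1, 0.1]
    print(hartley_entropy(uniform), shannon_entropy(uniform))  # 1.386  1.386
    print(hartley_entropy(biased), shannon_entropy(biased))    # 1.386  0.940

    # Continuous case: standard Gaussian density. The numerical value should approach
    # the closed form 0.5 * log(2 * pi * e), roughly 1.419 nats.
    x = np.linspace(-10.0, 10.0, 20001)
    gaussian = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)
    print(differential_entropy(gaussian, x))

The biased example illustrates why the Shannon entropy is the finer of the two discrete measures: the Hartley measure counts only how many outcomes are possible, whereas the Shannon entropy also reflects how unevenly the probability mass is spread over them.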
