
Generalised information and entropy measures in physics


satisfies all four of the Khinchin axioms. Indeed, up to an arbitrary multiplicative constant, one can easily show (see, e.g. [1]) that this is the only entropic form that satisfies all four Khinchin axioms, and that it follows uniquely (up to a multiplicative constant) from these postulates. Here k denotes the Boltzmann constant, which in the remaining sections will be set equal to 1. For the uniform distribution, $p_i = 1/W$, the Shannon entropy takes on its maximum value

$S = k \ln W$,   (11)

which is Boltzmann's famous formula, carved on his grave in Vienna (Figure 2). Maximising the Shannon entropy subject to suitable constraints leads to ordinary statistical mechanics (see Section 4.3). In thermodynamic equilibrium, the Shannon entropy can be identified as the 'physical' entropy of the system, with the usual thermodynamic relations. Generally, the Shannon entropy has an enormous range of applications not only in equilibrium statistical mechanics but also in coding theory, computer science, etc.

It is easy to verify that S is a concave function of the probabilities $p_i$, an important property for formulating statistical mechanics. Remember that concavity of a differentiable function f(x) means $f''(x) \le 0$ for all x. For the Shannon entropy one has

$\frac{\partial}{\partial p_i} S = -\ln p_i - 1$,   (12)

$\frac{\partial^2}{\partial p_i \partial p_j} S = -\frac{1}{p_i}\,\delta_{ij} \le 0$,   (13)

and hence, as a sum of concave functions of the $p_i$, it is concave.

In classical mechanics one often has a continuous variable u with some probability density p(u), rather than discrete microstates i with probabilities $p_i$. In this case the normalisation condition reads $\int_{-\infty}^{\infty} p(u)\,\mathrm{d}u = 1$, and the Shannon entropy associated with this probability density is defined as

$S = -\int_{-\infty}^{\infty} \mathrm{d}u \, p(u) \ln\big(s\, p(u)\big)$,   (14)

where s is a scale parameter that has the same dimension as the variable u. For example, if u is a velocity (measured in units of m s$^{-1}$), then p(u), as a probability density of velocities, has the dimension s m$^{-1}$ (seconds per metre), since p(u) du is a dimensionless quantity. As a consequence, one needs to introduce the scale parameter s in Equation (14) to make the argument of the logarithm dimensionless.

Figure 2. The grave of Boltzmann in Vienna. On top of the gravestone the formula S = k log W is engraved. Boltzmann laid the foundations for statistical mechanics, but his ideas were not widely accepted during his time. He committed suicide in 1906.

Besides the Shannon information, there are many other information measures; we will discuss some of the most important examples in the next section. Some information measures are more suitable than others for the description of various types of complex systems, and we will discuss the axiomatic foundations that lead to particular classes of information measures. Important properties to check for a given information measure are convexity, additivity, composability, and stability. These properties can help to select the most suitable generalised information measure to describe a given class of complex systems.
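As a minimal numerical sketch of these properties (in Python; the choice W = 4 and the randomly drawn trial distributions are merely illustrative assumptions), one can check that no distribution exceeds the uniform value ln W of Equation (11), and that the Hessian of Equation (13) has only non-positive eigenvalues:

    import numpy as np

    # Shannon entropy with k = 1, using the convention 0 ln 0 = 0.
    def shannon_entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    W = 4                                  # assumed number of microstates
    rng = np.random.default_rng(0)

    # Random trial distributions never exceed the uniform value ln W (Equation (11)).
    for _ in range(5):
        p = rng.dirichlet(np.ones(W))
        assert shannon_entropy(p) <= np.log(W) + 1e-12

    print(shannon_entropy(np.full(W, 1.0 / W)), np.log(W))   # equal

    # Concavity: the Hessian of S is diag(-1/p_i) (Equation (13)); all eigenvalues <= 0.
    p = rng.dirichlet(np.ones(W))
    print(np.linalg.eigvalsh(np.diag(-1.0 / p)).max())       # negative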

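Equation (14) can be checked numerically in the same spirit. The sketch below assumes a Gaussian velocity density of width 100 m s$^{-1}$ and a scale parameter s = 1 m s$^{-1}$; the quadrature result agrees with the closed-form value $\frac{1}{2}\ln(2\pi e \sigma_u^2/s^2)$ for a Gaussian, and changing s merely shifts S by an additive constant.

    import numpy as np

    sigma_u = 100.0   # m/s, assumed width of the Gaussian velocity distribution
    s = 1.0           # m/s, assumed scale parameter in Equation (14)

    u = np.linspace(-10 * sigma_u, 10 * sigma_u, 200001)
    du = u[1] - u[0]
    p = np.exp(-u**2 / (2 * sigma_u**2)) / np.sqrt(2 * np.pi * sigma_u**2)

    # Equation (14), evaluated by a simple Riemann sum.
    S_numeric = -np.sum(p * np.log(s * p)) * du

    # Closed-form differential entropy of a Gaussian, shifted by -ln s.
    S_exact = 0.5 * np.log(2 * np.pi * np.e * sigma_u**2 / s**2)

    print(S_numeric, S_exact)   # agree to high accuracy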
2. More general information measures

2.1. The Rényi entropies

We may replace Axiom 4 by the less stringent condition (9), which just states that the entropy of independent systems should be additive. In this case one ends up with other information measures, which are called the Rényi entropies [26]. These are defined for an arbitrary real parameter q as

$S_q^{(R)} = \frac{1}{1-q} \ln \sum_i p_i^q$.   (15)

The summation is over all events i with $p_i \neq 0$. The Rényi entropies satisfy the Khinchin Axioms 1–3 and the additivity condition (9). Indeed, they follow uniquely from these conditions, up to a multiplicative constant. For q → 1 they reduce to the Shannon entropy:

$\lim_{q \to 1} S_q^{(R)} = S$,   (16)

as can easily be derived by setting $q = 1 + \epsilon$ and doing a perturbative expansion in the small parameter $\epsilon$ in Equation (15).

The Rényi information measures are important for the characterisation of multifractal sets (i.e. fractals with a probability measure on their support [1]), as well as for certain types of applications in computer science. But do they provide a good information measure to develop a generalised statistical mechanics for complex systems?

At first sight it looks nice that the Rényi entropies are additive for independent subsystems for general q, just as the Shannon entropy is for q = 1. But for non-independent subsystems I and II this simplicity vanishes immediately: there is no simple formula expressing the total Rényi entropy of a joint system as a function of the Rényi entropies of the interacting subsystems.

Does it still make sense to generalise statistical mechanics using the Rényi entropies? Another problem arises if one checks whether the Rényi entropies are a concave function of the probabilities. The Rényi entropies do not possess a definite concavity – the second derivative with respect to the $p_i$ can be positive or negative. For formulating a generalised statistical mechanics, this poses a serious problem. Other generalised information measures are better candidates – we will describe some of those in the following.

2.2. The Tsallis entropies

The Tsallis entropies (also called q-entropies) are given by the following expression [2]:

$S_q^{(T)} = \frac{1}{q-1} \left( 1 - \sum_{i=1}^{W} p_i^q \right)$.   (17)

One finds definitions similar to Equation (17) already in earlier papers such as, e.g., [27], but it was Tsallis in his seminal paper [2] who for the first time suggested to generalise statistical mechanics using these entropic forms. Again $q \in \mathbb{R}$ is a real parameter, the entropic index. As the reader immediately sees, the Tsallis entropies are different from the Rényi entropies: there is no logarithm anymore. A relation between Rényi and Tsallis entropies is easily derived by writing

$\sum_i p_i^q = 1 - (q-1)\, S_q^{(T)} = \exp\!\left[ -(q-1)\, S_q^{(R)} \right]$,   (18)

which implies

$S_q^{(T)} = \frac{1}{q-1} \left\{ 1 - \exp\!\left[ -(q-1)\, S_q^{(R)} \right] \right\}$.   (19)

Apparently the Tsallis entropy is a monotonic function of the Rényi entropy, so any maximum of the Tsallis entropy will also be a maximum of the Rényi entropy and vice versa. But still, the Tsallis entropies have many distinguished properties that make them a better candidate for generalising statistical mechanics than, say, the Rényi entropies.

One such property is concavity. One easily verifies that

$\frac{\partial}{\partial p_i} S_q^{(T)} = -\frac{q}{q-1}\, p_i^{q-1}$,   (20)

$\frac{\partial^2}{\partial p_i \partial p_j} S_q^{(T)} = -q\, p_i^{q-2}\, \delta_{ij}$.   (21)

This means $S_q^{(T)}$ is concave for all q > 0 (convex for all q < 0). This property is missing for the Rényi entropies. Another such property is the so-called Lesche stability, which is satisfied by the Tsallis entropies but not by the Rényi entropies (see Section 3.3 for more details).

The Tsallis entropies also contain the Shannon entropy

$S = -\sum_{i=1}^{W} p_i \ln p_i$   (22)

as a special case. Letting q → 1 we have

$S_1^{(T)} = \lim_{q \to 1} S_q^{(T)} = S$.   (23)

As expected from a good information measure, the Tsallis entropies take on their extremum for the uniform distribution $p_i = 1/W \ \forall i$. This extremum is given by

$S_q^{(T)} = \frac{W^{1-q} - 1}{1-q}$,   (24)

which, in the limit q → 1, reproduces Boltzmann's celebrated formula $S = \ln W$.
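A short sketch along the same lines (again in Python, with an arbitrarily assumed four-state test distribution) confirms the limits (16) and (23), the identity (19) relating the two entropies, and the uniform-distribution extremum (24):

    import numpy as np

    p = np.array([0.5, 0.25, 0.15, 0.10])   # assumed test distribution
    W = len(p)

    def shannon(p):
        return -np.sum(p * np.log(p))

    def renyi(p, q):                         # Equation (15)
        return np.log(np.sum(p**q)) / (1.0 - q)

    def tsallis(p, q):                       # Equation (17)
        return (1.0 - np.sum(p**q)) / (q - 1.0)

    # For q -> 1 both entropies approach the Shannon entropy, Equations (16) and (23).
    q = 1.000001
    print(renyi(p, q), tsallis(p, q), shannon(p))

    # Equation (19): the Tsallis entropy as a monotonic function of the Renyi entropy.
    q = 2.5
    print(tsallis(p, q), (1.0 - np.exp(-(q - 1.0) * renyi(p, q))) / (q - 1.0))

    # Uniform distribution: Equation (24); in the limit q -> 1 this tends to ln W.
    p_uniform = np.full(W, 1.0 / W)
    print(tsallis(p_uniform, q), (W**(1 - q) - 1.0) / (1.0 - q))
    print(tsallis(p_uniform, 1.000001), np.log(W))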
It is also useful to write down the definition of the Tsallis entropies for a continuous probability density
