
Generalised information and entropy measures in physics


For a continuous probability density $p(u)$ with $\int_{-\infty}^{+\infty} p(u)\,\mathrm{d}u = 1$, rather than a discrete set of probabilities $p_i$ with $\sum_i p_i = 1$, one defines

$$S_q^{(T)} = \frac{1}{q-1}\left(1 - \int_{-\infty}^{+\infty} \frac{\mathrm{d}u}{s}\,\bigl(s\,p(u)\bigr)^q\right), \qquad (25)$$

where again $s$ is a scale parameter that has the same dimension as the variable $u$. It is introduced for a similar reason as before, namely to make the integral in Equation (25) dimensionless so that it can be subtracted from 1. For $q \to 1$, Equation (25) reduces to the Shannon entropy

$$S_1^{(T)} = S = -\int_{-\infty}^{+\infty} \mathrm{d}u\; p(u)\,\ln\bigl(s\,p(u)\bigr). \qquad (26)$$

A fundamental property of the Tsallis entropies is the fact that they are not additive for independent subsystems. In fact, they have no chance to be, since they are different from the Rényi entropies, the only solution of Equation (9).

To investigate this in more detail, let us consider two independent subsystems I and II with probabilities $p_i^{\mathrm{I}}$ and $p_j^{\mathrm{II}}$, respectively. The probabilities of the joint events $i,j$ for the combined system I,II are $p_{ij} = p_i^{\mathrm{I}} p_j^{\mathrm{II}}$. We may then consider the Tsallis entropy of the first system, denoted by $S_q^{\mathrm{I}}$, that of the second system, denoted by $S_q^{\mathrm{II}}$, and that of the joint system, denoted by $S_q^{\mathrm{I,II}}$. One has

$$S_q^{\mathrm{I,II}} = S_q^{\mathrm{I}} + S_q^{\mathrm{II}} - (q-1)\,S_q^{\mathrm{I}} S_q^{\mathrm{II}}. \qquad (27)$$

Proof of Equation (27): We may write

$$\sum_i (p_i^{\mathrm{I}})^q = 1 - (q-1)\,S_q^{\mathrm{I}}, \qquad (28)$$

$$\sum_j (p_j^{\mathrm{II}})^q = 1 - (q-1)\,S_q^{\mathrm{II}}, \qquad (29)$$

$$\sum_{i,j} p_{ij}^q = \sum_i (p_i^{\mathrm{I}})^q \sum_j (p_j^{\mathrm{II}})^q = 1 - (q-1)\,S_q^{\mathrm{I,II}}. \qquad (30)$$

From Equations (28) and (29) it also follows that

$$\sum_i (p_i^{\mathrm{I}})^q \sum_j (p_j^{\mathrm{II}})^q = 1 - (q-1)\,S_q^{\mathrm{I}} - (q-1)\,S_q^{\mathrm{II}} + (q-1)^2\,S_q^{\mathrm{I}} S_q^{\mathrm{II}}. \qquad (31)$$

Combining Equations (30) and (31), one ends up with Equation (27). □

Apparently, if we put together two independent subsystems, the Tsallis entropy is not additive: there is a correction term proportional to $q-1$, which vanishes for $q = 1$ only, i.e. for the case where the Tsallis entropy reduces to the Shannon entropy. Equation (27) is sometimes called the 'pseudo-additivity' property.

Equation (27) has given rise to the name nonextensive statistical mechanics. If we formulate a generalised statistical mechanics based on maximising Tsallis entropies, then the (Tsallis) entropy of independent systems is not additive (Figure 3). However, it turns out that for special types of correlated subsystems the Tsallis entropies do become additive when the subsystems are put together [28]. This means that for these types of correlated complex systems, a description in terms of Tsallis entropies can in fact make things simpler than using the Shannon entropy, which is non-additive for correlated subsystems.

Figure 3. If the nonadditive entropies $S_q$ are used to measure information, then the information content of two systems I and II that are put together is not equal to the sum of the information contents of the isolated single systems. In other words, there is always an interaction between the subsystems.
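As a concrete illustration of pseudo-additivity, the following minimal numerical sketch (not part of the original article; the distributions, the value of $q$, and all names are arbitrary choices made for illustration) verifies Equation (27) for two independent subsystems:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy S_q^(T) = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: states with p_i = 0 do not contribute
    return (1.0 - np.sum(p**q)) / (q - 1.0)

# Two independent subsystems I and II with arbitrarily chosen distributions
pI = np.array([0.5, 0.3, 0.2])
pII = np.array([0.6, 0.4])
q = 1.8

# Joint distribution of independent subsystems: p_ij = p_i^I * p_j^II
p_joint = np.outer(pI, pII).ravel()

SI = tsallis_entropy(pI, q)
SII = tsallis_entropy(pII, q)
S_joint = tsallis_entropy(p_joint, q)

# Pseudo-additivity, Equation (27): the correction term is -(q-1) * SI * SII
rhs = SI + SII - (q - 1.0) * SI * SII
print(S_joint, rhs)  # the two values agree
assert np.isclose(S_joint, rhs)
```

Repeating the check with $q$ close to 1 makes the correction term vanish, in line with the reduction to the additive Shannon entropy.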

2.3. Landsberg–Vedral entropy

Let us continue with a few other examples of generalised information measures. Consider

$$S_q^{(L)} = \frac{1}{q-1}\left(\frac{1}{\sum_{i=1}^W p_i^q} - 1\right). \qquad (32)$$

This measure was studied by Landsberg and Vedral [29]. One immediately sees that the Landsberg–Vedral entropy is related to the Tsallis entropy $S_q^{(T)}$ by

$$S_q^{(L)} = \frac{S_q^{(T)}}{\sum_{i=1}^W p_i^q}, \qquad (33)$$

and hence $S_q^{(L)}$ is sometimes also called the normalised Tsallis entropy. $S_q^{(L)}$ also contains the Shannon entropy as a special case,

$$\lim_{q\to 1} S_q^{(L)} = S, \qquad (34)$$

and one readily verifies that it also satisfies a pseudo-additivity condition for independent systems, namely

$$S_q^{(L)\,\mathrm{I,II}} = S_q^{(L)\,\mathrm{I}} + S_q^{(L)\,\mathrm{II}} + (q-1)\,S_q^{(L)\,\mathrm{I}} S_q^{(L)\,\mathrm{II}}. \qquad (35)$$

This means that in the pseudo-additivity relation (27) the roles of $(q-1)$ and $-(q-1)$ are exchanged.

2.4. Abe entropy

Abe [30] introduced a kind of symmetric modification of the Tsallis entropy which is invariant under the exchange $q \to q^{-1}$. It is given by

$$S_q^{\mathrm{Abe}} = -\sum_i \frac{p_i^q - p_i^{q^{-1}}}{q - q^{-1}}. \qquad (36)$$

This symmetric choice in $q$ and $q^{-1}$ is inspired by the theory of quantum groups, which often exhibits invariance under the 'duality transformation' $q \to q^{-1}$. Like the Tsallis entropy, the Abe entropy is concave. In fact, it is related to the Tsallis entropy $S_q^{(T)}$ by

$$S_q^{\mathrm{Abe}} = \frac{(q-1)\,S_q^{(T)} - (q^{-1}-1)\,S_{q^{-1}}^{(T)}}{q - q^{-1}}. \qquad (37)$$

Clearly, the relevant range of $q$ is now just the unit interval $(0,1]$, due to the symmetry $q \to q^{-1}$: replacing $q$ by $q^{-1}$ in Equation (36) does not change anything.

2.5. Kaniadakis entropy

The Kaniadakis entropy (also called $\kappa$-entropy) is defined by the following expression [4]:

$$S_\kappa = -\sum_i \frac{p_i^{1+\kappa} - p_i^{1-\kappa}}{2\kappa}. \qquad (38)$$

Again, this is a kind of deformed Shannon entropy, which reduces to the original Shannon entropy for $\kappa = 0$. We also note that for small $\kappa$, writing $q = 1+\kappa$ and $q^{-1} \approx 1-\kappa$, the Kaniadakis entropy approaches the Abe entropy. Kaniadakis was motivated to introduce this entropic form by special relativity: the relativistic sum of two velocities of particles of mass $m$ satisfies a relation similar to the one satisfied by the Kaniadakis entropy, with the identification $\kappa = 1/(mc)$. Kaniadakis entropies are also concave and Lesche stable (see Section 3.3).

2.6. Sharma–Mittal entropies

These are two-parameter families of entropic forms [31]. They can be written in the form

$$S_{\kappa,r} = -\sum_i p_i^{1+r}\,\frac{p_i^{\kappa} - p_i^{-\kappa}}{2\kappa}. \qquad (39)$$

Interestingly, they contain many of the entropies mentioned so far as special cases. The Tsallis entropy is obtained for $r = -\kappa$ and $q = 1 - 2\kappa$. The Kaniadakis entropy is obtained for $r = 0$. The Abe entropy is obtained for $\kappa = \tfrac12(q - q^{-1})$ and $r = \tfrac12(q + q^{-1}) - 1$. The Sharma–Mittal entropies are concave and Lesche stable.

3. Selecting a suitable information measure

3.1. Axiomatic foundations

The Khinchin axioms apparently are the right axioms to obtain the Shannon entropy in a unique way, but this concept may be too narrow-minded if one wants to describe general complex systems. In physics, for example, one may be interested in nonequilibrium systems with a stationary state, glassy systems, long transient behaviour in systems with long-range interactions, systems with a multifractal phase-space structure, and so on. In all these cases one should be open-minded enough to allow for generalisations of Axiom 4, since it is this axiom that is least obvious in the given circumstances.

Abe [32] has shown that the Tsallis entropy follows uniquely (up to an arbitrary multiplicative constant) from the following generalised version of the Khinchin axioms. Axioms 1–3 are kept, and Axiom 4 is replaced by the following more general version:

New Axiom 4:

$$S_q^{\mathrm{I,II}} = S_q^{\mathrm{I}} + S_q^{\mathrm{II|I}} - (q-1)\,S_q^{\mathrm{I}} S_q^{\mathrm{II|I}}. \qquad (40)$$

Here $S_q^{\mathrm{II|I}}$ is the conditional entropy formed with the conditional probabilities $p(j|i)$ and averaged over all states $i$ using the so-called escort distributions $P_i$:

$$S_q^{\mathrm{II|I}} = \sum_i P_i\, S_q(\{p(j|i)\}). \qquad (41)$$

Escort distributions $P_i$ were introduced quite generally in [1] and are defined for any given probability distribution $p_i$ by

$$P_i = \frac{p_i^q}{\sum_j p_j^q}. \qquad (42)$$
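Since Equations (36)–(42) are simple algebraic expressions in the $p_i$, the special-case reductions of the Sharma–Mittal family and the escort construction (42) are easy to verify numerically. The sketch below is illustrative only (it is not from the article; the test distribution, parameter values, and function names are arbitrary choices):

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def kaniadakis(p, kappa):
    """Kaniadakis kappa-entropy, Equation (38)."""
    return -np.sum((p**(1.0 + kappa) - p**(1.0 - kappa)) / (2.0 * kappa))

def abe(p, q):
    """Abe entropy, Equation (36); symmetric under q -> 1/q."""
    return -np.sum((p**q - p**(1.0 / q)) / (q - 1.0 / q))

def sharma_mittal(p, kappa, r):
    """Sharma-Mittal entropy, Equation (39)."""
    return -np.sum(p**(1.0 + r) * (p**kappa - p**(-kappa)) / (2.0 * kappa))

def escort(p, q):
    """Escort distribution, Equation (42): P_i = p_i^q / sum_j p_j^q."""
    w = p**q
    return w / np.sum(w)

p = np.array([0.4, 0.3, 0.2, 0.1])  # arbitrary test distribution
kappa = 0.3

# Tsallis as the special case r = -kappa, q = 1 - 2*kappa
assert np.isclose(sharma_mittal(p, kappa, -kappa), tsallis(p, 1.0 - 2.0 * kappa))

# Kaniadakis as the special case r = 0
assert np.isclose(sharma_mittal(p, kappa, 0.0), kaniadakis(p, kappa))

# Abe as the special case kappa = (q - 1/q)/2, r = (q + 1/q)/2 - 1
q = 0.7
k_abe = 0.5 * (q - 1.0 / q)
r_abe = 0.5 * (q + 1.0 / q) - 1.0
assert np.isclose(sharma_mittal(p, k_abe, r_abe), abe(p, q))

print(escort(p, 2.0))  # for q > 1 the escort weights the likelier states more
```

Each assertion checks one of the reductions quoted after Equation (39), and the final line shows how an escort distribution with $q > 1$ amplifies the more probable states, which is exactly the weighting used in the conditional entropy (41).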
