Nonextensive Statistical Mechanics

2.1 Boltzmann–Gibbs Entropy

the system is conservative, i.e., if the infinitely small $d$-dimensional hypervolume remains constant with time, then it follows that $\sum_{r=1}^{d} \lambda_1^{(r)} = 0$. An important particular class of conservative systems is constituted by the so-called symplectic ones. For these, $d$ is an even integer, and the Lyapunov exponents are coupled two by two as follows:
$$
\lambda_1^{(1)} = -\lambda_1^{(d)} \ge \lambda_1^{(2)} = -\lambda_1^{(d-1)} \ge \cdots \ge \lambda_1^{(d_+)} = -\lambda_1^{(d_+ + d_0 + 1)} \ge \lambda_1^{(d_+ + 1)} = \cdots = \lambda_1^{(d_+ + d_0)} = 0 .
$$
Consistently, such systems have $d_+ = d_-$, and $d_0$ is an even integer. The most popular illustration of symplectic systems is Hamiltonian systems. They are conservative, which is precisely what the classical Liouville theorem states!

Do all these degrees of freedom contribute, as time evolves, to the erratic exploration of the phase-space? No, they do not. Only those associated with the $d_+$ positive Lyapunov exponents, and some of the $d_0$ vanishing ones, do. Consistently, it is only these which contribute to our loss of information, as time evolves, about the location in phase-space of a set of initial conditions. As we shall see, these remarks enable an intuitive understanding of the so-called Pesin identity, which we will soon state.

Let us now address the interesting question of the BG entropy production as time $t$ increases. More than one entropy production can be defined as a function of time. Two basic choices are the so-called Kolmogorov–Sinai entropy (or KS entropy rate) [84], based on a single trajectory in phase-space, and the one associated with the evolution of an ensemble of initial conditions. We shall preferentially use the latter here, because of its sensibly higher computational tractability. In fact, except for pathological cases, they both coincide.

Let us schematically describe the Kolmogorov–Sinai entropy rate, or metric entropy, concept [83, 84, 286]. We first partition the phase-space into two regions, denoted $A$ and $B$.
Then we choose a generic initial condition (the final result will not depend on this choice) and, applying the specific dynamics of the system at equal and finite time intervals $\tau$, we generate a long string (infinitely long in principle), say $ABBBAABBABAAA\ldots$. Then we analyze words of length $l = 1$. In our case, there are only two such words, namely $A$ and $B$. The analysis consists in running along the string a window whose width is $l$, and determining the probabilities $p_A$ and $p_B$ of the words $A$ and $B$, respectively. Finally, we calculate the entropy $S_{BG}(l=1) = -p_A \ln p_A - p_B \ln p_B$. Then we repeat for words whose length equals $l = 2$. In our case, there are four such words, namely $AA$, $AB$, $BA$, $BB$. Running along the string an $l = 2$ window letter by letter, we determine the probabilities $p_{AA}$, $p_{AB}$, $p_{BA}$, $p_{BB}$, hence the entropy $S_{BG}(l=2) = -p_{AA} \ln p_{AA} - p_{AB} \ln p_{AB} - p_{BA} \ln p_{BA} - p_{BB} \ln p_{BB}$. Then we repeat for $l = 3, 4, \ldots$ and calculate the corresponding values of $S_{BG}(l)$. We then choose another two-partition, say $A'$ and $B'$, and repeat the whole procedure. Then we do so, in principle, for all possible two-partitions. Then we go to three-partitions, i.e., the alphabet will now be constituted by three letters, say $A$, $B$, and $C$. We repeat the previous procedure for $l = 1$ (corresponding to the words $A$, $B$, $C$), then for $l = 2$ (corresponding to the words $AA$, $AB$, $AC$, $BA$, $BB$, $BC$, $CA$, $CB$, $CC$), etc. Then we run windows with $l = 3, 4, \ldots$. We then consider a different three-partition, say $A'$, $B'$, and $C'$... Then we consider four-partitions, and so on. Of all these entropies we retain the supremum. In the appropriate limits of infinitely fine partitions and
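The procedure just described can be tried on a concrete system. The following is a minimal numerical sketch, not taken from the text: it generates a symbolic string for the fully chaotic logistic map $x_{t+1} = 4x_t(1-x_t)$ using the two-partition $A = [0, 1/2)$, $B = [1/2, 1]$, computes $S_{BG}(l)$ for increasing word lengths, and compares the entropy growth per unit word length with the map's single positive Lyapunov exponent, illustrating the Pesin identity ($h_{KS} = \lambda_1 = \ln 2$ for this map). The map, partition, seed, and sample sizes are illustrative choices.

```python
import math
from collections import Counter

def symbol_string(n, x0=0.3):
    """Symbolic trajectory of the logistic map x -> 4x(1-x),
    read through the two-partition A = [0, 1/2), B = [1/2, 1]."""
    x, letters = x0, []
    for _ in range(n):
        letters.append('A' if x < 0.5 else 'B')
        x = 4.0 * x * (1.0 - x)
    return ''.join(letters)

def block_entropy(s, l):
    """S_BG(l) = -sum_w p_w ln p_w over words w of length l,
    with p_w estimated by a sliding window of width l."""
    counts = Counter(s[i:i + l] for i in range(len(s) - l + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

def lyapunov(n, x0=0.3):
    """lambda_1 = lim (1/n) sum_t ln|f'(x_t)|, with f'(x) = 4 - 8x."""
    x, acc = x0, 0.0
    for _ in range(n):
        acc += math.log(abs(4.0 - 8.0 * x))
        x = 4.0 * x * (1.0 - x)
    return acc / n

s = symbol_string(200_000)
for l in (1, 2, 3, 4):
    print(l, block_entropy(s, l))        # grows roughly as l * ln 2
print(lyapunov(200_000), math.log(2.0))  # lambda_1 is close to ln 2
```

For this particular map the chosen two-partition happens to be generating, so the supremum over ever finer partitions described above is already attained; for a generic system one would indeed have to scan finer and finer partitions as the text prescribes.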
