The Physical Basis of The Direction of Time (The Frontiers ...)

3 The Thermodynamical Arrow of Time

always an appropriate definition of entropy. It will indeed turn out to be insufficient when correlations between particles become essential, as is the case, for example, for real gases or solid bodies. Taking them into account requires more general concepts, which were first proposed by Gibbs. His approach will also allow us to formulate the exact ensemble dynamics in Γ-space, although it cannot yet explain the origin of the thermodynamical arrow of time (that is, of the low-entropy initial conditions).

3.1.2 Γ-Space Dynamics and Gibbs' Entropy

In the preceding section, Boltzmann's smooth phase space density ρ_µ was justified by means of small uncertainties in particle positions and momenta. It describes an infinite number (a continuum) of possible single-particle states, for example each particle represented by a small volume element ∆V_µ. An objective ('real') state would instead be described by a point (or a δ-distribution) in Γ-space, or by a sum over N δ-functions in µ-space. This would then lead to an infinite value of Boltzmann's H-functional, or negative infinite entropy.

However, the finite value of S_µ[ρ_µ], derived from the smooth µ-space distribution, is not just a measure of this arbitrary smoothing procedure (for example representing the size of the volume elements ∆V_µ). If N points are replaced by small but overlapping volume elements, this leads to a smooth distribution ρ_µ whose width reflects that of the discrete (real) distribution of particles. Therefore, S_µ characterizes the real physical state. The formal 'renormalization of entropy', which is part of this smoothing procedure, adds an infinite positive contribution to the infinite negative entropy corresponding to a point in such a way that the finite result S_µ[ρ_µ] is physically meaningful. The 'representative ensemble' obtained in this way defines a finite measure of probability (in the sense of the introduction to this chapter) for the N! points in Γ-space. It depends only slightly on the precise smoothing conditions, provided the discrete µ-space distribution is already smooth in the mean.
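This weak dependence on the smoothing conditions can be illustrated numerically. The following sketch is not from the text: the particle number, the Gaussian kernel widths, the one-dimensional grid, and the choice k = 1 are all illustrative assumptions. It replaces N sampled points by overlapping Gaussian 'volume elements' to obtain a smooth density ρ_µ, and evaluates the entropy S_µ = −k ∫ ρ_µ ln ρ_µ dx for two different widths:

```python
import math
import random

# Illustrative sketch (assumptions: N = 500 particles in one dimension,
# Gaussian smoothing kernels, k = 1). Each discrete point is replaced by
# a small overlapping 'volume element'; the entropy of the resulting
# smooth density depends only weakly on the smoothing width.

random.seed(0)
N = 500
points = [random.gauss(0.0, 1.0) for _ in range(N)]  # 'real' discrete distribution

GRID = [-8.0 + 16.0 * i / 1600 for i in range(1601)]
DX = 16.0 / 1600

def smoothed_entropy(width):
    """Smooth the N points with Gaussians of the given width and return
    S_mu = -sum(rho ln rho) * dx for the normalized density rho_mu."""
    rho = [sum(math.exp(-0.5 * ((x - p) / width) ** 2) for p in points)
           for x in GRID]
    norm = sum(rho) * DX
    rho = [r / norm for r in rho]  # normalize: integral of rho dx = 1
    return -sum(r * math.log(r) for r in rho if r > 0.0) * DX

# Quadrupling the (arbitrary) smoothing width changes S_mu only slightly,
# because the discrete distribution is already smooth in the mean:
S_narrow = smoothed_entropy(0.1)
S_wide = smoothed_entropy(0.3)
print(S_narrow, S_wide)
```

A δ-like (unsmoothed) distribution would instead drive the integral to −∞; the smoothing widths chosen here alter S_µ only at the level of a few percent.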

The ensemble concept introduced by Josiah Willard Gibbs (1902) differs from Boltzmann's at the very outset. He considered probability densities ρ_Γ(p, q) with ∫ ρ_Γ(p, q) dp dq = 1 – from now on writing p := p_1, ..., p_3N, q := q_1, ..., q_3N and dp dq := d^3N p d^3N q for short – which are meant to describe incomplete information ('ignorance') about microscopic degrees of freedom. For example, a probability density may characterize a macroscopic (incomplete) preparation procedure. Boltzmann's H-functional is then replaced by Gibbs' formally analogous extension in phase η:

η[ρ_Γ] := ⟨ln ρ_Γ⟩ = ∫ ρ_Γ(p, q) ln ρ_Γ(p, q) dp dq .   (3.17)

It leads generically to a finite ensemble entropy S_Γ := −kη[ρ_Γ]. For a probability density that is constant on a phase space volume element of size ∆V_Γ (while vanishing elsewhere), one has η[ρ_Γ] = −ln ∆V_Γ. The entropy S_Γ = k ln ∆V_Γ
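The constant-density case can be checked directly against definition (3.17). In the following sketch the cell size ∆V_Γ, its discretization, and the choice k = 1 are illustrative assumptions; the integral is evaluated as a Riemann sum over the occupied cell, where ρ_Γ = 1/∆V_Γ and vanishes elsewhere:

```python
import math

# Numerical check of (3.17) for a density that is constant on a
# phase-space cell of size Delta_V and zero elsewhere (assumptions:
# Delta_V = 0.25, 1000 equal sub-cells, k = 1). Expected result:
# eta = -ln Delta_V, hence S_Gamma = -k * eta = k ln Delta_V.

delta_V = 0.25                 # volume of the occupied phase-space cell
cells = 1000                   # discretize the cell into equal sub-cells
d_vol = delta_V / cells        # volume of one sub-cell

rho = 1.0 / delta_V            # constant density: integral rho dp dq = 1

# eta[rho] = integral of rho ln rho over the cell, as a Riemann sum
eta = sum(rho * math.log(rho) * d_vol for _ in range(cells))

S_Gamma = -eta                 # ensemble entropy with k = 1
print(eta, S_Gamma)
```

Since the integrand is constant, the sum reproduces η = −ln ∆V_Γ exactly, and S_Γ = k ln ∆V_Γ follows.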
