f / A phase space for a single atom in one dimension, taking momentum into account.

g / Ludwig Boltzmann's tomb, inscribed with his equation for entropy.

makes the phase space three-dimensional. Here, the R = 1 and 2 states are three times more likely than R = 0 and 3. Four atoms would require a four-dimensional phase space, which exceeds our ability to visualize. Although our present example doesn't require it, a phase space can describe momentum as well as position, as shown in figure f. In general, a phase space for a monoatomic gas has six dimensions per atom (one for each coordinate and one for each momentum component).

5.4.3 Microscopic definitions of entropy and temperature

Two more issues need to be resolved in order to make a microscopic definition of entropy.

First, if we defined entropy as the number of possible states, it would be a multiplicative quantity, not an additive one: if an ice cube in a glass of water has M_1 states available to it, and the number of states available to the water is M_2, then the number of possible states of the whole system is the product M_1 M_2. To get around this problem, we take the natural logarithm of the number of states, which makes the entropy additive because of the property of the logarithm ln(M_1 M_2) = ln M_1 + ln M_2.

The second issue is a more trivial one. The concept of entropy was originally invented as a purely macroscopic quantity, and the macroscopic definition ΔS = Q/T, which has units of J/K, has a different calibration than would result from defining S = ln M. The calibration constant we need turns out to be simply the Boltzmann constant, k.

Microscopic definition of entropy: The entropy of a system is S = k ln M, where M is the number of available states.[3]

This also leads to a more fundamental definition of temperature. Two systems are in thermal equilibrium when they have maximized their combined entropy through the exchange of energy. Here the energy possessed by one part of the system, E_1 or E_2, plays the same role as the variable R in the examples of free expansion above. A maximum of a function occurs when the derivative is zero, so the maximum entropy occurs when

    d(S_1 + S_2)/dE_1 = 0 .

We assume the systems are only able to exchange heat energy with each other, dE_1 = -dE_2, so

    dS_1/dE_1 = dS_2/dE_2 ,

and since the energy is being exchanged in the form of heat we can make the equations look more familiar if we write dQ for an amount of heat transferred from one system to the other.

[3] This is the same relation as the one on Boltzmann's tomb, just in a slightly different notation.
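To see where this derivation leads, the equilibrium condition can be combined with the macroscopic relation ΔS = Q/T. The lines below are a sketch, assuming (as above) that the only energy exchanged is heat, so that dE = dQ for each system:

    \frac{dS_1}{dE_1} = \frac{dS_2}{dE_2}   \qquad \text{(maximum of the combined entropy)}

    \frac{dS}{dQ} = \frac{1}{T}   \qquad \text{(from the macroscopic } \Delta S = Q/T \text{)}

    \frac{1}{T_1} = \frac{1}{T_2} \,, \qquad \text{i.e.,} \quad T_1 = T_2

Thermal equilibrium therefore corresponds to equal temperatures, and 1/T = dS/dE can be taken as a microscopic definition of temperature, consistent with the macroscopic ΔS = Q/T used above.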
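A short numerical illustration of the counting arguments above may also help; it is a sketch, not part of the text, and the state counts M1 and M2 are invented for the example. In the free-expansion setup, each of N atoms is independently on the left or right half of the box, so the number of states with R atoms on the right is the binomial coefficient C(N, R), and the logarithm makes the entropies of independent systems add.

    import math

    k = 1.380649e-23  # Boltzmann constant, J/K

    # Multiplicity of the free-expansion example: N atoms, R of them
    # on the right-hand side of the box.
    N = 3
    for R in range(N + 1):
        print(f"R = {R}: multiplicity = {math.comb(N, R)}")
    # Prints 1, 3, 3, 1 -- the R = 1 and 2 states are three times
    # more likely than R = 0 and 3, as stated above.

    # Additivity of entropy: S = k ln(M1 M2) = k ln M1 + k ln M2.
    M1, M2 = 10**6, 10**9  # hypothetical state counts for two subsystems
    S_combined = k * math.log(M1 * M2)
    S_separate = k * math.log(M1) + k * math.log(M2)
    print(S_combined, S_separate)  # equal, up to floating-point rounding

Taking the logarithm of the raw state count, rather than using the count itself, is exactly what makes the last two printed values agree.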
