
Algorithmic Thermodynamics

1 Introduction

Many authors [1, 6, 9, 12, 16, 24, 26, 28] have discussed the analogy between algorithmic entropy and entropy as defined in statistical mechanics: that is, the entropy of a probability measure p on a set X. It is perhaps insufficiently appreciated that algorithmic entropy can be seen as a special case of the entropy as defined in statistical mechanics. We describe how to do this in Section 3. This allows all the basic techniques of thermodynamics to be imported to algorithmic information theory.

The key idea is to take X to be some version of ‘the set of all programs that eventually halt and output a natural number’, and let p be a Gibbs ensemble on X. A Gibbs ensemble is a probability measure that maximizes entropy subject to constraints on the mean values of some observables — that is, real-valued functions on X.

In most traditional work on algorithmic entropy, the relevant observable is the length of the program. However, much of the interesting structure of thermodynamics only becomes visible when we consider several observables. When X is the set of programs that halt and output a natural number, some other important observables include the output of the program and the logarithm of its runtime. So, in Section 4 we illustrate how ideas from thermodynamics can be applied to algorithmic information theory using these three observables.

To do this, we consider a Gibbs ensemble of programs which maximizes entropy subject to constraints on:

• E, the expected value of the logarithm of the program’s runtime (which we treat as analogous to the energy of a container of gas),

• V, the expected value of the length of the program (analogous to the volume of the container), and

• N, the expected value of the program’s output (analogous to the number of molecules in the gas).

This measure is of the form

p(x) = (1/Z) e^{−βE(x) − γV(x) − δN(x)}

for certain numbers β, γ, δ, where the normalizing factor

Z = ∑_{x∈X} e^{−βE(x) − γV(x) − δN(x)}

is called the ‘partition function’ of the ensemble.
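To make the Gibbs ensemble concrete, here is a minimal numerical sketch. It uses a toy stand-in for X: a handful of hypothetical "programs", each represented only by invented observable values (E, V, N), not derived from any real machine. The function `gibbs_ensemble` and the particular numbers are illustrative assumptions, not part of the paper.

```python
import math

# Toy stand-in for the set X of halting programs.  Each "program" here is
# just a tuple of hypothetical observable values (E, V, N): E is the log
# of its runtime, V its length in bits, N its output.  The numbers are
# illustrative, not derived from any real machine.
programs = [
    (math.log(10), 8, 1),
    (math.log(50), 12, 2),
    (math.log(200), 16, 3),
    (math.log(1000), 20, 5),
]

def gibbs_ensemble(programs, beta, gamma, delta):
    """Return the partition function Z and the Gibbs probabilities
    p(x) = exp(-beta*E(x) - gamma*V(x) - delta*N(x)) / Z."""
    weights = [math.exp(-beta * E - gamma * V - delta * N)
               for (E, V, N) in programs]
    Z = sum(weights)
    return Z, [w / Z for w in weights]

# One choice of conjugate variables; gamma = ln 2 matches the weighting
# by program length familiar from Chaitin's Omega.
Z, p = gibbs_ensemble(programs, beta=1.0, gamma=math.log(2), delta=0.1)
print("Z =", Z)
print("probabilities sum to", sum(p))
```

On a finite toy set the sum is trivially computable; the substance of the paper's computability claim concerns the infinite sum over all halting programs.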
The partition function reduces to Chaitin’s number Ω when β = 0, γ = ln 2 and δ = 0. This number is uncomputable [6]. However, we show that the partition function Z is computable when β > 0, γ ≥ ln 2, and δ ≥ 0.

We derive an algorithmic analogue of the basic thermodynamic relation

dE = T dS − P dV + µ dN.

Here:
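The excerpt cuts off before defining the symbols in this relation. As a sketch of where it comes from, assuming the standard statistical-mechanics identifications T = 1/β, P = γ/β, µ = −δ/β (these identifications are an assumption here, not stated in the excerpt above):

```latex
% Entropy of the Gibbs ensemble p(x) = e^{-\beta E(x) - \gamma V(x) - \delta N(x)}/Z:
S = -\sum_{x \in X} p(x) \ln p(x) = \beta E + \gamma V + \delta N + \ln Z,
% where E, V, N now denote expected values.  Varying the constraints
% (and using d\ln Z = -E\,d\beta - V\,d\gamma - N\,d\delta) gives
dS = \beta \, dE + \gamma \, dV + \delta \, dN,
% and solving for dE yields the analogue of the first law:
dE = \frac{1}{\beta}\, dS - \frac{\gamma}{\beta}\, dV - \frac{\delta}{\beta}\, dN
   = T\, dS - P\, dV + \mu\, dN .
```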
