Information Theory, Inference, and Learning ... - Inference Group
Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981
You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

27  Laplace's Method

The idea behind the Laplace approximation is simple. We assume that an unnormalized probability density $P^*(x)$, whose normalizing constant

$$Z_P \equiv \int P^*(x)\, \mathrm{d}x \tag{27.1}$$

is of interest, has a peak at a point $x_0$. We Taylor-expand the logarithm of $P^*(x)$ around this peak:

$$\ln P^*(x) \simeq \ln P^*(x_0) - \frac{c}{2}(x - x_0)^2 + \cdots, \tag{27.2}$$

where

$$c = -\left.\frac{\partial^2}{\partial x^2} \ln P^*(x)\right|_{x=x_0}. \tag{27.3}$$

We then approximate $P^*(x)$ by an unnormalized Gaussian,

$$Q^*(x) \equiv P^*(x_0) \exp\!\left[-\frac{c}{2}(x - x_0)^2\right], \tag{27.4}$$

and we approximate the normalizing constant $Z_P$ by the normalizing constant of this Gaussian,

$$Z_Q = P^*(x_0)\sqrt{\frac{2\pi}{c}}. \tag{27.5}$$

We can generalize this integral to approximate $Z_P$ for a density $P^*(\mathbf{x})$ over a $K$-dimensional space $\mathbf{x}$. If the matrix of second derivatives of $-\ln P^*(\mathbf{x})$ at the maximum $\mathbf{x}_0$ is $\mathbf{A}$, defined by

$$A_{ij} = -\left.\frac{\partial^2}{\partial x_i \,\partial x_j} \ln P^*(\mathbf{x})\right|_{\mathbf{x}=\mathbf{x}_0}, \tag{27.6}$$

so that the expansion (27.2) is generalized to

$$\ln P^*(\mathbf{x}) \simeq \ln P^*(\mathbf{x}_0) - \frac{1}{2}(\mathbf{x} - \mathbf{x}_0)^{\mathsf{T}} \mathbf{A} (\mathbf{x} - \mathbf{x}_0) + \cdots, \tag{27.7}$$

then the normalizing constant can be approximated by

$$Z_P \simeq Z_Q = P^*(\mathbf{x}_0)\, \frac{1}{\sqrt{\det\!\left(\frac{1}{2\pi}\mathbf{A}\right)}} = P^*(\mathbf{x}_0)\sqrt{\frac{(2\pi)^K}{\det \mathbf{A}}}. \tag{27.8}$$

Predictions can be made using the approximation $Q$. Physicists also call this widely-used approximation the saddle-point approximation.
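The $K$-dimensional formula (27.8) is straightforward to code up. The sketch below is illustrative and not from the book: the function name `laplace_evidence` and its interface are assumptions, and it takes the log-density, the mode $\mathbf{x}_0$, and the matrix $\mathbf{A}$ as given rather than finding them numerically.

```python
import numpy as np

def laplace_evidence(log_p_star, x0, A):
    """Approximate Z_P = integral of P*(x) dx by equation (27.8):
    Z_Q = P*(x0) * sqrt((2*pi)^K / det A), where A is the matrix of
    second derivatives of -ln P*(x) at the maximum x0."""
    K = len(x0)
    sign, logdetA = np.linalg.slogdet(A)   # numerically stable log-determinant
    assert sign > 0, "A must be positive definite at a maximum"
    # Work in log space to avoid overflow for large K or sharp peaks.
    log_ZQ = log_p_star(x0) + 0.5 * (K * np.log(2 * np.pi) - logdetA)
    return np.exp(log_ZQ)

# Sanity check on a case where the approximation is exact: a Gaussian
# P*(x) = exp(-x^T A x / 2), whose true Z is sqrt((2*pi)^K / det A).
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
log_p = lambda x: -0.5 * x @ A @ x
x0 = np.zeros(2)            # the peak of this P* is at the origin
Z = laplace_evidence(log_p, x0, A)
```

For a genuinely non-Gaussian $P^*$, the same call gives only an approximation, and one would first locate $\mathbf{x}_0$ (e.g. by numerical optimization) and evaluate $\mathbf{A}$ there.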
