Information Theory, Inference, and Learning ... - Inference Group

Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981 You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

388                                          30 — Efficient Monte Carlo Methods

Algorithm 30.1. Octave source code for the Hamiltonian Monte Carlo method.

g = gradE ( x ) ;            # set gradient using initial x
E = findE ( x ) ;            # set objective function too

for l = 1:L                  # loop L times
  p = randn ( size(x) ) ;    # initial momentum is Normal(0,1)
  H = p' * p / 2 + E ;       # evaluate H(x,p)

  xnew = x ; gnew = g ;
  for tau = 1:Tau            # make Tau 'leapfrog' steps

    p = p - epsilon * gnew / 2 ;   # make half-step in p
    xnew = xnew + epsilon * p ;    # make step in x
    gnew = gradE ( xnew ) ;        # find new gradient
    p = p - epsilon * gnew / 2 ;   # make half-step in p

  endfor
  Enew = findE ( xnew ) ;          # find new value of H
  Hnew = p' * p / 2 + Enew ;
  dH = Hnew - H ;                  # Decide whether to accept

  if ( dH < 0 )                  accept = 1 ;
  elseif ( rand() < exp(-dH) )   accept = 1 ;
  else                           accept = 0 ;
  endif

  if ( accept )
    g = gnew ; x = xnew ; E = Enew ;
  endif

endfor

Figure 30.2. (a,b) Hamiltonian Monte Carlo used to generate samples from a bivariate Gaussian with correlation ρ = 0.998. (c,d) For comparison, a simple random-walk Metropolis method, given equal computer time. [Four scatter-plot panels, not reproduced here; panel titles: (a) Hamiltonian Monte Carlo, (c) Simple Metropolis; both axes span roughly −1.5 to 1.]
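As a usage sketch (not part of the book's listing), the fragment below supplies gradE and findE for the ρ = 0.998 Gaussian of Figure 30.2 via E(x) = x' * A * x / 2 with A = inv(Sigma), together with illustrative values of epsilon, Tau, L and the starting state (my own choices, not parameters quoted from the text), and then runs the leapfrog/accept loop of Algorithm 30.1, recording the state after each iteration.

# Usage sketch (assumed setup, not from the book's listing):
# target is the correlated Gaussian of figure 30.2, E(x) = x' * A * x / 2.
rho   = 0.998 ;
Sigma = [ 1 , rho ; rho , 1 ] ;
A     = inv ( Sigma ) ;
findE = @(x) 0.5 * x' * A * x ;    # energy = -log P*(x) + const
gradE = @(x) A * x ;               # gradient of the energy

epsilon = 0.055 ;        # leapfrog step size      (illustrative choice)
Tau     = 19 ;           # leapfrog steps per move (illustrative choice)
L       = 40 ;           # number of iterations    (illustrative choice)
x       = [ -1 ; -1 ] ;  # starting state          (illustrative choice)

samples = zeros ( length(x), L ) ; # store one column per iteration
g = gradE ( x ) ;                  # as in Algorithm 30.1 ...
E = findE ( x ) ;
for l = 1:L
  p = randn ( size(x) ) ;
  H = p' * p / 2 + E ;
  xnew = x ; gnew = g ;
  for tau = 1:Tau                  # Tau leapfrog steps
    p    = p    - epsilon * gnew / 2 ;
    xnew = xnew + epsilon * p ;
    gnew = gradE ( xnew ) ;
    p    = p    - epsilon * gnew / 2 ;
  endfor
  Enew = findE ( xnew ) ;
  dH   = p' * p / 2 + Enew - H ;
  if ( dH < 0 || rand() < exp(-dH) )   # Metropolis accept/reject on dH
    g = gnew ; x = xnew ; E = Enew ;
  endif
  samples(:,l) = x ;               # record the current state
endfor
plot ( samples(1,:), samples(2,:), '-o' ) ;  # trajectory, cf. figure 30.2(a)

Because gradE and findE are defined as function handles that close over A, the calls gradE(x) and findE(x) in Algorithm 30.1 need no modification; only the setup above has to change when a different target distribution is sampled.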
