
Information Theory, Inference, and Learning ... - MAELabs UCSD


Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981

You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

About Chapter 31

Some of the neural network models that we will encounter are related to Ising models, which are idealized magnetic systems. It is not essential to understand the statistical physics of Ising models to understand these neural networks, but I hope you'll find them helpful.

Ising models are also related to several other topics in this book. We will use exact tree-based computation methods like those introduced in Chapter 25 to evaluate properties of interest in Ising models. Ising models offer crude models for binary images. And Ising models relate to two-dimensional constrained channels (cf. Chapter 17): a two-dimensional bar-code in which a black dot may not be completely surrounded by black dots, and a white dot may not be completely surrounded by white dots, is similar to an antiferromagnetic Ising model at low temperature. Evaluating the entropy of this Ising model is equivalent to evaluating the capacity of the constrained channel for conveying bits.
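The entropy/capacity equivalence can be made concrete by brute force on a tiny grid. The sketch below is an illustrative assumption, not a method from the book: the grid size, the choice to constrain only interior cells (those with four neighbours), and the function names are all mine. It counts the valid colourings of an n×n bar-code under the rule above and reports log2(count)/n² bits per pixel:

```python
from itertools import product
from math import log2

def valid(grid, n):
    # A colouring is invalid if any cell with four neighbours has all
    # four neighbours the same colour as itself (boundary cells, which
    # have fewer neighbours, are left unconstrained in this sketch).
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            c = grid[i][j]
            if all(grid[x][y] == c
                   for x, y in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))):
                return False
    return True

def capacity_estimate(n):
    # Enumerate all 2^(n*n) binary grids and count the valid ones;
    # log2(count) / n^2 is the entropy per pixel of this small system,
    # a finite-size stand-in for the constrained channel's capacity.
    count = 0
    for bits in product((0, 1), repeat=n * n):
        grid = [bits[r * n:(r + 1) * n] for r in range(n)]
        if valid(grid, n):
            count += 1
    return log2(count) / (n * n)
```

For n = 4 the estimate lies strictly between 0 and 1 bits per pixel: the constraint forbids some configurations (the all-black grid, for instance) but permits many others (any checkerboard). The true capacity is the n → ∞ limit, which this exhaustive enumeration cannot reach; transfer-matrix or tree-based methods like those of Chapter 25 scale much further.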

If you would like to jog your memory on statistical physics and thermodynamics, you might find Appendix B helpful. I also recommend the book by Reif (1965).

