42.10 Further exercises

⊲ Exercise 42.9.[3] Storing two memories.

Two binary memories $\mathbf{m}$ and $\mathbf{n}$ ($m_i, n_i \in \{-1, +1\}$) are stored by Hebbian learning in a Hopfield network using
\[
w_{ij} =
\begin{cases}
m_i m_j + n_i n_j & \text{for } i \neq j \\
0 & \text{for } i = j.
\end{cases}
\tag{42.31}
\]
The biases $b_i$ are set to zero.

The network is put in the state $\mathbf{x} = \mathbf{m}$. Evaluate the activation $a_i$ of neuron $i$ and show that it can be written in the form
\[
a_i = \mu m_i + \nu n_i.
\tag{42.32}
\]
By comparing the signal strength, $\mu$, with the magnitude of the noise strength, $|\nu|$, show that $\mathbf{x} = \mathbf{m}$ is a stable state of the dynamics of the network.

The network is put in a state $\mathbf{x}$ differing in $D$ places from $\mathbf{m}$,
\[
\mathbf{x} = \mathbf{m} + 2\mathbf{d},
\tag{42.33}
\]
where the perturbation $\mathbf{d}$ satisfies $d_i \in \{-1, 0, +1\}$. $D$ is the number of components of $\mathbf{d}$ that are non-zero, and for each non-zero $d_i$, $d_i = -m_i$. Defining the overlap between $\mathbf{m}$ and $\mathbf{n}$ to be
\[
o_{mn} = \sum_{i=1}^{I} m_i n_i,
\tag{42.34}
\]
evaluate the activation $a_i$ of neuron $i$ again and show that the dynamics of the network will restore $\mathbf{x}$ to $\mathbf{m}$ if the number of flipped bits satisfies
\[
D < \tfrac{1}{4}\left(I - |o_{mn}| - 2\right).
\tag{42.35}
\]
How does this number compare with the maximum number of flipped bits that can be corrected by the optimal decoder, assuming the vector $\mathbf{x}$ is either a noisy version of $\mathbf{m}$ or of $\mathbf{n}$?

Exercise 42.10.[3] Hopfield network as a collection of binary classifiers.

This exercise explores the link between unsupervised networks and supervised networks. If a Hopfield network's desired memories are all attracting stable states, then every neuron in the network has weights going to it that solve a classification problem personal to that neuron. Take the set of memories and write them in the form $\{\mathbf{x}'^{(n)}, x_i^{(n)}\}$, where $\mathbf{x}'$ denotes all the components $x_{i'}$ for all $i' \neq i$, and let $\mathbf{w}'$ denote the vector of weights $w_{ii'}$ for $i' \neq i$.

Using what we know about the capacity of the single neuron, show that it is almost certainly impossible to store more than $2I$ random memories in a Hopfield network of $I$ neurons.
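As a complement to Exercise 42.9, here is a minimal NumPy sketch (ours, not from the book; the names rng, W, flip and the random seed are illustrative) that builds the Hebbian weights of equation (42.31), verifies the decomposition (42.32) numerically (expanding the sum gives $\mu = I - 2$ and $\nu = o_{mn}$), and checks that a state perturbed within the bound (42.35) falls back to $\mathbf{m}$:

    import numpy as np

    # Sketch for Exercise 42.9: store two memories by Hebbian learning
    # and test recovery. Names follow the exercise's notation.
    rng = np.random.default_rng(0)
    I = 64                                  # number of neurons

    # Two random binary memories with components in {-1, +1}.
    m = rng.choice([-1, 1], size=I)
    n = rng.choice([-1, 1], size=I)

    # Hebbian weights, equation (42.31); the diagonal is zero.
    W = np.outer(m, m) + np.outer(n, n)
    np.fill_diagonal(W, 0)

    # With x = m, expanding a_i = sum_{j != i} w_ij m_j gives
    # a_i = (I - 2) m_i + o_mn n_i, i.e. mu = I - 2, nu = o_mn in (42.32).
    o_mn = int(m @ n)                       # overlap, equation (42.34)
    a = W @ m                               # activations; biases are zero
    assert np.array_equal(a, (I - 2) * m + o_mn * n)

    # Perturb m in D places (d_i = -m_i at each flipped position) with D
    # strictly below the bound (42.35), then apply one synchronous update
    # x_i <- sign(a_i).
    D = (I - abs(o_mn) - 3) // 4            # largest D with D < (I - |o_mn| - 2)/4
    flip = rng.choice(I, size=D, replace=False)
    x = m.copy()
    x[flip] = -x[flip]
    restored = np.where(W @ x >= 0, 1, -1)
    assert np.array_equal(restored, m)
    print(f"recovered m after flipping D = {D} of I = {I} bits")

A single synchronous sweep suffices here because, for any $D$ below the bound (42.35), every activation $a_i$ already has the sign of $m_i$; with symmetric weights and zero self-connections, asynchronous updates would likewise converge.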
