
Information Theory, Inference, and Learning ... - Inference Group

Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/0521642981
You can buy this book for 30 pounds or $50. See http://www.inference.phy.cam.ac.uk/mackay/itila/ for links.

22.5: Further exercises                                                                307

⊲ Exercise 22.8.[2] (a) A photon counter is pointed at a remote star for one minute, in order to infer the brightness, i.e., the rate of photons arriving at the counter per minute, λ. Assuming the number of photons collected r has a Poisson distribution with mean λ,

    P(r | λ) = exp(−λ) λ^r / r!,    (22.30)

what is the maximum likelihood estimate for λ, given r = 9? Find error bars on ln λ.

(b) Same situation, but now we assume that the counter detects not only photons from the star but also 'background' photons. The background rate of photons is known to be b = 13 photons per minute. We assume the number of photons collected, r, has a Poisson distribution with mean λ + b. Now, given r = 9 detected photons, what is the maximum likelihood estimate for λ? Comment on this answer, discussing also the Bayesian posterior distribution, and the 'unbiased estimator' of sampling theory, λ̂ ≡ r − b.

Exercise 22.9.[2] A bent coin is tossed N times, giving N_a heads and N_b tails. Assume a beta distribution prior for the probability of heads, p, for example the uniform distribution. Find the maximum likelihood and maximum a posteriori values of p, then find the maximum likelihood and maximum a posteriori values of the logit a ≡ ln[p/(1−p)]. Compare with the predictive distribution, i.e., the probability that the next toss will come up heads.

⊲ Exercise 22.10.[2] Two men looked through prison bars; one saw stars, the other tried to infer where the window frame was.

From the other side of a room, you look through a window and see stars at locations {(x_n, y_n)}. You can't see the window edges because it is totally dark apart from the stars. Assuming the window is rectangular and that the visible stars' locations are independently randomly distributed, what are the inferred values of (x_min, y_min, x_max, y_max), according to maximum likelihood? Sketch the likelihood as a function of x_max, for fixed x_min, y_min, and y_max.

[Figure: stars scattered inside a rectangle with opposite corners (x_min, y_min) and (x_max, y_max).]

⊲ Exercise 22.11.[3] A sailor infers his location (x, y) by measuring the bearings of three buoys whose locations (x_n, y_n) are given on his chart. Let the true bearings of the buoys be θ_n. Assuming that his measurement θ̃_n of each bearing is subject to Gaussian noise of small standard deviation σ, what is his inferred location, by maximum likelihood?

The sailor's rule of thumb says that the boat's position can be taken to be the centre of the cocked hat, the triangle produced by the intersection of the three measured bearings (figure 22.8). Can you persuade him that the maximum likelihood answer is better?

[Figure 22.8: The standard way of drawing three slightly inconsistent bearings from buoys (x_1, y_1), (x_2, y_2), (x_3, y_3) on a chart produces a triangle called a cocked hat. Where is the sailor?]

⊲ Exercise 22.12.[3, p.310] Maximum likelihood fitting of an exponential-family model. Assume that a variable x comes from a probability distribution of the form

    P(x | w) = (1/Z(w)) exp( Σ_k w_k f_k(x) ),    (22.31)
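The maximum likelihood answers in Exercise 22.8 can be checked numerically. The sketch below is not part of the book and its variable names are my own; it assumes only the Poisson model of equation (22.30), with the background b entering through the mean λ + b as in part (b).

```python
import math

def log_likelihood(lam, r, b=0.0):
    """Log of the Poisson likelihood (22.30) with a known background b:
    ln P(r | lam) = -(lam + b) + r ln(lam + b) - ln r!"""
    mu = lam + b
    return -mu + r * math.log(mu) - math.lgamma(r + 1)

r = 9

# (a) No background: the likelihood peaks at lam = r. Writing u = ln(lam),
# the second derivative of the log-likelihood at the peak is -r, so the
# error bars on ln(lam) are 1/sqrt(r).
lam_ml = r
sigma_ln_lam = 1.0 / math.sqrt(r)        # = 1/3 for r = 9

# (b) Known background b = 13: maximise over lam >= 0 by a grid search.
# Because r - b = -4 < 0, the log-likelihood decreases for all lam >= 0,
# so the maximum sits at the boundary lam = 0 -- unlike the 'unbiased
# estimator' r - b, which gives the absurd value -4.
b = 13
grid = [0.01 * i for i in range(2001)]   # lam in [0, 20]
lam_ml_bg = max(grid, key=lambda lam: log_likelihood(lam, r, b))

print(lam_ml, round(sigma_ln_lam, 3), lam_ml_bg)   # → 9 0.333 0.0
```

The boundary maximum in part (b) is the point of the exercise: a sensible Bayesian posterior over λ ≥ 0 piles up near zero, while the sampling-theory estimator r − b is not even a valid rate.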
