
Bernal S D_2010.pdf - University of Plymouth


4.4. FEEDFORWARD PROCESSING

[Figure 4.13: plot of the K-L divergence against M_max/N, with a "true vs. random distributions" baseline]

Figure 4.13: Kullback-Leibler divergence between the true and the approximate likelihood distribution for different values of M_max, averaged over 500 trials. The Kullback-Leibler (K-L) divergence, on the y-axis, measures the discrepancy between an approximate distribution and the true distribution. It cannot be considered a distance measure, as it is not symmetric, but it has been used extensively to measure the goodness of fit between two discrete probability distributions (Friston and Kiebel 2009, Winn and Bishop 2005, Hinton et al. 2006). The x-axis shows the coefficient M_max/N, i.e. the percentage of λ messages, out of the total, that are used in the approximation. Three different values of N are shown: 20 (blue lines), 50 (red lines) and 100 (green lines). The dotted horizontal line shows the K-L divergence between the true and a random distribution, which serves as a baseline against which to compare the goodness of fit of the approximate distribution.
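The K-L comparison described in the caption can be sketched in a few lines. The distributions below are hypothetical stand-ins (the thesis uses likelihood distributions from its network); the sketch only illustrates the definition D(p ∥ q) = Σ p log(p/q) and the asymmetry the caption mentions:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in nats, between two
    discrete probability distributions given as arrays of probabilities."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over entries where p > 0 (convention: 0 * log(0/q) = 0).
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical "true" and "approximate" distributions over 3 states.
true_dist = np.array([0.5, 0.3, 0.2])
approx    = np.array([0.4, 0.4, 0.2])

# Asymmetry: D(p || q) != D(q || p), so K-L is not a true distance.
print(kl_divergence(true_dist, approx))
print(kl_divergence(approx, true_dist))
```

A lower value indicates a better fit; comparing against the divergence from a uniform random distribution gives the kind of baseline shown as the dotted line in the figure.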

