Alignment and Atlases

Mutual Information

• Entropy of the 2D histogram: H({r_ij}) = -Σ_ij r_ij log2(r_ij)
  – Number of bits needed to encode value pairs (i, j)
• Mutual information between two distributions with marginal (1D) histograms {p_i} and {q_j}:
  MI = H({p_i}) + H({q_j}) - H({r_ij})
  – Number of bits required to encode the 2 values separately, minus the number of bits required to encode them together (as a pair)
  – If the 2D histogram is independent (r_ij = p_i · q_j), then MI = 0: no gain from joint encoding
• 3dAllineate minimizes E[J, I] = -MI(J, I) with -cost mi (see the sketch below)
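As a rough illustration of the formulas above (not AFNI's internal implementation), the following sketch estimates MI from a normalized joint 2D histogram using NumPy. The function name, bin count, and test data are assumptions chosen for the example.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Estimate mutual information (in bits) between two intensity
    arrays from their joint 2D histogram (illustrative sketch)."""
    # Joint (2D) histogram, normalized to a probability distribution r_ij
    r, _, _ = np.histogram2d(x, y, bins=bins)
    r = r / r.sum()

    # Marginal (1D) histograms p_i and q_j
    p = r.sum(axis=1)
    q = r.sum(axis=0)

    # Entropy H = -sum(v * log2(v)), skipping empty bins
    def entropy(v):
        v = v[v > 0]
        return -np.sum(v * np.log2(v))

    # MI = H({p_i}) + H({q_j}) - H({r_ij})
    return entropy(p) + entropy(q) - entropy(r)

# Independent data -> MI near 0 (no gain from joint encoding);
# identical data -> MI equals the marginal entropy
rng = np.random.default_rng(0)
a = rng.normal(size=100_000)
b = rng.normal(size=100_000)
print(mutual_information(a, b))   # close to 0
print(mutual_information(a, a))   # equals H({p_i}), up to binning effects
```

3dAllineate negates this quantity so that maximizing MI becomes minimizing the cost E[J, I] = -MI(J, I), selected with -cost mi.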
