Information Theory, Inference, and Learning Algorithms

6.2: Arithmetic codes

    u := 0.0
    v := 1.0
    p := v − u
    for n = 1 to N {
        Compute the cumulative probabilities Q_n and R_n   (6.2, 6.3)
        v := u + p R_n(x_n | x_1, ..., x_{n−1})
        u := u + p Q_n(x_n | x_1, ..., x_{n−1})
        p := v − u
    }

Algorithm 6.3. Arithmetic coding. Iterative procedure to find the interval [u, v) for the string x_1 x_2 ... x_N.

Formulae describing arithmetic coding

The process depicted in figure 6.2 can be written explicitly as follows. The intervals are defined in terms of the lower and upper cumulative probabilities

\[
Q_n(a_i \mid x_1, \ldots, x_{n-1}) \equiv \sum_{i'=1}^{i-1} P(x_n = a_{i'} \mid x_1, \ldots, x_{n-1}), \tag{6.2}
\]
\[
R_n(a_i \mid x_1, \ldots, x_{n-1}) \equiv \sum_{i'=1}^{i} P(x_n = a_{i'} \mid x_1, \ldots, x_{n-1}). \tag{6.3}
\]

As the nth symbol arrives, we subdivide the (n−1)th interval at the points defined by Q_n and R_n. For example, starting with the first symbol, the intervals 'a_1', 'a_2', and 'a_I' are

\[
a_1 \leftrightarrow [Q_1(a_1), R_1(a_1)) = [0, P(x_1 = a_1)), \tag{6.4}
\]
\[
a_2 \leftrightarrow [Q_1(a_2), R_1(a_2)) = [P(x_1 = a_1), P(x_1 = a_1) + P(x_1 = a_2)), \tag{6.5}
\]
and
\[
a_I \leftrightarrow [Q_1(a_I), R_1(a_I)) = [P(x_1 = a_1) + \cdots + P(x_1 = a_{I-1}), 1.0). \tag{6.6}
\]

Algorithm 6.3 describes the general procedure.

To encode a string x_1 x_2 ... x_N, we locate the interval corresponding to x_1 x_2 ... x_N, and send a binary string whose interval lies within that interval. This encoding can be performed on the fly, as we now illustrate.

Example: compressing the tosses of a bent coin

Imagine that we watch as a bent coin is tossed some number of times (cf. example 2.7 (p.30) and section 3.2 (p.51)). The two outcomes when the coin is tossed are denoted a and b. A third possibility is that the experiment is halted, an event denoted by the 'end of file' symbol, '✷'. Because the coin is bent, we expect that the probabilities of the outcomes a and b are not equal, though beforehand we don't know which is the more probable outcome.

Encoding

Let the source string be 'bbba✷'. We pass along the string one symbol at a time and use our model to compute the probability distribution of the next symbol.
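
The interval-narrowing loop of Algorithm 6.3 and the cumulative probabilities (6.2, 6.3) are compact enough to sketch directly. The Python below is a minimal illustration, not the book's code: the names ALPHABET, cond_probs and find_interval are invented for this sketch, the end-of-file symbol '✷' is written as 'X', and a fixed toy distribution stands in for the predictive model mentioned in the text.

    ALPHABET = ['a', 'b', 'X']            # 'X' plays the role of the end-of-file symbol '✷'

    def cond_probs(context):
        # P(x_n = a_i | x_1, ..., x_{n-1}).  A fixed toy distribution chosen for
        # illustration; any model returning a distribution over ALPHABET would do.
        return {'a': 0.425, 'b': 0.425, 'X': 0.15}

    def find_interval(string):
        # Iteratively narrow the interval [u, v), as in Algorithm 6.3.
        u, v = 0.0, 1.0
        p = v - u
        for n, symbol in enumerate(string):
            probs = cond_probs(string[:n])
            # Lower and upper cumulative probabilities Q_n and R_n (6.2, 6.3)
            i = ALPHABET.index(symbol)
            Q = sum(probs[a] for a in ALPHABET[:i])
            R = Q + probs[symbol]
            v = u + p * R
            u = u + p * Q
            p = v - u
        return u, v

    u, v = find_interval('bbbaX')         # the source string 'bbba✷'
    print(u, v)                           # the interval that encodes the whole string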
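
A codeword for the string is then any binary string b_1 b_2 ... b_k whose own interval, [0.b_1 b_2 ... b_k, 0.b_1 b_2 ... b_k + 2^{-k}), lies inside [u, v). The text above leaves the choice of that binary string open; one simple greedy choice, sketched below, is to follow the binary expansion of the midpoint of [u, v) until the surrounding dyadic interval fits.

    def binary_codeword(u, v):
        # Return bits b1...bk such that [0.b1...bk, 0.b1...bk + 2^-k) lies
        # entirely inside [u, v): walk down the binary expansion of the
        # midpoint of [u, v) and stop as soon as the dyadic interval fits.
        m = (u + v) / 2.0
        bits, lo, width = '', 0.0, 1.0
        while not (u <= lo and lo + width <= v):
            width /= 2.0
            if m >= lo + width:           # midpoint lies in the upper half: emit a 1
                bits += '1'
                lo += width
            else:                         # midpoint lies in the lower half: emit a 0
                bits += '0'
        return bits

    # Reusing find_interval from the sketch above:
    print(binary_codeword(*find_interval('bbbaX')))

Because containment is guaranteed once the dyadic width falls below half the interval length, this greedy choice emits at most about ⌈log2(1/p)⌉ + 1 bits, where p = v − u is the width of the final interval.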
