

[Figure 9.5. A non-confusable subset of inputs for the noisy typewriter.]

[Figure 9.6. Extended channels obtained from a binary symmetric channel with transition probability 0.15; the three panels show the input/output strings for N = 1, N = 2, and N = 4.]

How does this translate into the terms of the theorem? The following table explains.

| The theorem | How it applies to the noisy typewriter |
|---|---|
| Associated with each discrete memoryless channel, there is a non-negative number C. | The capacity C is log₂ 9. |
| For any ε > 0 and R < C, for large enough N, | No matter what ε and R are, we set the blocklength N to 1. |
| there exists a block code of length N and rate ≥ R | The block code is {B, E, . . . , Z}. The value of K is given by 2^K = 9, so K = log₂ 9, and this code has rate log₂ 9, which is greater than the requested value of R. |
| and a decoding algorithm, | The decoding algorithm maps the received letter to the nearest letter in the code; |
| such that the maximal probability of block error is < ε. | the maximal probability of block error is zero, which is less than the given ε. |
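To make the table concrete, here is a minimal Python sketch (my own, not from the book; the circular alphabet ordering and the helper names `channel` and `decode` are assumptions) that simulates the noisy typewriter and checks that the code {B, E, . . . , Z} with nearest-letter decoding never makes a block error:

```python
import math
import random

ALPHABET = [chr(ord('A') + i) for i in range(26)] + ['-']  # 27 symbols on a circle
CODE = ['B', 'E', 'H', 'K', 'N', 'Q', 'T', 'W', 'Z']       # every third letter: 9 codewords

def channel(x):
    """One use of the noisy typewriter: the input slips to itself or to
    one of its two neighbours on the circle, each with probability 1/3."""
    i = ALPHABET.index(x)
    return ALPHABET[(i + random.choice([-1, 0, 1])) % 27]

def decode(y):
    """Map the received letter to the nearest codeword on the circle.
    Codewords are spaced three apart, so the nearest one is unique."""
    j = ALPHABET.index(y)
    dist = lambda c: min((ALPHABET.index(c) - j) % 27, (j - ALPHABET.index(c)) % 27)
    return min(CODE, key=dist)

print("rate K/N =", math.log2(len(CODE)))      # log2 9 ≈ 3.17 bits, with N = 1
errors = sum(decode(channel(x)) != x
             for x in CODE for _ in range(1000))
print("block errors in 9000 trials:", errors)  # always 0: the code is non-confusable
```

Each codeword's three possible outputs are closer to it than to any other codeword, so the block error probability is exactly zero, not merely small.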

9.7 Intuitive preview of proof

Extended channels

To prove the theorem for any given channel, we consider the extended channel corresponding to N uses of the channel. The extended channel has |A_X|^N possible inputs x and |A_Y|^N possible outputs. Extended channels obtained from a binary symmetric channel and from a Z channel are shown in figures 9.6 and 9.7, with N = 2 and N = 4.
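Because the channel is memoryless, the transition matrix of the extended channel is simply the N-fold Kronecker product of the single-use matrix. A short numpy sketch (my own illustration; the book gives no code here) for the binary symmetric channel of figure 9.6:

```python
import numpy as np

f = 0.15                        # flip probability of the binary symmetric channel
Q = np.array([[1 - f,     f],
              [    f, 1 - f]])  # Q[y, x] = P(y | x) for a single channel use

def extended(Q, N):
    """Transition matrix of the extended channel for N uses:
    an |A_Y|^N x |A_X|^N array, the N-fold Kronecker product of Q."""
    QN = np.array([[1.0]])
    for _ in range(N):
        QN = np.kron(QN, Q)
    return QN

Q4 = extended(Q, 4)
print(Q4.shape)   # (16, 16), matching the N = 4 panel of figure 9.6
print(Q4[0, 0])   # P(0000 | 0000) = 0.85**4 ≈ 0.522
```

The row and column ordering of the bit strings follows np.kron's convention; only the labelling, not the set of probabilities, differs from the ordering drawn in figure 9.6.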
