hamming
CHAPTER 13

…encoding. Toss a penny for each bit of the n bits of a message in the code book, and repeat for all M messages. There are nM tosses, hence 2^{nM} possible code books, each book having the same probability 1/2^{nM}. Of course the random process of making the code book means there is a chance there will be duplicates, and there may be code points which are close to each other and hence will be a source of probable errors. What we have to prove is that this does not occur with a probability above any small positive level of error you care to pick, provided n is made large enough.

The decisive step is that Shannon averaged over all possible code books to find the average error! We will use the symbol Av[·] to mean the average over the set of all possible random code books. Averaging over the constant d of course gives the constant, and since for the average each term is the same as any other term in the sum, we have

    Av[P_E] ≤ d + (M − 1) Av[P{a′ ∈ S}],

which can be increased (M − 1 goes to M) to

    Av[P_E] ≤ d + M Av[P{a′ ∈ S}].

For any particular message, when we average over all code books, the encoding runs through all possible values, hence the average probability that a point is in the sphere is the ratio of the volume of the sphere to the total volume of the space. The volume of the sphere is

    Σ_{k=0}^{ns} C(n, k),   where s = Q + e₂
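The penny-tossing construction of a random code book can be sketched in a few lines of Python. This is a minimal illustration, not anything from the text; the function name and parameters are my own.

```python
import random

def random_code_book(M, n, rng):
    """Toss a fair penny for each of the n bits of each of the M
    messages; the result is one code book drawn uniformly from the
    2**(n*M) equally likely possibilities."""
    return [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(M)]

rng = random.Random(0)
M, n = 8, 10
book = random_code_book(M, n, rng)

# Duplicates are possible: the set of distinct code words may be
# smaller than the list of M words, exactly as the text warns.
print(len(book), len(set(book)))
```

Nothing prevents two code words from colliding or lying close together; the argument in the text is precisely that, averaged over all such books, this happens rarely enough not to matter.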
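The claim that the average probability of a random code point falling inside a decoding sphere equals (volume of sphere)/(volume of space) can be checked numerically. The sketch below, with illustrative values of n and s chosen by me, computes the exact ratio for a Hamming sphere of radius ns and compares it against a Monte Carlo estimate from random points.

```python
import math
import random

def sphere_volume(n, radius):
    """Number of points of the binary n-cube within Hamming
    distance `radius` of any fixed center."""
    return sum(math.comb(n, k) for k in range(radius + 1))

# Illustrative values: s plays the role of Q + e2 in the text.
n = 20
s = 0.3
radius = int(n * s)

# Exact average probability: sphere volume over total volume 2**n.
ratio = sphere_volume(n, radius) / 2**n

# Monte Carlo check: a uniformly random point lands in the sphere
# with exactly this probability, regardless of the center.
rng = random.Random(1)
center = tuple(rng.randint(0, 1) for _ in range(n))
trials = 20000
hits = 0
for _ in range(trials):
    point = tuple(rng.randint(0, 1) for _ in range(n))
    if sum(a != b for a, b in zip(center, point)) <= radius:
        hits += 1

print(ratio)
print(hits / trials)
```

The empirical fraction converges to the exact ratio as the number of trials grows, which is the geometric fact the averaging argument rests on: by symmetry of the uniform distribution, the center of the sphere is irrelevant.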
