Cryptography - Sage


CHAPTER FOUR

Information Theory

Information theory concerns the measure of information contained in data. The security of a cryptosystem is determined by the relative information content of the key, plaintext, and ciphertext. For our purposes a discrete probability space – a finite set X together with a probability function on X – will model a language. Such probability spaces may represent the space of keys, of plaintexts, or of ciphertexts; we may refer to such a space as a space or a language, and to an element x as a message. The probability function P : X → R is a non-negative real-valued function on X such that

\sum_{x \in X} P(x) = 1.

For a naturally occurring language, we reduce to a finite model of it by considering finite sets consisting of strings of length N in that language. If X models the English language, then the function P assigns to each string its probability of appearance among all strings of length N in the English language.

4.1 Entropy

The entropy of a given space with probability function P is a measure of the information content of the language. The formal definition of entropy is

H(X) = \sum_{x \in X} P(x) \log_2(P(x)^{-1}).

For 0 < P(x) < 1, the value \log_2(P(x)^{-1}) is a positive real number, and we define P(x) \log_2(P(x)^{-1}) = 0 when P(x) = 0. The following exercise justifies this convention.

Exercise. Show that

\lim_{x \to 0^+} x \log_2(x^{-1}) = 0.
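The entropy definition above translates directly into code. The following is a minimal sketch in plain Python (the document works in Sage, which is Python-based); the function name `entropy` and the example distributions are illustrative, not from the text. Terms with P(x) = 0 are skipped, implementing the convention P(x) log_2(P(x)^{-1}) = 0.

```python
from math import log2

def entropy(P):
    """Shannon entropy H(X) = sum over x of P(x) * log2(1/P(x)).

    P is a dict mapping each message x to its probability P(x).
    Terms with P(x) = 0 are omitted, per the convention in the text.
    """
    # A probability function must sum to 1 over the space X.
    assert abs(sum(P.values()) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * log2(1.0 / p) for p in P.values() if p > 0)

# Uniform distribution on 8 messages: every string is equally likely,
# so the entropy is log2(8) = 3 bits.
uniform = {x: 1 / 8 for x in range(8)}
print(entropy(uniform))  # 3.0

# A biased two-message space carries less than 1 bit of information.
coin = {"H": 0.9, "T": 0.1}
print(entropy(coin))  # about 0.469
```

Tabulating x * log2(1/x) for x = 10^-1, 10^-2, 10^-3, ... also illustrates the exercise numerically: the values shrink toward 0, which is why the zero-probability convention is the natural one.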
