
the byte is even, he reduces the number of guesses necessary to 2^7 (128), in which case the byte has only 7 bits of entropy.

We can have fractional bits of entropy. If we have one bit, and it has a 25% chance of being a 0 and a 75% chance of being a 1, the attacker can do 50% better at guessing it than if the bit were fully entropic. Therefore, there is half the amount of entropy in that bit.
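The "half" figure above is an informal guessing-advantage estimate; the standard formal measures come out slightly differently. A short sketch (our own illustration, not from the recipe) computing the Shannon entropy of a biased bit:

```python
import math

def shannon_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a binary source that emits 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a fully predictable bit carries no entropy
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(shannon_entropy(0.5))   # a fair bit: 1.0 bit of entropy
print(shannon_entropy(0.75))  # the 25%/75% bit: ~0.811 bits
```

By this measure the biased bit carries about 0.81 bits; the more pessimistic min-entropy measure, -log2(max p) = -log2(0.75), gives about 0.415 bits, closest to the text's "half."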

In public key cryptography, n-bit keys contain far fewer than n bits of entropy. That is because there are not 2^n possible keys. For example, in RSA, we are more or less limited by the number of primes that are n bits in size.
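A rough back-of-the-envelope calculation (our own illustration, based on the prime number theorem) shows how much entropy a uniformly random n-bit prime actually carries:

```python
import math

def prime_entropy_bits(n: int) -> float:
    """Approximate entropy, in bits, of picking a uniformly random prime below 2**n.

    By the prime number theorem there are roughly 2**n / (n * ln 2) primes
    below 2**n, so the entropy is about log2 of that count, i.e.
    n - log2(n * ln 2).
    """
    return n - math.log2(n * math.log(2))

print(prime_entropy_bits(1024))  # ~1014.5 bits, noticeably less than 1024
```

The gap grows only logarithmically, but it illustrates the point: an "n-bit" public key never has a full 2^n keyspace behind it.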

Random numbers with lots of entropy are difficult to come by, especially on a deterministic computer. Therefore, it is generally far more practical to gather enough entropy to securely seed a cryptographic pseudo-random number generator. Several issues arise in doing so.

First, how much entropy do you need to seed a cryptographic generator securely? The short answer is that you should try to give as much entropy as the random number generator can accept. The entropy you get sets the maximum security level of your data protected with that entropy, directly or indirectly. For example, suppose you use 256-bit AES keys, but choose your key with a PRNG seeded with 56 bits of entropy. Any data encrypted with the 256-bit AES key would then be no more secure than it would have been had the data been encrypted with a 56-bit DES key.
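A minimal sketch of why the seed bounds the security (a hypothetical illustration; the key derivation shown is our own, not the recipe's): an attacker never has to search the 2^256 keyspace, only the 2^56 possible seeds.

```python
import hashlib
import secrets

# A 256-bit key derived from a seed with only 56 bits of entropy.
# An attacker can enumerate all 2**56 seeds and re-derive each
# candidate key, so the key's real strength is 56 bits, not 256.
seed = secrets.randbits(56).to_bytes(7, "big")  # 56 bits of entropy
key = hashlib.sha256(seed).digest()             # 256 bits long, 56 bits strong

def effective_security_bits(key_bits: int, seed_entropy_bits: int) -> int:
    """The weaker of key length and seed entropy bounds the security level."""
    return min(key_bits, seed_entropy_bits)

print(effective_security_bits(256, 56))  # 56
```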

Then again, it’s incredibly hard to figure out how much entropy a piece of data contains, and often, estimates that people believe to be conservative are actually large overestimates. For example, the digits of π appear to be a completely random sequence that should pass any statistical test for randomness with flying colors. Yet they are also completely predictable.

We recommend that if you have done a lot of work to figure out how much entropy is in a piece of data and you honestly think you have 160 bits there, you still might want to divide your estimate by a factor of 4 to 8 to be conservative.

Because entropy is so easy to overestimate, you should generally cryptographically postprocess any entropy collected (a process known as whitening) before using it. We discuss whitening in Recipe 11.16.
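Recipe 11.16 covers whitening properly; as a minimal stand-in, the idea can be sketched by hashing all collected samples together, assuming SHA-256 is an adequate mixing function:

```python
import hashlib

def whiten(samples: list) -> bytes:
    """Minimal whitening sketch: hash the raw samples together so that any
    bias or structure in the inputs is dispersed across the output.
    (This is an illustrative stand-in, not the book's Recipe 11.16.)"""
    h = hashlib.sha256()
    for sample in samples:  # each sample is a bytes object of raw entropy
        h.update(sample)
    return h.digest()  # 32 bytes, suitable as a fixed-size PRNG seed

seed = whiten([b"timing jitter", b"interrupt counts", b"disk seek deltas"])
print(len(seed))  # 32
```

Note that hashing cannot create entropy: if the inputs carry only 56 bits, the output still carries only 56 bits, just more evenly spread.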

Second, most cryptographic pseudo-random number generators take a fixed-size seed, and you want to maximize the entropy in that seed. However, when collecting entropy, it is usually distributed sparsely through a large amount of data. We discuss methods for turning data with entropy into a seed in Recipe 11.16. If you have an entropy source that is supposed to produce good random numbers (such as a hardware generator), you should test the data as discussed in Recipe 11.18.
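Recipe 11.18 describes the real test battery; to give the flavor, here is a toy version of just one statistic, the FIPS 140-1 monobit test (our own sketch, not the recipe's code):

```python
def monobit_ok(data: bytes) -> bool:
    """Toy FIPS 140-1 monobit test: in a 20,000-bit sample from a good
    generator, the count of one-bits should fall in (9654, 10346).
    A real test suite checks several statistics, not just this one."""
    assert len(data) == 2500  # 20,000 bits
    ones = sum(bin(byte).count("1") for byte in data)
    return 9654 < ones < 10346

print(monobit_ok(b"\x55" * 2500))  # True: exactly 10,000 ones
print(monobit_ok(bytes(2500)))     # False: all zeros is clearly broken
```

Passing such a test is necessary but not sufficient: as the π example above shows, a perfectly predictable sequence can still look statistically random.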

Determining What Kind of Random Numbers to Use


Copyright © 2007 O’Reilly & Associates, Inc. All rights reserved.
