
Mathematics for Computer Science


“mcs” — 2017/3/3 — 11:21 — page 879 — #887

20.6 Sums of Random Variables

Chebyshev's Theorem gives us the stronger result that

Pr[ |T − Ex[T]| ≥ c·σ_T ] ≤ 1/c².

The Chernoff Bound gives us an even stronger result, namely, that for any c ≥ 1,

Pr[ T ≥ c·Ex[T] ] ≤ e^(−(c ln(c) − c + 1)·Ex[T]).

In this case, the probability that T exceeds c·Ex[T] decreases as an exponentially small function of the deviation.

By considering the random variable n − T, we can also use the Chernoff Bound to prove that the probability that T is much lower than Ex[T] is also exponentially small.
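The upper-tail bound above can be evaluated numerically. The sketch below is illustrative only; the function name and the sample values of c and Ex[T] are my own choices, not from the text.

```python
import math

def chernoff_upper_bound(c, ex_t):
    """Chernoff bound on Pr[T >= c*Ex[T]] for a sum T of mutually
    independent random variables in [0, 1], valid for c >= 1:
    e^(-(c ln(c) - c + 1) * Ex[T])."""
    return math.exp(-(c * math.log(c) - c + 1) * ex_t)

# Hypothetical example: Ex[T] = 100, probability of reaching 1.5x the mean.
print(chernoff_upper_bound(1.5, 100.0))  # already well below 1e-4
```

Note how the exponent scales with Ex[T]: doubling the expectation squares the bound, which is what "exponentially small in the deviation" means in practice.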

20.6.8 Murphy's Law

If the expectation of a random variable is much less than 1, then Markov's Theorem implies that there is only a small probability that the variable has a value of 1 or more. On the other hand, a result that we call Murphy's Law⁸ says that if a random variable is an independent sum of 0–1-valued variables and has a large expectation, then there is a huge probability of getting a value of at least 1.

Theorem 20.6.4 (Murphy's Law). Let A_1, A_2, ..., A_n be mutually independent events. Let T_i be the indicator random variable for A_i and define

T ::= T_1 + T_2 + ··· + T_n

to be the number of events that occur. Then

Pr[T = 0] ≤ e^(−Ex[T]).

⁸ This is in reference and deference to the famous saying that "If something can go wrong, it probably will."
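The theorem can be checked directly for any concrete set of probabilities, since independence gives Pr[T = 0] = ∏(1 − p_i) exactly, while Ex[T] = Σ p_i by linearity of expectation. The probabilities below are hypothetical sample values, not from the text.

```python
import math

# Hypothetical probabilities of the independent events A_1, ..., A_5.
probs = [0.1, 0.3, 0.5, 0.2, 0.4]

# Exact probability that no event occurs: Pr[T = 0] = prod(1 - p_i).
pr_t_zero = math.prod(1 - p for p in probs)

# Murphy's Law bound: e^(-Ex[T]), where Ex[T] = sum(p_i).
bound = math.exp(-sum(probs))

print(pr_t_zero, bound)
assert pr_t_zero <= bound  # the bound holds, as the theorem guarantees
```

The inequality follows from 1 − p ≤ e^(−p) applied factor by factor, which is why the bound is never violated regardless of the probabilities chosen.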
