Mathematics for Computer Science


Chapter 19   Random Variables

The probability that any given man gets his own hat back is $1/n$. There are many different probability distributions of hat permutations with this property, so we don't know enough about the distribution of $G$ to calculate its expectation directly using equation (19.2) or (19.3). But linearity of expectation lets us sidestep this issue.

We'll use a standard, useful trick to apply linearity, namely, we'll express $G$ as a sum of indicator variables. In particular, let $G_i$ be an indicator for the event that the $i$th man gets his own hat. That is, $G_i = 1$ if the $i$th man gets his own hat, and $G_i = 0$ otherwise. The number of men that get their own hat is then the sum of these indicator random variables:

$$G = G_1 + G_2 + \cdots + G_n. \tag{19.9}$$

These indicator variables are not mutually independent. For example, if $n - 1$ men all get their own hats, then the last man is certain to receive his own hat. But again, we don't need to worry about this dependence, since linearity holds regardless.

Since $G_i$ is an indicator random variable, we know from Lemma 19.4.2 that
$$\operatorname{Ex}[G_i] = \Pr[G_i = 1] = 1/n. \tag{19.10}$$

By Linearity of Expectation and equation (19.9), this means that
$$\begin{aligned}
\operatorname{Ex}[G] &= \operatorname{Ex}[G_1 + G_2 + \cdots + G_n] \\
&= \operatorname{Ex}[G_1] + \operatorname{Ex}[G_2] + \cdots + \operatorname{Ex}[G_n] \\
&= \overbrace{\frac{1}{n} + \frac{1}{n} + \cdots + \frac{1}{n}}^{n} \\
&= 1.
\end{aligned}$$

So even though we don't know much about how hats are scrambled, we've figured out that on average, just one man gets his own hat back, regardless of the number of men with hats!
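To see this in practice, here is a minimal Python sketch (not from the text) that simulates handing the hats back in a uniformly random order and estimates $\operatorname{Ex}[G]$ by averaging over many trials; the function names and the trial count are choices made only for this illustration. For any number of men, the sample mean should settle near 1.

```python
import random

def hat_check_trial(n):
    """Return the number of men who get their own hat back
    when n hats are handed back in a uniformly random order."""
    hats = list(range(n))
    random.shuffle(hats)                                   # random permutation of the hats
    return sum(1 for i, h in enumerate(hats) if i == h)    # count fixed points

def estimate_expected_matches(n, trials=100_000):
    """Monte Carlo estimate of Ex[G] for n men (illustrative parameters)."""
    return sum(hat_check_trial(n) for _ in range(trials)) / trials

if __name__ == "__main__":
    for n in (2, 5, 50):
        print(n, estimate_expected_matches(n))   # each estimate should be close to 1
```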

More generally, Linearity of Expectation provides a very good method for computing the expected number of events that will happen.

Theorem 19.5.4. Given any collection of events $A_1, A_2, \ldots, A_n$, the expected number of events that will occur is
$$\sum_{i=1}^{n} \Pr[A_i].$$

For example, $A_i$ could be the event that the $i$th man gets the right hat back. But in general, it could be any subset of the sample space, and we are asking for the expected number of events that will contain a random sample point.
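As a quick sanity check on Theorem 19.5.4 (not part of the text), the sketch below builds a small uniform sample space, defines a few overlapping and hence dependent events chosen only for illustration, and verifies that the sum of the $\Pr[A_i]$ equals the directly computed expected number of events containing a random sample point.

```python
from fractions import Fraction

# Uniform sample space: the outcomes of rolling one fair six-sided die.
sample_space = [1, 2, 3, 4, 5, 6]

# A few arbitrary, overlapping (hence dependent) events, chosen for illustration.
events = [
    {2, 4, 6},     # A1: roll is even
    {4, 5, 6},     # A2: roll is at least 4
    {6},           # A3: roll is a six
]

p = Fraction(1, len(sample_space))  # probability of each outcome

# Sum of the individual event probabilities (the quantity in Theorem 19.5.4).
sum_of_probs = sum(p * len(a) for a in events)

# Expected number of events containing a random sample point, computed directly.
expected_count = sum(p * sum(1 for a in events if w in a) for w in sample_space)

print(sum_of_probs, expected_count)   # both print 7/6
```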
