Lectures on Elementary Probability


Proof: This is the series
\[
E[W] = \sum_{n=1}^{\infty} n P[W = n] = \frac{1}{p} \sum_{n=1}^{\infty} n (1-p)^{n-1} p^2 = \frac{1}{p}. \tag{4.18}
\]
This is because $n(1-p)^{n-1}p^2$ is the probability that the second success occurs on the $(n+1)$st trial. If we grant that there is eventually going to be a second success, then these probabilities should add to one.
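The identity behind (4.18) is easy to check numerically: the terms $n(1-p)^{n-1}p^2$ should add to one, and the geometric mean should come out to $1/p$. A minimal Python sketch (the function name and the choice $p = 0.3$ are ours, for illustration):

```python
def second_success_prob(n, p):
    """Probability that the second success occurs on trial n + 1:
    there are n ways to place the first success among the first n trials."""
    return n * (1.0 - p) ** (n - 1) * p ** 2

p = 0.3
N = 10_000  # truncation point; the tail is negligible for this p

# The probabilities of "second success on trial n + 1" should add to one.
total = sum(second_success_prob(n, p) for n in range(1, N + 1))

# E[W] computed directly from P[W = n] = (1 - p)^(n-1) p.
mean = sum(n * (1.0 - p) ** (n - 1) * p for n in range(1, N + 1))

print(total)  # ≈ 1.0
print(mean)   # ≈ 1/p ≈ 3.3333
```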

Theorem 4.6 The variance of the geometric random variable is
\[
\sigma_W^2 = \frac{1}{p^2} (1-p). \tag{4.19}
\]

Proof: Consider the series
\[
E[W(W+1)] = \sum_{n=1}^{\infty} n(n+1) P[W = n] = \frac{2}{p^2} \sum_{n=1}^{\infty} \frac{n(n+1)}{2} (1-p)^{n-1} p^3 = \frac{2}{p^2}. \tag{4.20}
\]
This is because $\frac{n(n+1)}{2}(1-p)^{n-1}p^3$ is the probability that the third success occurs on the $(n+2)$nd trial. If we grant that there is eventually going to be a third success, then these probabilities should add to one. Thus
\[
\sigma_W^2 = E[W^2] - E[W]^2 = E[W(W+1)] - E[W] - E[W]^2 = \frac{2}{p^2} - \frac{1}{p} - \frac{1}{p^2} = \frac{1}{p^2}(1-p). \tag{4.21}
\]
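The same style of truncated-sum check works for (4.20) and (4.21). A Python sketch (names and the choice $p = 0.25$ are ours):

```python
def third_success_prob(n, p):
    """Probability that the third success occurs on trial n + 2:
    C(n+1, 2) = n(n+1)/2 ways to place two successes in the first n + 1 trials."""
    return n * (n + 1) / 2 * (1.0 - p) ** (n - 1) * p ** 3

p = 0.25
N = 10_000  # truncation point; the tail is negligible for this p

# The probabilities of "third success on trial n + 2" should add to one.
total = sum(third_success_prob(n, p) for n in range(1, N + 1))

# E[W(W+1)] and E[W] computed from P[W = n] = (1 - p)^(n-1) p.
e_w_w1 = sum(n * (n + 1) * (1.0 - p) ** (n - 1) * p for n in range(1, N + 1))
e_w = sum(n * (1.0 - p) ** (n - 1) * p for n in range(1, N + 1))

# Variance via sigma^2 = E[W(W+1)] - E[W] - E[W]^2, as in (4.21).
variance = e_w_w1 - e_w - e_w ** 2

print(total)     # ≈ 1.0
print(e_w_w1)    # ≈ 2/p^2 = 32
print(variance)  # ≈ (1 - p)/p^2 = 12
```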

These formulas for the mean and variance of the geometric waiting time are quite important. The formula for the mean is $1/p$, which is a quite intuitive result. The formula for the variance is also important. It is better to look at the standard deviation. This is $\sqrt{1-p}/p$. If $p$ is well below one, so that one has to wait a long time for a success, then the standard deviation is almost as large as the mean. This means that one not only has to wait a long time for a success, but the variability of the length of the wait is also quite large. Maybe it will be short; maybe it will be several times the mean.
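This effect is easy to see by simulation. The sketch below (our own illustration, with $p = 0.01$ chosen so that successes are rare) draws geometric waiting times and compares the sample standard deviation to the sample mean:

```python
import random

random.seed(0)

def geometric_wait(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

p = 0.01  # success is rare, so waits are long
samples = [geometric_wait(p) for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
sd = var ** 0.5

print(mean)       # ≈ 1/p = 100
print(sd)         # ≈ sqrt(1 - p)/p ≈ 99.5, nearly as large as the mean
print(sd / mean)  # ≈ sqrt(1 - p) ≈ 0.995
```

The ratio of standard deviation to mean is $\sqrt{1-p}$, which approaches one as $p$ gets small: the spread of the wait is on the same scale as the wait itself.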

4.6 Negative binomial random variables

Consider a Bernoulli process with parameter $p > 0$. Let $T_r = W_1 + \cdots + W_r$ be the trial on which the $r$th success occurs. Then the probability of the event that $T_r = n$ is the probability of a pattern of $r - 1$ successes and $n - r$ failures on the first $n - 1$ trials, followed by a success on trial $n$. This kind of random variable is called negative binomial. This proves the following theorem.

Theorem 4.7 For a negative binomial random variable $T_r$ with parameters $r$ and $p$ the distribution is given for $n = r, r+1, r+2, \ldots$ by
\[
P[T_r = n] = P[N_{n-1} = r-1] \, P[X_n = 1] = \binom{n-1}{r-1} p^r (1-p)^{n-r}. \tag{4.22}
\]
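Formula (4.22) can be checked the same way as the geometric case: the probabilities should add to one, and since $T_r = W_1 + \cdots + W_r$ is a sum of $r$ geometric waiting times, its mean should be $r/p$. A Python sketch (the function name and the values $r = 3$, $p = 0.4$ are ours):

```python
from math import comb

def neg_binom_pmf(n, r, p):
    """P[T_r = n]: r-th success occurs on trial n, as in (4.22)."""
    return comb(n - 1, r - 1) * p ** r * (1.0 - p) ** (n - r)

r, p = 3, 0.4
N = 1_000  # truncation point; the tail is negligible for this p

total = sum(neg_binom_pmf(n, r, p) for n in range(r, N + 1))
mean = sum(n * neg_binom_pmf(n, r, p) for n in range(r, N + 1))

print(total)  # ≈ 1.0
print(mean)   # ≈ r/p = 7.5
```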
