18 Poisson Process



N_n has independent increments for any n, and so the same holds in the limit. We should note that the Heads probability does not need to be exactly λ/n; instead, it suffices that this probability converges to λ when multiplied by n. Similarly, we do not need all integer multiples of 1/n; it is enough that their number in [0, t], divided by n, converges to t in probability for any fixed t.
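As a quick numerical illustration of this limiting construction (a sketch only; the rate, horizon, discretization, and replication count below are arbitrary choices, not from the text), we can flip a coin with Heads probability λ/n at each multiple of 1/n and compare the count of Heads in [0, t] with the Poisson(λt) mean and variance:

import numpy as np

rng = np.random.default_rng(0)
lam, t, n = 2.0, 3.0, 200           # rate, time horizon, discretization (arbitrary choices)
reps = 10000                        # number of independent replications

# One coin flip with Heads probability lam/n at each time k/n, k = 1, ..., n*t.
flips = rng.random((reps, int(n * t))) < lam / n
counts = flips.sum(axis=1)          # number of Heads in [0, t] for each replication

print("empirical mean, variance:", counts.mean(), counts.var())
print("Poisson(lam * t) mean = variance =", lam * t)

For large n both the empirical mean and variance come out close to λt, as the limit suggests.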

An example of a property that follows immediately is the following. Let S_k be the time of the k'th (say, 3rd) event (which is a random time), and let N_k(t) be the number of additional events in time t after time S_k. Then N_k(t) is another Poisson process, with the same rate λ, as starting to count your Heads afresh after the k'th Heads gives you the same process as if you counted them from the beginning: we can restart a Poisson process at the time of the k'th event. In fact, we can do so at any stopping time, a random time T with the property that whether T = t depends only on the behavior of the Poisson process up to time t (i.e., on the past but not on the future). The fact that a Poisson process, restarted at a stopping time, has the same properties as the original process started at time 0 is called the strong Markov property.

As each N_k is a Poisson process, N_k(0) = 0, so two events in the original Poisson process N(t) do not happen at the same time.
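As a rough numerical check of the restart property (a sketch only, reusing the coin-flip approximation above; the parameters and the horizon are arbitrary choices), we can locate the k'th Heads in a simulated run and count the Heads in a window of length t immediately after it; the mean and variance should again be close to λt:

import numpy as np

rng = np.random.default_rng(1)
lam, t, n, k = 2.0, 1.0, 200, 3     # rate, window length, discretization, restart after k'th event
horizon = 20.0                      # long enough that the k'th event almost surely occurs before it
reps = 10000
restarted_counts = []

for _ in range(reps):
    # Coin flips at times 1/n, 2/n, ..., horizon + t, each Heads with probability lam/n.
    flips = rng.random(int(n * (horizon + t))) < lam / n
    heads = np.flatnonzero(flips)
    if len(heads) < k or heads[k - 1] >= int(n * horizon):
        continue                    # k'th event missing or too late; practically never happens
    start = heads[k - 1] + 1        # first flip strictly after the k'th Heads
    restarted_counts.append(flips[start:start + int(n * t)].sum())

restarted_counts = np.array(restarted_counts)
print("empirical mean, variance after restart:", restarted_counts.mean(), restarted_counts.var())
print("Poisson(lam * t) mean = variance =", lam * t)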

Let T_1, T_2, ... be the interarrival times, where T_n is the time elapsed between the (n − 1)'st and the n'th event. A typical example would be the times between consecutive buses arriving at a station.

Proposition 18.1. Distribution of interarrival times: T_1, T_2, ... are independent and Exponential(λ).

Proof. We have

P(T_1 > t) = P(N(t) = 0) = e^{−λt},

which proves that T_1 is Exponential(λ). Moreover, for any s > 0 and t > 0,

P(T_2 > t | T_1 = s) = P(no events in (s, s + t] | T_1 = s) = P(N(t) = 0) = e^{−λt},

as events in (s, s + t] are not influenced by what happens in [0, s]. So T_2 is independent of T_1 and Exponential(λ). Similarly we can establish that T_3 is independent of T_1 and T_2 with the same distribution, and so on.
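In the coin-flip approximation, the gap between consecutive Heads is Geometric(λ/n) flips, i.e. Geometric(λ/n)/n time units, and P(gap > t) = (1 − λ/n)^{⌊nt⌋} → e^{−λt} as n → ∞, consistent with the proposition. A small numerical check (a sketch; the parameters are arbitrary choices, not from the text):

import numpy as np

rng = np.random.default_rng(2)
lam, n = 2.0, 1000                  # rate and discretization (arbitrary choices)
reps = 100000

# Gap until the next Heads, measured in time units: Geometric(lam/n) flips divided by n.
gaps = rng.geometric(lam / n, size=reps) / n

for s in (0.5, 1.0, 2.0):
    print(f"P(T > {s}): empirical {np.mean(gaps > s):.4f}, exponential e^(-lam*s) {np.exp(-lam * s):.4f}")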

Construction by exponential interarrival times. We can use the above Proposition 18.1 for another construction of a Poisson process, which is very convenient for simulations. Let T_1, T_2, ... be i.i.d. Exponential(λ) random variables and let S_n = T_1 + ... + T_n be the waiting time for the n'th event. Then we define N(t) to be the largest n so that S_n ≤ t.
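A minimal simulation sketch of this construction (the function names and parameters below are my own, not from the text): generate i.i.d. Exponential(λ) interarrival times, accumulate them into the waiting times S_n, and read off N(t) as the number of S_n that are ≤ t.

import numpy as np

def simulate_event_times(lam, t_max, rng):
    """Event times S_1 < S_2 < ... <= t_max of a rate-lam Poisson process."""
    times = []
    s = rng.exponential(1 / lam)          # S_1 = T_1; Exponential(lam) has mean 1/lam
    while s <= t_max:
        times.append(s)
        s += rng.exponential(1 / lam)     # add the next interarrival time
    return np.array(times)

def N(t, event_times):
    """N(t) = largest n with S_n <= t, i.e. the number of events in [0, t]."""
    return int(np.searchsorted(event_times, t, side="right"))

rng = np.random.default_rng(3)
events = simulate_event_times(lam=2.0, t_max=10.0, rng=rng)
print("event times:", np.round(events, 3))
print("N(4.0) =", N(4.0, events))         # on average about lam * 4 = 8 events

Averaging N(t) over many independent runs gives approximately λt, matching EN(t) = λt.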

We know that ES_n = n/λ, but we can derive its density; the distribution is called Gamma(n, λ). We start with

P(S_n > t) = P(N(t) < n) = ∑_{j=0}^{n−1} e^{−λt} (λt)^j / j!.
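One way to finish the computation (a standard step, sketched here in LaTeX; not spelled out in this excerpt) is to differentiate the tail probability term by term and note that the sum telescopes, leaving the Gamma(n, λ) density:

\[
f_{S_n}(t) = -\frac{d}{dt}\,P(S_n > t)
  = \lambda e^{-\lambda t}
    + \sum_{j=1}^{n-1}\left(\lambda e^{-\lambda t}\frac{(\lambda t)^j}{j!}
      - \lambda e^{-\lambda t}\frac{(\lambda t)^{j-1}}{(j-1)!}\right)
  = \lambda e^{-\lambda t}\frac{(\lambda t)^{n-1}}{(n-1)!},
  \qquad t > 0.
\]

For n = 1 the sum is empty and this reduces to the Exponential(λ) density, as it should.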
