Stat 5101 Lecture Notes - School of Statistics
4.4. THE POISSON PROCESS

... measure in one dimension is length. Then the c. d. f. of X is given by

\begin{align*}
F(x) &= P(X \le x) \\
&= P(\text{there is at least one arrival in } (a, a + x)) \\
&= P(Y \ge 1) \\
&= 1 - P(Y = 0) \\
&= 1 - e^{-\lambda x}
\end{align*}

Differentiating gives the density (4.10).

The assertion about the conditional and unconditional distributions being the same is just the fact that the process on (−∞, a] is independent of the process on (a, +∞). Hence the waiting time distribution is the same whether or not we condition on the point pattern in (−∞, a].

The length of time between two consecutive arrivals is called the interarrival time. Theorem 4.7 also gives the distribution of the interarrival times, because it says the distribution is the same whether or not we condition on there being an arrival at the time we start waiting. Finally, the theorem says an interarrival time is independent of any past interarrival times. Since independence is a symmetric property (X is independent of Y if and only if Y is independent of X), this means all interarrival times are independent.

This means we can think of a one-dimensional Poisson process in two different ways (a simulation sketch of this equivalence appears at the end of the section).

• The numbers of arrivals in disjoint intervals are independent Poisson random variables. The number of arrivals in an interval of length t is Poi(λt).

• Starting at an arbitrary point (say time zero), the waiting time to the first arrival is Exp(λ). Then all the successive interarrival times are also Exp(λ). And all the interarrival times are independent of each other and of the waiting time to the first arrival.

Thus if X_1, X_2, ... are i. i. d. Exp(λ) random variables, the times T_1, T_2, ... defined by

\begin{equation}
T_n = \sum_{i=1}^{n} X_i \tag{4.11}
\end{equation}

form a Poisson process on (0, ∞).

Note that by the addition rule for the gamma distribution, the time of the nth arrival is the sum of n i. i. d. Gam(1, λ) random variables and hence has a Gam(n, λ) distribution.

These two ways of thinking give us a c. d. f. for the Gam(n, λ) distribution.
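Concretely, the event T_n ≤ t is the same as the event that there are at least n arrivals in the interval (0, t], and by the first description the number of arrivals in (0, t] is Poi(λt). Hence, writing Y ~ Poi(λt),

\begin{align*}
P(T_n \le t) = P(Y \ge n) = 1 - \sum_{k=0}^{n-1} \frac{(\lambda t)^k}{k!} e^{-\lambda t}.
\end{align*}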
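As a quick numerical sanity check of this identity (a minimal sketch, not part of the notes; the values of n, λ, and t below are arbitrary illustrative choices, and scipy is assumed to be available; scipy parametrizes the gamma distribution by shape a and scale = 1/rate):

```python
from scipy import stats

n, lam, t = 5, 2.0, 1.7  # shape, rate, time point -- arbitrary illustrative values

# P(T_n <= t) for T_n ~ Gam(n, lam); scipy's scale is the reciprocal of the rate.
gamma_cdf = stats.gamma.cdf(t, a=n, scale=1.0 / lam)

# P(Y >= n) for Y ~ Poi(lam * t), i.e. at least n arrivals in (0, t].
poisson_tail = 1.0 - stats.poisson.cdf(n - 1, lam * t)

print(gamma_cdf, poisson_tail)  # agree up to floating-point rounding
```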
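Finally, the simulation sketch of the two-descriptions equivalence promised above (again not part of the notes; the rate lam, horizon t_max, and replication count are arbitrary choices, and numpy is assumed): build the process from i. i. d. Exp(λ) interarrival times as in the second description, then check that the resulting count of arrivals in (0, t_max] has the mean and variance of a Poi(λ t_max) variable, as the first description requires.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
lam, t_max, n_rep = 2.0, 10.0, 20_000   # rate, horizon, replications (arbitrary)

# Second description: T_n = X_1 + ... + X_n with X_i i.i.d. Exp(lam).
# Draw comfortably more interarrivals than lam * t_max so the partial
# sums almost surely pass t_max.
n_draw = int(3 * lam * t_max) + 50
X = rng.exponential(scale=1.0 / lam, size=(n_rep, n_draw))
T = np.cumsum(X, axis=1)                # arrival times T_1, T_2, ...

# First description: the count of arrivals in (0, t_max] should be Poi(lam * t_max).
counts = (T <= t_max).sum(axis=1)
print(counts.mean(), counts.var())      # both should be close to lam * t_max = 20
```

With λ = 2 and t_max = 10, both the sample mean and the sample variance of the counts should come out near 20, matching the Poi(λt) description.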
