Theory of Statistics - George Mason University

1.1 Some Important Probability Facts

Proof. (Theorem 1.17)

We are given the finite scalar moments ν_0, ν_1, ... (about any origin) of some probability distribution, and the condition that
\[
\sum_{j=0}^{\infty} \frac{\nu_j t^j}{j!}
\]
converges for some real nonzero t. We want to show that the moments uniquely determine the distribution. We will do this by showing that the moments uniquely determine the CF.

Because the moments exist, the characteristic function ϕ(t) is continuous and its derivatives exist at t = 0. For t in a neighborhood of 0, we have
\[
\varphi(t) = \sum_{j=0}^{r} \frac{(it)^j \mu_j}{j!} + R_r, \tag{1.100}
\]
where |R_r| < ν_{r+1}|t|^{r+1}/(r+1)!.

Now because ∑ ν_j t^j/j! converges, the terms ν_j t^j/j! go to 0, and hence the right-hand side of equation (1.100) is the infinite series
\[
\sum_{j=0}^{\infty} \frac{(it)^j \mu_j}{j!}
\]
if it converges. This series does indeed converge because it is dominated termwise by ν_j t^j/j!, which converges. Thus, ϕ(t) is uniquely determined within a neighborhood of t = 0. This is not sufficient, however, to say that ϕ(t) is uniquely determined a.e.

We must extend the region of convergence to IR. We do this by analytic continuation. Let t_0 be arbitrary, and consider a neighborhood of t = t_0. Analogous to equation (1.100), we have
\[
\varphi(t) = \sum_{j=0}^{r} \frac{i^j (t-t_0)^j}{j!} \int_{-\infty}^{\infty} x^j e^{it_0 x}\, dF + R_r.
\]
Now, the modulus of the coefficient of (t − t_0)^j/j! is less than or equal to ν_j; hence, in a neighborhood of t_0, ϕ(t) can be represented as a convergent Taylor series. But t_0 was arbitrary, and so ϕ(t) can be extended through any finite interval by taking it as the convergent series. Hence, ϕ(t) is uniquely defined over IR in terms of the moments; and therefore the distribution function is uniquely determined.
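As a concrete numerical check on this argument (my illustration, not part of the text), the moments of a known distribution can be fed into the truncated series of equation (1.100) and compared with the known CF. For the standard normal, the raw moments are μ_j = 0 for odd j and μ_j = j!/(2^{j/2}(j/2)!) for even j, and the CF is e^{−t²/2}:

```python
import math
import cmath

def normal_moment(j):
    # Raw moments of the standard normal: 0 for odd j,
    # (j-1)!! = j! / (2^(j/2) (j/2)!) for even j.
    if j % 2 == 1:
        return 0.0
    k = j // 2
    return math.factorial(j) / (2**k * math.factorial(k))

def cf_from_moments(t, r=40):
    # Truncated series of equation (1.100), without the remainder R_r:
    # sum_{j=0}^{r} (it)^j mu_j / j!
    return sum((1j * t)**j * normal_moment(j) / math.factorial(j)
               for j in range(r + 1))

t = 1.3
approx = cf_from_moments(t)
exact = cmath.exp(-t**2 / 2)   # known CF of N(0,1)
print(abs(approx - exact))
```

With r = 40 the truncation error is negligible for moderate t, consistent with the remainder bound |R_r| < ν_{r+1}|t|^{r+1}/(r+1)! going to 0.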

Characteristic Functions for Functions of Random Variables; Joint and Marginal Distributions

If X ∈ IR^d is a random variable and, for a Borel function g, g(X) ∈ IR^m, the characteristic function of g(X) is
\[
\varphi_{g(X)}(t) = \mathrm{E}_X\left( e^{i t^{\mathrm{T}} g(X)} \right), \quad t \in \mathrm{IR}^m. \tag{1.101}
\]
Theory of Statistics ©2000–2013 James E. Gentle
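Equation (1.101) is just an expectation with respect to the distribution of X, so it can be estimated by Monte Carlo. A minimal sketch (my illustration; the choice g(x) = x² with X ~ N(0,1), so that g(X) is chi-squared with 1 df, is an assumption for the example) compares the sample average of e^{itg(X)} with the known chi-squared CF (1 − 2it)^{−1/2}:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)   # sample of X ~ N(0,1)

t = 0.4
# Empirical version of equation (1.101) with g(x) = x^2:
# estimate E_X[ exp(i t g(X)) ] by a sample mean.
emp = np.exp(1j * t * x**2).mean()

# Known CF of the chi-squared distribution with 1 degree of freedom.
exact = (1 - 2j * t) ** -0.5
print(abs(emp - exact))
```

The Monte Carlo error is of order 1/sqrt(n), so with n = 200,000 the two values agree to a few decimal places.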
