
2.11 The Family of Normal Distributions

theorem. This form of the theorem and various preliminary forms leading up to it are proved beginning on page 426.

From the family of central chi-squared distributions together with an independent normal family, we get the family of t distributions (central or noncentral, depending on the mean of the normal). From the family of chi-squared distributions (central or noncentral) we get the family of F distributions (central, or singly or doubly noncentral; see Example 1.15 on page 59 for the central distributions).

The expectations of reciprocals of normal random variables have interesting properties. First of all, we see that for X ∼ N(0, 1), E(1/X) does not exist. Now, for X ∼ N_d(0, I) consider

    E(1/‖X‖₂²).    (2.40)

For d ≤ 2, this expectation is infinite (exercise). For d ≥ 3, however, this expectation is finite (exercise).
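As a quick numerical sanity check on the finite case, the following sketch (the function name mc_inverse_sq_norm is ours) estimates E(1/‖X‖₂²) by Monte Carlo. Since ‖X‖₂² has a central chi-squared distribution with d degrees of freedom, the exact value for d ≥ 3 is 1/(d − 2).

```python
import random

def mc_inverse_sq_norm(d, n=100_000, seed=12345):
    """Monte Carlo estimate of E(1/||X||_2^2) for X ~ N_d(0, I)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # ||X||_2^2 is a sum of d squared standard normals (chi-squared(d))
        s = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(d))
        total += 1.0 / s
    return total / n

# For d >= 3, the estimate should be close to 1/(d - 2).
est = mc_inverse_sq_norm(5)
```

For d ≤ 2 no such comparison is possible: the estimates do not stabilize as n grows, consistent with the expectation being infinite.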

2.11.3 Characterizations of the Normal Family of Distributions

A simple characterization of a normal distribution was proved by Cramér in 1936:

Theorem 2.4
Let X1 and X2 be independent random variables. Then X1 and X2 have normal distributions if and only if their sum X1 + X2 has a normal distribution.

Proof. ***fix
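The "only if" direction is a standard characteristic-function computation; a sketch, assuming X_j ∼ N(μ_j, σ_j²) independent:

```latex
\varphi_{X_1+X_2}(t) = \varphi_{X_1}(t)\,\varphi_{X_2}(t)
  = e^{i\mu_1 t - \sigma_1^2 t^2/2}\, e^{i\mu_2 t - \sigma_2^2 t^2/2}
  = e^{i(\mu_1+\mu_2)t - (\sigma_1^2+\sigma_2^2)t^2/2},
```

which is the characteristic function of N(μ₁ + μ₂, σ₁² + σ₂²). The converse, that normality of the sum forces normality of each summand, is the deep part of the theorem.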

The independence of certain functions of random variables implies that those random variables have a normal distribution; that is, the independence of certain functions of random variables characterizes the normal distribution.

Theorem 2.5 (Bernstein's theorem)
Let X1 and X2 be iid random variables with nondegenerate distributions, and let Y1 = X1 + X2 and Y2 = X1 − X2. If Y1 and Y2 are also independent then X1 and X2 have normal distributions.

Proof. ***fix
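The converse direction of Bernstein's theorem is easy to illustrate numerically: if X1 and X2 are iid N(0, 1), then Cov(Y1, Y2) = Var(X1) − Var(X2) = 0, and since (Y1, Y2) is jointly normal, uncorrelatedness implies independence. The sketch below (the helper name sum_diff_correlation is ours) checks that the sample correlation of Y1 and Y2 is near zero.

```python
import random

def sum_diff_correlation(n=100_000, seed=42):
    """Sample correlation of Y1 = X1 + X2 and Y2 = X1 - X2 for iid N(0,1) draws."""
    rng = random.Random(seed)
    y1, y2 = [], []
    for _ in range(n):
        x1, x2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        y1.append(x1 + x2)
        y2.append(x1 - x2)
    m1, m2 = sum(y1) / n, sum(y2) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(y1, y2)) / n
    v1 = sum((a - m1) ** 2 for a in y1) / n
    v2 = sum((b - m2) ** 2 for b in y2) / n
    return cov / (v1 * v2) ** 0.5

r = sum_diff_correlation()
```

Note that with iid non-normal X_i of equal variance (e.g., exponential), Y1 and Y2 are still uncorrelated but no longer independent; it is the full independence of Y1 and Y2 that the theorem exploits.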

An extension of Bernstein's theorem is the Darmois theorem, also called the Darmois-Skitovich theorem.

Theorem 2.6 (Darmois theorem)
Let X1, . . ., Xn be iid random variables with nondegenerate distributions, and let

    Y1 = ∑_{i=1}^n b_i X_i

and

Theory of Statistics ©2000–2013 James E. Gentle
