
if the derivatives exist and the series is convergent. (The class of functions for which this is the case in some region that contains x and a is said to be analytic over that region; see page 650. An important area of analysis is the study of analyticity.)

In applications, the series is usually truncated, and we call the series with k + 1 terms the kth order Taylor expansion.
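
For a function of a single variable, the kth order expansion about a is the sum of the k + 1 terms f^(j)(a)(x − a)^j/j!, j = 0, ..., k. The sketch below is an illustration of my own, not an example from the text, using f(x) = e^x (every derivative of which at a equals e^a); it shows the truncation error shrinking as k grows.

```python
# A minimal sketch (not from the text): kth order Taylor expansion of
# f(x) = exp(x) about a, i.e. the truncated series with k + 1 terms.
import math

def taylor_exp(x, a, k):
    # sum_{j=0}^{k} f^(j)(a) (x - a)^j / j!, and every derivative of exp at a is exp(a)
    return sum(math.exp(a) * (x - a)**j / math.factorial(j) for j in range(k + 1))

# The truncation error at a fixed x shrinks as the order k increases.
for k in (1, 2, 4, 8):
    approx = taylor_exp(1.5, a=1.0, k=k)
    print(k, approx, abs(approx - math.exp(1.5)))
```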

For a function of m variables, it is a rather complicated expression:

f(x_1, \ldots, x_m) = \sum_{j=0}^{\infty} \frac{1}{j!} \left( \sum_{k=1}^{m} (x_k - a_k) \frac{\partial}{\partial x_k} \right)^j f(x_1, \ldots, x_m) \Bigg|_{(x_1,\ldots,x_m)=(a_1,\ldots,a_m)}     (0.0.82)
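
One way to see what (0.0.82) produces is to generate its terms with a computer algebra system. The sketch below is my own illustration, not from the text, and assumes SymPy is available; it uses the fact that expanding g(t) = f(a + t(x − a)) about t = 0 and then setting t = 1 yields exactly the terms of the operator form above. The function f(x1, x2) = exp(x1 + 2x2) and the point a = (0, 0) are arbitrary choices.

```python
# A minimal sketch (not from the text): generating the terms of the
# multivariate Taylor expansion (0.0.82) with SymPy for an example function.
import sympy as sp

x1, x2, t = sp.symbols('x1 x2 t')
a1, a2 = 0, 0                        # expansion point (a1, a2), chosen arbitrarily
f = sp.exp(x1 + 2*x2)                # example function of m = 2 variables

# g(t) = f(a + t(x - a)); its series in t collects the jth order terms of (0.0.82)
g = f.subs({x1: a1 + t*(x1 - a1), x2: a2 + t*(x2 - a2)})
order = 3
expansion = sp.series(g, t, 0, order + 1).removeO().subs(t, 1)

print(sp.expand(expansion))          # the expansion of f about (0, 0) through order 3
```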

The second order Taylor expansion for a function of an m-vector is the much simpler expression

f(x) \approx f(a) + (x - a)^{\mathrm{T}} \nabla f(a) + \frac{1}{2} (x - a)^{\mathrm{T}} H_f(a) (x - a),     (0.0.83)

where ∇f(a) is the vector of first derivatives evaluated at a and Hf(a) is the matrix of second derivatives (the Hessian) evaluated at a. This is the basis for Newton's method in optimization, for example. Taylor expansions beyond the second order for vectors become rather messy (see the expression on the right side of the convergence expression (1.198) on page 95, for example).
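
A minimal sketch of that connection (my own illustration with an arbitrary objective, assuming NumPy is available, not the text's example): minimizing the quadratic model on the right side of (0.0.83) over x gives the Newton update x = a − Hf(a)^{−1}∇f(a), which is iterated below.

```python
# A minimal sketch (not from the text): Newton's method driven by the
# gradient and Hessian appearing in the second order expansion (0.0.83).
import numpy as np

def f(x):
    # arbitrary smooth objective with minimizer at (1, -2)
    return (x[0] - 1.0)**4 + (x[0] - 1.0)**2 + (x[1] + 2.0)**2

def grad_f(x):
    return np.array([4.0*(x[0] - 1.0)**3 + 2.0*(x[0] - 1.0),
                     2.0*(x[1] + 2.0)])

def hess_f(x):
    return np.array([[12.0*(x[0] - 1.0)**2 + 2.0, 0.0],
                     [0.0, 2.0]])

x = np.array([3.0, 0.0])
for _ in range(10):
    # Newton step: minimize the quadratic model of f about the current point
    x = x - np.linalg.solve(hess_f(x), grad_f(x))

print(x, f(x))   # approaches the minimizer (1, -2)
```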

0.0.9.8 Mean-Value Theorem

Two other useful facts from calculus are Rolle's theorem and the mean-value theorem, which we state here without proof. (Proofs are available in most texts on calculus.)

Theorem 0.0.18 (Rolle's theorem)
Assume the function f(x) is continuous on [a, b] and differentiable on ]a, b[. If f(a) = f(b), then there exists a point x0 with a < x0 < b such that f′(x0) = 0.

Theorem 0.0.19 (mean-value theorem)
Assume the function f(x) is continuous on [a, b] and differentiable on ]a, b[. Then there exists a point x0 with a < x0 < b such that

f(b) − f(a) = (b − a)f′(x0).
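
As a simple illustration (not an example from the text): with f(x) = x² on [a, b], we have f(b) − f(a) = b² − a² = (b − a)(b + a), and f′(x) = 2x, so the point guaranteed by the theorem is x0 = (a + b)/2, the midpoint of the interval. (Rolle's theorem is the special case in which f(a) = f(b).)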

0.0.9.9 Evaluation of Integrals

There are many techniques that are useful in the evaluation of a definite integral. Before attempting to evaluate the integral, we should establish that the integral is finite. For example, consider the integral
