
15.4 General Linear Least Squares 673

a vector of length M.

The equations (15.4.6) or (15.4.7) are called the normal equations of the least-squares problem. They can be solved for the vector of parameters a by the standard methods of Chapter 2, notably LU decomposition and backsubstitution, Cholesky decomposition, or Gauss-Jordan elimination. In matrix form, the normal equations can be written as either

$$ [\alpha] \cdot \mathbf{a} = [\beta] \qquad \text{or as} \qquad \left(\mathbf{A}^T \cdot \mathbf{A}\right) \cdot \mathbf{a} = \mathbf{A}^T \cdot \mathbf{b} \tag{15.4.10} $$

The inverse matrix $C_{jk} \equiv [\alpha]^{-1}_{jk}$ is closely related to the probable (or, more precisely, standard) uncertainties of the estimated parameters $\mathbf{a}$. To estimate these uncertainties, consider that

$$ a_j = \sum_{k=1}^{M} [\alpha]^{-1}_{jk}\,\beta_k = \sum_{k=1}^{M} C_{jk}\left[\,\sum_{i=1}^{N} \frac{y_i\,X_k(x_i)}{\sigma_i^2}\right] \tag{15.4.11} $$

and that the variance associated with the estimate $a_j$ can be found as in (15.2.7) from

$$ \sigma^2(a_j) = \sum_{i=1}^{N} \sigma_i^2 \left(\frac{\partial a_j}{\partial y_i}\right)^2 \tag{15.4.12} $$

Note that $\alpha_{jk}$ is independent of $y_i$, so that

$$ \frac{\partial a_j}{\partial y_i} = \sum_{k=1}^{M} C_{jk}\,X_k(x_i)/\sigma_i^2 \tag{15.4.13} $$

Consequently, we find that

$$ \sigma^2(a_j) = \sum_{k=1}^{M}\sum_{l=1}^{M} C_{jk} C_{jl}\left[\,\sum_{i=1}^{N} \frac{X_k(x_i)\,X_l(x_i)}{\sigma_i^2}\right] \tag{15.4.14} $$

The final term in brackets is just the matrix $[\alpha]$. Since this is the matrix inverse of $[C]$, (15.4.14) reduces immediately to

$$ \sigma^2(a_j) = C_{jj} \tag{15.4.15} $$

In other words, the diagonal elements of $[C]$ are the variances (squared uncertainties) of the fitted parameters $\mathbf{a}$. It should not surprise you to learn that the off-diagonal elements $C_{jk}$ are the covariances between $a_j$ and $a_k$ (cf. 15.2.10); but we shall defer discussion of these to §15.6.

We will now give a routine that implements the above formulas for the general linear least-squares problem, by the method of normal equations. Since we wish to compute not only the solution vector $\mathbf{a}$ but also the covariance matrix $[C]$, it is most convenient to use Gauss-Jordan elimination (routine gaussj of §2.1) to perform the linear algebra. The operation count, in this application, is no larger than that for LU decomposition. If you have no need for the covariance matrix, however, you can save a factor of 3 on the linear algebra by switching to LU decomposition, without

Sample page from NUMERICAL RECIPES IN C: THE ART OF SCIENTIFIC COMPUTING (ISBN 0-521-43108-5). Copyright (C) 1988-1992 by Cambridge University Press. Programs Copyright (C) 1988-1992 by Numerical Recipes Software. Permission is granted for internet users to make one paper copy for their own personal use. Further reproduction, or any copying of machine-readable files (including this one) to any server computer, is strictly prohibited. To order Numerical Recipes books or CDROMs, visit website http://www.nr.com or call 1-800-872-7423 (North America only), or send email to directcustserv@cambridge.org (outside North America).
