
If used properly, Newton’s Method offers the fastest rate of convergence available. However, this rapid rate of convergence is achieved at the expense of the stability of the algorithm. Some of the shortcomings of the method are:

• A good initial starting point is required for convergence. A poor estimate will result in divergence.

• If Newton’s Method does converge, it converges to the nearest stationary point. No guarantee can be made that this is the desired stationary point. It could be a local minimum, a maximum, or a saddle point.

• The Hessian matrix is subject to numerical instabilities during the matrix solution.

• If analytic derivatives are not available, the derivatives needed to construct the gradient and Hessian must be evaluated by very costly finite difference approximations (see the sketch following this list).
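As a rough illustration of that last cost (not taken from the text; the objective f, the point x, and the step sizes h are placeholders), the following sketch builds the gradient and Hessian by finite differences and takes a full Newton step. The gradient alone costs n extra objective evaluations and the Hessian roughly n² more, which is what makes derivative-free use of the method so expensive.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient: n extra objective evaluations."""
    n = len(x)
    f0 = f(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

def fd_hessian(f, x, h=1e-4):
    """Finite-difference Hessian: on the order of n^2 objective evaluations."""
    n = len(x)
    H = np.zeros((n, n))
    f0 = f(x)
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei) - f(x + ej) + f0) / h**2
    return 0.5 * (H + H.T)      # symmetrize against round-off error

def newton_step(f, x):
    g = fd_gradient(f, x)
    H = fd_hessian(f, x)
    p = np.linalg.solve(H, -g)  # may fail, or point uphill, if H is indefinite
    return x + p
```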

6.2 Modifications to Newton’s Method

In practice, the first modification made to Newton’s Method is to incorporate a one-dimensional line search to obtain an improved step length. Based largely on the credo of “Look before you leap,” the line search is performed on the objective function in the direction indicated by p. The line search locates, or approximates the location of, the minimum in this single dimension. The distance to that minimum is taken as the step length ρ, and the new estimate of x is obtained by taking a step of length ρ in the p direction (Equation 6.13).
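A minimal sketch of such a line search is given below, assuming the objective f, its gradient g at the current point, the current point x, and the direction p are available; the simple backtracking (Armijo) rule used here is only a stand-in for whichever one-dimensional search is actually employed, not the specific method of the text.

```python
import numpy as np

def backtracking_step_length(f, g, x, p, rho0=1.0, shrink=0.5, c=1e-4):
    """Approximate the one-dimensional minimizer of f along p.

    Starts from the full Newton step (rho = 1) and halves rho until the
    objective shows sufficient decrease. Returns the step length rho used in
    x_new = x + rho * p (Equation 6.13).
    """
    f0 = f(x)
    slope = np.dot(g, p)            # directional derivative; negative for a descent direction
    rho = rho0
    while f(x + rho * p) > f0 + c * rho * slope:
        rho *= shrink
        if rho < 1e-12:             # give up; p is probably not a descent direction
            break
    return rho

# Usage sketch: p = np.linalg.solve(H, -g)
#               rho = backtracking_step_length(f, g, x, p)
#               x_new = x + rho * p
```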

The next level of modification made to Newton’s Method involves altering the eigenvalues of the Hessian. The ideal modification converges rapidly, with speed approaching that of Newton’s Method, and always descends towards the minimum. A direction of descent is guaranteed if the Hessian is positive definite, meaning that all of its eigenvalues are positive. Three methods are discussed that modify Newton’s Method to produce a positive definite Hessian: the method of steepest descent, the Marquardt (1963) modification, and the Greenstadt (1967) modification. The method of steepest descent is included primarily for illustrative purposes and is not recommended for practical use.
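The two Hessian fixes named above can be sketched as follows (illustrative only; the thresholds delta and mu are assumptions, not values from the text). The Greenstadt-style version replaces each eigenvalue by a positive value of the same magnitude, while the Marquardt-style version shifts every eigenvalue upward by adding a multiple of the identity, which blends the Newton direction toward steepest descent as mu grows.

```python
import numpy as np

def greenstadt_modify(H, delta=1e-8):
    """Greenstadt-style fix: replace each eigenvalue by max(|lambda|, delta),
    yielding a positive definite matrix with the same eigenvectors."""
    lam, V = np.linalg.eigh(H)
    lam = np.maximum(np.abs(lam), delta)
    return V @ np.diag(lam) @ V.T

def marquardt_modify(H, mu):
    """Marquardt-style fix: add mu * I (mu > 0), shifting all eigenvalues up.
    Small mu stays close to the Newton step; large mu approaches steepest descent."""
    return H + mu * np.eye(H.shape[0])
```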
