Evolution and Optimum Seeking

Comparison of Direct Search Strategies for Parameter Optimization

For a quadratic function with a full matrix of coefficients, just to evaluate the expression $x^T A x$ requires $O(n^2)$ basic arithmetical operations. If the order of magnitude of one function evaluation is denoted by $O(n^f)$ then, assuming $f \geq 1$, for all the optimization methods considered so far the computation time is given by:

$$T \sim n^{2+f} \geq n^3$$
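To make the $O(n^2)$ count concrete, here is a minimal Python sketch (not from the text; the function name and random test data are illustrative) that evaluates the quadratic form with explicit loops and tallies the multiplications:

```python
import numpy as np

def quadratic_form(A, x):
    """Evaluate x^T A x with explicit loops; the doubly nested loop
    makes the O(n^2) multiplication count visible."""
    n = len(x)
    total = 0.0
    mults = 0
    for i in range(n):
        row = 0.0
        for j in range(n):
            row += A[i][j] * x[j]   # one multiplication per (i, j) pair
            mults += 1
        total += x[i] * row         # n further multiplications
        mults += 1
    return total, mults             # mults == n^2 + n

A = np.random.rand(4, 4)
x = np.random.rand(4)
value, ops = quadratic_form(A, x)
print(value, ops)   # ops = 4^2 + 4 = 20; the n^2 term dominates as n grows
```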

The advantage of having fewer function-independent operations in the Fletcher-Reeves method, therefore, only makes itself felt if the number of variables is small and the time for one function evaluation is short.

All the variants of the basic second order strategies mentioned here can be fitted, with similar assumptions, into the above scheme. Among these are (Broyden, 1972):

- Modified and quasi-Newton methods
- Methods of conjugate gradients and conjugate directions
- Variable metric strategies, with their variations using correction matrices of rank one

There is no optimization method that has a cost rising with less than the third power of the number of variables. Even the indirect procedure, in which the equations for the necessary conditions for an extremum are set up and solved by conventional methods, does not afford any basic reduction in the computational effort. If the objective function is quadratic, a system of n simultaneous linear equations is obtained. To solve for the n unknowns the Gaussian elimination method requires $\frac{1}{3} n^3$ basic operations (multiplications and divisions). According to Zurmühl (1965) all the other direct methods, meaning here non-iterative methods, are more costly, except in special cases.
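As an illustration of that operation count, the following Python sketch performs Gaussian elimination without pivoting (so it assumes nonzero pivots; the diagonally dominant test matrix is a hypothetical choice that guarantees this) and tallies the multiplications and divisions, whose leading term is $n^3/3$:

```python
import numpy as np

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination plus back substitution,
    counting multiplications and divisions along the way."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    ops = 0
    # forward elimination: zero out the subdiagonal, column by column
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k];              ops += 1
            A[i, k + 1:] -= m * A[k, k + 1:];   ops += n - k - 1
            b[i] -= m * b[k];                   ops += 1
    # back substitution on the remaining upper triangle
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        s = A[i, i + 1:] @ x[i + 1:];           ops += n - i - 1
        x[i] = (b[i] - s) / A[i, i];            ops += 1
    return x, ops

n = 50
A = np.random.rand(n, n) + n * np.eye(n)   # diagonally dominant -> safe pivots
b = np.random.rand(n)
x, ops = gauss_solve(A, b)
print(ops, n**3 / 3)   # the count grows like n^3 / 3
```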

Methods involving a stepwise approach to the solution of systems of linear equations (relaxation methods) require an infinite number of iterations to reach an absolutely exact result. They converge linearly and correspond to first order optimization strategies (single step or Gauss-Seidel methods and total step or gradient methods; see Schwarz, Rutishauser, and Stiefel, 1968).
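A small sketch of the single step (Gauss-Seidel) relaxation, again on a hypothetical diagonally dominant system, makes the linear convergence visible: each sweep shrinks the error by a roughly constant factor.

```python
import numpy as np

def gauss_seidel(A, b, sweeps=20):
    """Single step (Gauss-Seidel) relaxation: each component of x is
    updated in turn using the most recent values of the others."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(sweeps):
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        yield x.copy()

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
exact = np.linalg.solve(A, b)
for k, x in enumerate(gauss_seidel(A, b, sweeps=8)):
    print(k, np.linalg.norm(x - exact))   # error falls by ~constant factor
```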

Only the method of Hestenes and Stiefel (1952) converges after a finite number of calculation steps, assuming that the calculations are exact. It is a conjugate gradient method for solving systems of linear equations with a symmetrical, positive-definite matrix of coefficients.
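The Hestenes-Stiefel iteration can be sketched as follows, assuming only that $A$ is symmetric and positive-definite; in exact arithmetic it terminates after at most n steps, which is the finiteness property just mentioned (in floating point it merely comes very close).

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-12):
    """Hestenes-Stiefel conjugate gradient method for A x = b with
    A symmetric positive-definite; at most n steps in exact arithmetic."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x                    # residual
    p = r.copy()                     # first search direction = residual
    for k in range(n):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        beta = (r @ r) / rr          # keeps successive directions A-conjugate
        p = r + beta * p
    return x, k + 1

n = 6
M = np.random.rand(n, n)
A = M @ M.T + n * np.eye(n)          # symmetric positive-definite test matrix
b = np.random.rand(n)
x, steps = conjugate_gradient(A, b)
print(steps, np.linalg.norm(A @ x - b))   # converges in at most n steps
```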

The main concern here is with direct, i.e., derivative-free, search strategies for optimization. Finiteness of the search in the quadratic case and greater than linear convergence can only be proved for the Powell method of conjugate directions and for the Davidon-Fletcher-Powell variable metric method, which Stewart reformulated as a derivative-free quasi-Newton method. Of the coordinate strategy, at best it can be said that it converges linearly. The same holds for the simple gradient methods. There are also versions of them in which the partial derivatives are obtained numerically. Since various comparison tests have shown them to be rather ineffective in highly non-linear situations, none is considered here. No theoretically founded statements about convergence rates and Q-properties are available for the other direct strategies. The rate of progress defined by Rechenberg (1973) for the evolution strategy with adaptive step length control
