Evolution and Optimum Seeking

Evolution Strategies for Numerical Optimization

Gradient methods, which seek a point with vanishing first derivatives, frequently also apply this necessary condition for the existence of an extremum as a termination criterion. Alternatively the search can be continued until $\Delta F = F(x^{(k-1)}) - F(x^{(k)})$, the change in the objective function value in one iteration, goes to zero or falls below a prescribed limit. But this requirement can also be fulfilled far from the minimum if the valley in which the deepest point is sought happens to be very flat in shape. In this case the step length control of the two-membered evolution strategy ensures that the variances become larger, and thus the function value differences between two successful trials also on average become larger. This is guaranteed even if the function values are equal (within computational accuracy), since a change in the variables is then always registered as a success. One thus has only to take care that $\Delta F$ is summed over a number of results in order to derive a termination criterion. Just as lower bounds are defined for the step lengths, an absolute and a relative bound can be specified here:

Termination rule:

End the search if

$$F\left(x_E^{(g-\Delta g)}\right) - F\left(x_E^{(g)}\right) \le \varepsilon_c$$

or

$$\frac{1}{\varepsilon_d}\left[F\left(x_E^{(g-\Delta g)}\right) - F\left(x_E^{(g)}\right)\right] \le \left|F\left(x_E^{(g)}\right)\right|$$

where

$$\Delta g \ge 20\,n$$

and

$$\varepsilon_c > 0, \qquad 1 + \varepsilon_d > 1 \quad \text{according to the computational accuracy.}$$
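As an illustration of how these two bounds might be coded, the following Python sketch (not taken from the book; the function name and the parameters eps_c and eps_d are placeholders for $\varepsilon_c$ and $\varepsilon_d$) tests the absolute and the relative criterion in turn:

```python
def should_terminate(f_older, f_current, eps_c, eps_d):
    """Termination test of the two-membered evolution strategy (sketch).

    f_older   -- best objective value recorded Delta-g generations ago
    f_current -- best objective value now
    eps_c     -- absolute bound, eps_c > 0
    eps_d     -- relative bound, 1 + eps_d > 1 in machine arithmetic
    """
    progress = f_older - f_current            # improvement over the test period
    if progress <= eps_c:                     # absolute criterion
        return True
    # relative criterion: (1/eps_d) * progress <= |f_current|
    return progress <= eps_d * abs(f_current)
```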

The condition $\Delta g \ge 20\,n$ is designed to ensure that in the extreme case the standard deviations are reduced or increased within the test period by at least the factor $(0.85)^{20} \simeq (25)^{-1}$, in accordance with the 1/5 success rule. This prevents the search from being terminated only because the variances are forced to change suddenly. It is clear from Equation (5.4) that the more variables are involved in the problem, the slower is the rate of progress. Hence it does not make sense to test the convergence criterion very frequently. A recommended procedure is to make a test every $20\,n$ mutations. Only one additional function value then needs to be stored.
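The loop skeleton below, offered purely as a sketch under stated assumptions rather than as the book's own routine, shows how the test fits into a (1+1) evolution strategy: a simplified form of the 1/5 success rule adapts the single standard deviation, and the convergence criterion (using should_terminate from the earlier sketch) is checked only every $20\,n$ mutations, so that only one extra function value, f_reference, has to be kept:

```python
import random

def evolve(f, x_start, sigma, eps_c, eps_d, max_gen=100_000):
    """(1+1) evolution strategy sketch; relies on should_terminate() above."""
    n = len(x_start)
    x_best = list(x_start)
    f_best = f(x_best)
    f_reference = f_best              # the one stored value for the convergence test
    successes = 0
    for g in range(1, max_gen + 1):
        # mutate every variable with a normally distributed step
        x_trial = [xi + random.gauss(0.0, sigma) for xi in x_best]
        f_trial = f(x_trial)
        if f_trial <= f_best:         # equal values also count as a success
            x_best, f_best = x_trial, f_trial
            successes += 1
        if g % n == 0:                # simplified 1/5 success rule adjustment
            sigma *= 0.85 if successes < n / 5 else 1.0 / 0.85
            successes = 0
        if g % (20 * n) == 0:         # convergence test every 20*n mutations
            if should_terminate(f_reference, f_best, eps_c, eps_d):
                break
            f_reference = f_best      # refresh the stored comparison value
    return x_best, f_best
```

For instance, a call such as evolve(lambda x: sum(xi**2 for xi in x), [1.0] * 5, 1.0, 1e-10, 1e-8) would minimize a simple sphere function under these hypothetical settings.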

Another reason can be adduced for linking the termination of the search to the function value changes. While every success in an optimum search means, in the end, an economic profit, every iteration costs computer time and thus money. If the costs exceed the profit, the optimization may well provide useful information, but it is certainly not, on the whole, of any economic value. Thus someone who only wishes to optimize from an economic point of view can, by a suitable choice of values for the accuracy parameters, halt the search process as soon as it starts running at a loss.
