Evolution and Optimum Seeking

116 Evolution Strategies for Numerical Optimization

normally distributed random numbers with the expectation values zero and the variances unity. The formulae are

    Z'_1 = sqrt(-2 ln Y_1) sin(2 π Y_2)
                                                (5.8)
    Z'_2 = sqrt(-2 ln Y_1) cos(2 π Y_2)

where the Y_i are the uniformly distributed and the Z'_i the (0,1)-normally distributed random numbers respectively. To obtain a distribution with a variance different from unity, the Z'_i must simply be multiplied by the desired standard deviation σ_i (the "step length"):

    Z_i = σ_i Z'_i

The transformation rules are contained in a function procedure separate from the actual subroutine. To make use of both Equations (5.8) a switch with two settings is defined, the condition of which must be preset in the subroutine once and for all. In spite of Neave's (1973) objection to the use of these rules with uniformly distributed random numbers that have been generated by a multiplicative congruence method, no significant differences could be observed in the behavior of the evolution strategy when other random generators were used. On the other hand the trapezium method of Ahrens and Dieter (1972) is considerably faster.
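The transformation of Equations (5.8) and the two-setting switch can be sketched in Python; the class and method names here are hypothetical stand-ins for the book's Fortran function procedure, not its actual code:

```python
import math
import random

class NormalGenerator:
    """Box-Muller transform (Eq. 5.8): turns a pair of uniform
    variates Y1, Y2 into a pair of (0,1)-normal variates Z'1, Z'2.
    A two-setting switch alternates between the sine and cosine
    branches so that neither variate of a pair is wasted."""

    def __init__(self, rng=random.random):
        self.rng = rng          # source of uniform variates on [0, 1)
        self.use_cos = False    # the two-setting switch
        self.r = self.phi = 0.0

    def standard_normal(self):
        if not self.use_cos:
            y1 = 1.0 - self.rng()            # in (0, 1]: log stays finite
            y2 = self.rng()
            self.r = math.sqrt(-2.0 * math.log(y1))
            self.phi = 2.0 * math.pi * y2
            z = self.r * math.sin(self.phi)  # Z'_1 of Eq. (5.8)
        else:
            z = self.r * math.cos(self.phi)  # Z'_2 of Eq. (5.8)
        self.use_cos = not self.use_cos      # flip the switch
        return z

    def normal(self, sigma):
        # Z_i = sigma_i * Z'_i: scale by the desired step length
        return sigma * self.standard_normal()
```

Keeping the radius r and angle phi between calls is what the preset switch condition accomplishes in the original subroutine: every second call reuses the stored pair via the cosine branch.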

Most algorithms for parameter optimization include a second termination rule, independent of the actual convergence criterion. They end the search after no more than a specified number of iterations, in order to avoid an infinite series of iterations in case the convergence criterion should fail. Such a rule is effectively a bound on the computation time. The program libraries of computers usually contain a procedure with which the CPU time used by the program can be determined. Thus instead of giving a maximum number of iterations one could specify a maximum computation time as a termination criterion. In the present program the latter option is adopted. After every n iterations the elapsed CPU time is checked. As soon as the limit is reached the search ends and output of the results can be initiated from the main program.
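A minimal sketch of this CPU-time termination check, in Python rather than the book's own subroutine; step, converged, t_max, and n are hypothetical stand-ins for the mutation-selection cycle, the convergence criterion, and the preset limits:

```python
import time

def search(step, converged, t_max, n, x0):
    """Iterate `step` until the convergence test succeeds or the
    CPU-time budget t_max (in seconds) is exhausted.  The clock is
    read only once every n iterations to keep the overhead low."""
    t0 = time.process_time()   # CPU time used, not wall-clock time
    x = x0
    iteration = 0
    while not converged(x):
        x = step(x)
        iteration += 1
        if iteration % n == 0 and time.process_time() - t0 > t_max:
            break              # time limit reached: stop, let the
                               # caller output the results
    return x
```

Using time.process_time() rather than wall-clock time mirrors the library procedure mentioned above, which reports the CPU time consumed by the program itself.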

The 1/5 success rule assumes that there is always some combination of variances σ_i > 0 with which, on average, at least one improvement can be expected within five mutations. In Figure 5.3 two contour diagrams are shown for which the above condition cannot always be met. At some points the probability of a success cannot exceed 1/5: for example, at points where the objective function has discontinuous first partial derivatives, or at the edge of the allowed region. Especially in the latter case, the selection principle progressively forces the sequence of iteration points closer up to the boundary and the step lengths are continuously reduced in size, without the optimum being approached with comparable accuracy.
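The step-length adjustment driven by the 1/5 success rule can be sketched as follows; the function name is hypothetical, and the multiplication factor 0.85 is the value commonly recommended for the evolution strategy:

```python
def adapt_step_length(sigma, successes, trials, c=0.85):
    """1/5 success rule: if more than one in five mutations improved
    the objective, enlarge the step length sigma; if fewer did,
    shrink it.  0 < c < 1 controls the rate of adaptation."""
    rate = successes / trials
    if rate > 1.0 / 5.0:
        return sigma / c      # successes frequent: search more boldly
    elif rate < 1.0 / 5.0:
        return sigma * c      # successes rare: search more cautiously
    return sigma
```

At a boundary or a discontinuity of the kind described above, rate stays below 1/5 no matter how small sigma becomes, so this rule shrinks the steps indefinitely; that is precisely the failure mode the text warns about.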

Even in the corridor model (Problem 3.8 of Appendix A, Sect. A.3) difficulties can arise. In this case the rate of progress and probability of success depend on the current position relative to the edges of the corridor. Whereas the maximum probability of success in the middle of the corridor is 1/2, at the corners it is only 2^(-n). If one happens to be in the neighborhood of the edge of the corridor for several mutations, the probability of success
