Evolution and Optimum Seeking

Random Strategies

maintained at least approximately during the course of the iterations. At the starting point $x^{(0)}$ two random changes are made with step lengths $s^{(0)}$ and $s^{(0)}(1 + a)$, where $0 < a \ll 1$. If both samples are successful, for the next iteration $s^{(1)} = s^{(0)}(1 + a)$ is taken, i.e., the greater value. If only one sample yields an improvement in the objective function, its step length is taken; finally, if no success is scored, $s^{(1)}$ remains equal to $s^{(0)}$.

A reduction in $s$ is only made if several consecutive trials are unsuccessful. This is also the procedure of Maybach (1966). This adjustment to the local conditions assists the strategy in achieving high convergence rates but reduces the chances of locating global optima among several local ones. For this reason a sample with a significantly larger step length ($a > 1$) should be included from time to time.
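Stated as code, the rule above might look like the following minimal sketch (not from the book); the function name and the parameters `fail_limit` and `shrink` are assumptions, since the text only says that $s$ is reduced after several consecutive unsuccessful trials.

```python
import numpy as np

def adaptive_step_random_search(f, x0, s0=1.0, a=0.1, max_iter=1000,
                                fail_limit=5, shrink=0.5, rng=None):
    """Sketch of the two-sample step-length adaptation described above."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx, s, fails = f(x), s0, 0
    for _ in range(max_iter):
        steps = (s, s * (1.0 + a))        # two trial step lengths: s and s*(1+a)
        ys, fs = [], []
        for step in steps:
            d = rng.standard_normal(x.size)
            ys.append(x + step * d / np.linalg.norm(d))
            fs.append(f(ys[-1]))
        success = [fy < fx for fy in fs]
        if all(success):                  # both successful: keep the greater step
            s = steps[1]
            i = int(fs[1] < fs[0])        # move to the better of the two points
            x, fx, fails = ys[i], fs[i], 0
        elif any(success):                # one success: take its step length
            i = success.index(True)
            x, fx, s, fails = ys[i], fs[i], steps[i], 0
        else:                             # no success: s stays unchanged, but after
            fails += 1                    # several consecutive failures reduce it
            if fails == fail_limit:       # (threshold and factor are assumptions)
                s, fails = s * shrink, 0
    return x, fx
```

For instance, `adaptive_step_random_search(lambda x: np.sum(x**4), np.full(10, 1.0))` applies the rule to the biquadratic model function $F_4$ defined below.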

Numerical tests show that the computation cost, or number of trials, actually only increases linearly with the number of variables. Schumer and Steiglitz have tested this using the model functions F3 and

$$F_4(x) = \sum_{i=1}^{n} x_i^4$$

A comparison with a Newton-Raphson strategy, in which the partial first and second derivatives are determined numerically and the cost increases as $O(n^2)$, favors the random method when $n > 78$ for F3 and when $n > 2$ for F4. For the second, biquadratic model function, Nelder and Mead (1965) state that the number of trials or function evaluations in their simplex strategy grows as $O(n^{2.11})$, so that the sequential random method is superior from $n > 10$. White and Day (1971) report numerical tests in which the cost in iterations with Schumer's strategy increases more sharply than linearly with $n$, whereas a modification by White (1970) shows exactly linear dependence. A comparison with the strategy of Fletcher and Powell (1963) favors the latter, especially for truly quadratic functions.

Rechenberg (1973), with an $n$-dimensional normal distribution (see Chap. 5, Sect. 5.1), reaches almost the same theoretical results as Schumer for the circumferential distribution, if one notes that the overall step length

$$\sigma_{\mathrm{tot}} = \sqrt{\sum_{i=1}^{n} \sigma_i^2} = \sigma \sqrt{n}$$

for equal variances $\sigma_i^2 = \sigma^2$ in each random component $z_i$ is proportional to the square root of the number of variables. The reason for this lies in the property of Euclidean space that, as the number of dimensions increases, the volume of a hypersphere becomes concentrated more and more in the boundary region near the surface.
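This concentration effect is easy to check numerically. The following small sketch (not from the book; the sample size and dimensions are arbitrary choices) draws Gaussian steps with equal component variances and compares their mean length with $\sigma\sqrt{n}$.

```python
import numpy as np

# Sample Gaussian steps z with equal component variances sigma^2 and
# compare their mean length with sigma * sqrt(n).
rng = np.random.default_rng(0)
sigma = 1.0
for n in (2, 10, 100, 1000):
    z = rng.normal(0.0, sigma, size=(10_000, n))   # 10,000 random steps
    lengths = np.linalg.norm(z, axis=1)
    print(f"n={n:4d}  mean length={lengths.mean():7.3f}  "
          f"sigma*sqrt(n)={sigma * np.sqrt(n):7.3f}  "
          f"relative spread={lengths.std() / lengths.mean():.3f}")
```

The relative spread of the lengths shrinks roughly as $1/\sqrt{2n}$, so for large $n$ nearly every step has length close to $\sigma\sqrt{n}$: the probability mass of the step distribution concentrates in a thin shell.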

Rechenberg's adaptation rule is founded on the relation between optimal variance and probability of success derived from two essentially different models of the objective function. The adaptation rule thereby formulated makes the frequency and the size of the increments, respectively, dependent on the number of variables and independent of the structure of the objective function. This will be discussed in more detail in Chapter 5, Section 5.1.

Convergence proofs for the sequential random strategy have been given by Matyas (1965, 1967) and Rechenberg (1973) only for the case of constant variance $\sigma^2$. Gurin (1966) has proved convergence also for stochastically perturbed objective functions. The
