Evolution and Optimum Seeking

Multidimensional Strategies

A numerical strategy comparison by M. J. Box (1966) shows the method to be a very effective optimization procedure, in general superior both to the Hooke and Jeeves and the Rosenbrock methods. However, the tests only refer to smooth objective functions with few variables. If the number of parameters is large, the costly orthogonalization process makes its inconvenient presence felt also in the DSC strategy.

Several suggestions have been made to date as to how to simplify the Gram-Schmidt procedure and to reduce its susceptibility to numerical rounding error (Rice, 1966; Powell, 1968a; Palmer, 1969; Golub and Saunders, 1970, Householder method).
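The rounding-error issue can be illustrated with a short sketch. The following Python function (my own illustrative code, not from the source) implements the *modified* Gram-Schmidt variant, which subtracts each projection immediately rather than all at once and is therefore less susceptible to accumulated rounding error than the classical formulation:

```python
import numpy as np

def modified_gram_schmidt(A):
    """Orthonormalize the columns of A.

    Modified Gram-Schmidt: each remaining column is deflated as soon as a
    new orthonormal direction is fixed, which limits the growth of
    rounding errors compared with the classical variant.
    """
    Q = np.array(A, dtype=float)
    n = Q.shape[1]
    for i in range(n):
        Q[:, i] /= np.linalg.norm(Q[:, i])          # normalize direction i
        for j in range(i + 1, n):
            Q[:, j] -= (Q[:, i] @ Q[:, j]) * Q[:, i]  # remove its component
    return Q
```

Note that this still costs O(n^3) operations per reorthogonalization, which is exactly the burden the simplifications cited above try to avoid.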

Palmer replaces the conditions of Equation (3.21) by:

$$
v_i^{(k+1)} =
\begin{cases}
v_i^{(k)} & \text{if } \sum\limits_{j=i}^{n} d_j^2 = 0, \\[1.5ex]
\dfrac{\sum\limits_{j=1}^{n} d_j\, v_j^{(k)}}{\sqrt{\sum\limits_{j=1}^{n} d_j^2}} & \text{for } i = 1, \text{ otherwise}, \\[3ex]
\dfrac{d_{i-1} \sum\limits_{j=i}^{n} d_j\, v_j^{(k)} \;-\; v_{i-1}^{(k)} \sum\limits_{j=i}^{n} d_j^2}{\sqrt{\left(\sum\limits_{j=i}^{n} d_j^2\right)\left(\sum\limits_{j=i-1}^{n} d_j^2\right)}} & \text{for } i = 2(1)n, \text{ otherwise}.
\end{cases}
$$

He shows that even if no success was obtained in one of the directions v_i^{(k)}, that is d_i = 0, the new vectors v_i^{(k+1)} for all i = 1(1)n still span the complete parameter space, because v_{i+1}^{(k+1)} is set equal to -v_i^{(k)}. Thus the algorithm does not need to be restricted to directions for which d_i > ε, as happens in the algorithm with Gram-Schmidt orthogonalization. The significant advantage of the revised procedure lies in the fact that the number of computational operations remains only of the order O(n^2). The storage requirement is also somewhat less, since one n × n matrix as an intermediate storage area is omitted.
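The O(n^2) cost comes from accumulating the partial sums Σ_{j≥i} d_j^2 and Σ_{j≥i} d_j v_j^{(k)} back to front, so that each new direction is formed in O(n) work. A Python sketch of one such update (function name and NumPy formulation are my own, not from the source):

```python
import numpy as np

def palmer_update(V, d):
    """One Palmer-style direction update for the DSC strategy (a sketch).

    V : (n, n) array whose rows v_i are the current orthonormal search
        directions; d : length-n vector of distances covered along them.
    Runs in O(n^2) by building the partial sums backwards.
    """
    V = np.asarray(V, dtype=float)
    d = np.asarray(d, dtype=float)
    n = len(d)
    # s[i] = sum_{j>=i} d_j^2,  w[i] = sum_{j>=i} d_j v_j
    s = np.zeros(n)
    w = np.zeros_like(V)
    s[n - 1] = d[n - 1] ** 2
    w[n - 1] = d[n - 1] * V[n - 1]
    for i in range(n - 2, -1, -1):
        s[i] = s[i + 1] + d[i] ** 2
        w[i] = w[i + 1] + d[i] * V[i]
    Vnew = V.copy()            # v_i stays unchanged where sum_{j>=i} d_j^2 = 0
    if s[0] > 0:
        Vnew[0] = w[0] / np.sqrt(s[0])
    for i in range(1, n):
        if s[i] > 0:
            Vnew[i] = (d[i - 1] * w[i] - s[i] * V[i - 1]) / np.sqrt(s[i] * s[i - 1])
    return Vnew
```

Setting d_{i-1} = 0 in the last branch gives Vnew[i] = -V[i-1] (since then s[i] = s[i-1]), which reproduces the sign-flip behaviour described above: even unsuccessful directions leave a spanning set.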

For problems with linear constraints (equalities and inequalities), Box, Davies, and Swann (1969) recommend a modification of the orthogonalization procedure that works in a similar way to the method of projected gradients of Rosen (1960, 1961) (see also Davies, 1968). Non-linear constraints (inequalities) can be handled with the created response surface technique devised by Carroll (1961), which is one of the penalty function methods.
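The core of such a penalty approach can be sketched briefly. Assuming inequality constraints written as g_i(x) > 0 inside the feasible region, an inverse-barrier term r · Σ 1/g_i(x) is added to the objective and the weight r is driven toward zero; the details below (function names, the grid-scan usage) are my own illustration, not Carroll's exact scheme:

```python
def barrier_objective(f, gs, r):
    """Wrap objective f with an inverse-barrier penalty (illustrative sketch).

    gs : list of constraint functions, feasible where g(x) > 0.
    r  : positive barrier weight; the constrained optimum is approached
         as r is decreased toward zero.
    """
    def T(x):
        vals = [g(x) for g in gs]
        if any(v <= 0.0 for v in vals):
            return float("inf")            # outside the feasible region
        return f(x) + r * sum(1.0 / v for v in vals)
    return T

# Usage: minimize f(x) = x subject to x - 1 > 0; the barrier minimum
# lies at x = 1 + sqrt(r), i.e. it approaches the boundary as r -> 0.
T = barrier_objective(lambda x: x, [lambda x: x - 1.0], r=0.01)
```

Any unconstrained strategy from this chapter (e.g., the DSC line searches) can then be applied to T for a decreasing sequence of r values.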

Further publications on the DSC strategy, also with comparison tests, are those of Swann (1969), Davies and Swann (1969), Davies (1970), and Swann (1972). Hoshino (1971) observes that in a narrow valley the search causes zigzag movements. His remedy for this is to add a further search, again in direction v_1^{(k)}, after each set of n line searches. With the help of two examples, for n = 2 and n = 3, he shows the accelerating effect of this measure.
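The remedy is simple to state in code: after completing the n line searches of a cycle, perform one more along the first direction. The sketch below uses a crude step-halving line search of my own devising, so it illustrates only the structure of Hoshino's extra search, not his exact algorithm:

```python
import numpy as np

def line_minimize(f, x, v, step=0.1, shrink=0.5, tol=1e-6):
    """Crude bidirectional step-halving line search along v (illustrative)."""
    d = 0.0
    while step > tol:
        if f(x + (d + step) * v) < f(x + d * v):
            d += step                      # improve in the + direction
        elif f(x + (d - step) * v) < f(x + d * v):
            d -= step                      # improve in the - direction
        else:
            step *= shrink                 # refine the step length
    return x + d * v

def cycle_with_extra_search(f, x, V):
    """One cycle: n line searches along the rows of V, then the extra
    search along v_1 that damps zigzagging in narrow valleys."""
    for v in V:
        x = line_minimize(f, x, v)
    return line_minimize(f, x, V[0])       # the additional v_1 search
```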

3.2.1.5 Simplex Strategy of Nelder and Mead

There is a group of methods called simplex strategies that work quite differently from the direct search methods described so far. In spite of their common name, they have nothing to do with the simplex method of linear programming of Dantzig (1966). The idea (Spendley, Hext, and Himsworth, 1962) originates in an attempt to reduce, as much as possible, the number of simultaneous trials in the experimental identification procedure
