
Multidimensional Strategies 53

Numbering  Iteration  Test index  Variable values   Step lengths   Remarks
index      index k    n*l + i     x1      x2        s1      s2
(0)        0          0           0       9         2       2      starting point
(1)        0          1           2       9         2       -      success
(2)        0          2           2       11        -       2      failure
(3)        0          3           8       9         6       -      failure
(4)        0          4           2       8         -       -1     success
(5)        0          5           -1      8         -3      -      failure
(6)        0          6           2       5         -       -3     failure
(4)        1          0           2       8         2       2      transformation and orthogonalization
(7)        1          1           3.8     7.1       2       -      success
(8)        1          2           2.9     5.3       -       2      success
(9)        1          3           8.3     2.6       6       -      success
(10)       1          4           5.6     -2.7      -       6      failure
(11)       1          5           24.4    -5.4      18      -      failure
(9)        2          0           8.3     2.6       2       2      transformation and orthogonalization

In Figure 3.7, including the preceding table, a few iterations of the Rosenbrock strategy for n = 2 are represented geometrically. At the starting point x^(0,0) the search directions are the same as the unit vectors. After three runs through (6 trials), the trial steps in each direction have led to a success followed by a failure. At the best condition thus attained, (4) at x^(0,4) = x^(1,0), new direction vectors v_1^(1) and v_2^(1) are generated. Five further trials lead to the best point, (9) at x^(1,3) = x^(2,0), of the second iteration, at which a new choice of directions is again made. The complete sequence of steps can be followed, if desired, with the help of the accompanying table.
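The trial logic that produces the table can be sketched as follows. The step-length factors (multiply by 3 after a success, by -0.5 after a failure) match the step columns of the table; the quadratic test function and the success criterion f(trial) <= f(x) are illustrative assumptions, not taken from the text.

```python
import numpy as np

def trial_sweep(f, x, directions, steps, success_factor=3.0, failure_factor=-0.5):
    """One sweep of trial steps, one along each current direction vector.

    On success the trial point is accepted and the step length enlarged;
    on failure the step is reversed and shrunk, as in the table above.
    """
    fx = f(x)
    successes = np.zeros(len(steps), dtype=bool)
    for i, v in enumerate(directions):
        trial = x + steps[i] * v
        if f(trial) <= fx:                 # success: accept point, enlarge step
            x, fx = trial, f(trial)
            steps[i] *= success_factor
            successes[i] = True
        else:                               # failure: reverse and shrink step
            steps[i] *= failure_factor
    return x, steps, successes

# Illustrative run from the table's starting point (0, 9) with unit directions
f = lambda x: float(np.sum(x**2))           # assumed test function
x = np.array([0.0, 9.0])
dirs = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
steps = np.array([2.0, 2.0])
x, steps, ok = trial_sweep(f, x, dirs, steps)
```

With this (assumed) objective both first trials fail, so both steps are reversed and halved; the sweep is repeated until every direction has scored at least one success and one failure, after which the directions are rotated.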

Numerical experiments show that within a few iterations the rotating coordinates become oriented such that one of the axes points along the gradient direction. The strategy thus allows sharp valleys in the topology of the objective function to be followed. Like the method of Hooke and Jeeves, Rosenbrock's procedure needs no information about partial derivatives and uses no line search method for exact location of relative minima. This makes it very robust. It has, however, one disadvantage compared to the direct pattern search: the orthogonalization procedure of Gram and Schmidt is very costly. It requires storage space of order O(n^2) for the matrices A = {a_ij} and V = {v_ij}, and the number of computational operations even increases with O(n^3). At least in cases where the objective function call costs relatively little, the computation time for the orthogonalization with many variables becomes highly significant. Besides this, the number of parameters is in any case limited by the high storage space requirement.
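The cost structure described above can be seen in a generic classical Gram-Schmidt routine, sketched here as an illustration rather than as the exact procedure of the text (in the Rosenbrock strategy the rows of A would be built from the accumulated successful steps):

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt orthonormalization of the rows of A.

    Storage is O(n^2) for A and the result V; the triple loop
    (each row, against each earlier row, over n components)
    costs O(n^3) arithmetic operations.
    """
    n = A.shape[0]
    V = np.zeros_like(A, dtype=float)
    for i in range(n):
        v = A[i].astype(float)
        for j in range(i):                   # subtract projections onto earlier v_j
            v -= np.dot(A[i], V[j]) * V[j]
        V[i] = v / np.linalg.norm(v)         # normalize to unit length
    return V

# Two non-orthogonal directions orthonormalized into V
V = gram_schmidt(np.array([[2.0, 1.0], [1.0, 3.0]]))
```

The rows of V form an orthonormal basis (V V^T = I), which is what the strategy needs for its new search directions.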

If there are constraints, care must be taken to ensure that the starting point is inside the allowed region and sufficiently far from the boundaries. Examples of the application of
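This feasibility requirement can be expressed as a simple guard. The representation of the constraints as inequalities g(x) <= 0 and the safety margin are assumptions for illustration, not from the text:

```python
def feasible_start(x, constraints, margin=1e-6):
    """Check that x satisfies every constraint g(x) <= 0 with a safety
    margin, i.e. lies strictly inside the allowed region, not on its boundary.
    """
    return all(g(x) < -margin for g in constraints)

# Example: the box 0 <= x_i <= 10 written as four g(x) <= 0 constraints
cons = [lambda x: -x[0], lambda x: x[0] - 10,
        lambda x: -x[1], lambda x: x[1] - 10]
on_boundary = feasible_start([0.0, 9.0], cons)   # x1 = 0 lies on the boundary
interior = feasible_start([2.0, 8.0], cons)      # strictly inside the box
```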
