Evolution and Optimum Seeking


Hill climbing Strategies

with the correction factor

$$
\beta^{(k)} = \frac{\nabla F(x^{(k)})^{T}\,\nabla F(x^{(k)})}{\nabla F(x^{(k-1)})^{T}\,\nabla F(x^{(k-1)})}
$$

For a quadratic objective function with a positive definite Hessian matrix, conjugate directions are generated in this way and the minimum is found with n line searches. Since at any time only the last direction needs to be stored, the storage requirement increases linearly with the number of variables. This often signifies a great advantage over other strategies. In the general, non-linear, non-quadratic case more than n iterations must be carried out, for which the method of Fletcher and Reeves must be modified. Continued application of the recursion formula (Equation (3.25)) can lead to linear dependence of the search directions. For this reason it seems necessary to forget from time to time the accumulated information and to start afresh with the simple gradient direction (Crowder and Wolfe, 1972). Various suggestions have been made for the frequency of this restart rule (Fletcher, 1972a). Absolute reliability of convergence in the general case is still not guaranteed by this approach. If the Hessian matrix of second partial derivatives has points of singularity, then the conjugate gradient strategy can fail. The exactness of the line searches also has an important effect on the convergence rate (Kawamura and Volz, 1973). Polak (1971) defines conditions under which the method of Fletcher and Reeves achieves greater than linear convergence. Fletcher (1972c) himself has written a FORTRAN program.
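The scheme described above can be sketched as follows for a quadratic objective F(x) = ½ xᵀAx − bᵀx, for which the exact line search has a closed form; the function name and the `restart_every` parameter (one simple realization of the periodic restart rule) are illustrative, not from the original:

```python
import numpy as np

def fletcher_reeves_quadratic(A, b, x0, restart_every=None, tol=1e-12):
    """Fletcher-Reeves conjugate gradients on F(x) = 0.5 x^T A x - b^T x.

    A sketch for the quadratic case, where the line search is exact.
    restart_every is a hypothetical knob: every so many steps the
    accumulated direction information is forgotten and the search
    restarts with the plain gradient direction.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                # gradient of F at x
    d = -g                       # first direction: steepest descent
    steps = 0
    while np.linalg.norm(g) > tol and steps < 10 * len(b):
        alpha = -(g @ d) / (d @ (A @ d))   # exact line search along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # correction factor beta^(k)
        g = g_new
        steps += 1
        if restart_every and steps % restart_every == 0:
            d = -g                         # restart with gradient direction
        else:
            d = -g + beta * d              # next conjugate direction
    return x, steps
```

On a positive definite quadratic this terminates after at most n line searches, which is the property claimed in the text; in the non-quadratic case an inexact line search and a restart rule would be needed.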

Other conjugate gradient methods have been proposed by Powell (1962), Polak and Ribiere (1969) (see also Klessig and Polak, 1972), Hestenes (1969), and Zoutendijk (1970). Schley (1968) has published a complete FORTRAN program. Conjugate directions are also produced by the projected gradient methods (Myers, 1968; Pearson, 1969; Sorenson, 1969; Cornick and Michel, 1972) and the memory gradient methods (Miele and Cantrell, 1969, 1970; see also Cantrell, 1969; Cragg and Levy, 1969; Miele, 1969; Miele, Huang, and Heidemann, 1969; Miele, Levy, and Cragg, 1971; Miele, Tietze, and Levy, 1972; Miele et al., 1974). Relevant theoretical investigations have been made by, among others, Greenstadt (1967a), Daniel (1967a, 1970, 1973), Huang (1970), Beale (1972), and Cohen (1972).
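These variants differ, among other things, in how the correction factor is computed. A minimal sketch of the two best-known forms, the Fletcher-Reeves quotient and the Polak-Ribiere modification (the helper names are illustrative):

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Fletcher-Reeves correction factor: ratio of squared gradient norms."""
    return (g_new @ g_new) / (g_old @ g_old)

def beta_pr(g_new, g_old):
    """Polak-Ribiere correction factor.

    Coincides with beta_fr whenever successive gradients are
    orthogonal, as they are for a quadratic objective with exact
    line searches; in the general non-quadratic case the two
    factors, and hence the search directions, differ.
    """
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)
```

A practical side effect of the Polak-Ribiere form is that it tends toward zero when successive gradients are nearly equal, which acts like an automatic restart.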

Conjugate gradient methods are encountered especially frequently in the fields of functional optimization and optimal control problems (Daniel, 1967b, 1971; Pagurek and Woodside, 1968; Nenonen and Pagurek, 1969; Roberts and Davis, 1969; Polyak, 1969; Lasdon, 1970; Kelley and Speyer, 1970; Kelley and Myers, 1971; Speyer et al., 1971; Kammerer and Nashed, 1972; Szego and Treccani, 1972; Polak, 1972; McCormick and Ritter, 1974). Variable metric strategies are also sometimes classified as conjugate gradient procedures, but more usually as quasi-Newton methods. For quadratic objective functions they generate the same sequence of points as the Fletcher-Reeves strategy and its modifications (Myers, 1968; Huang, 1970). In the non-quadratic case, however, the search directions are different. With the variable metric, but not with conjugate directions, Newton directions are approximated.
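As an illustration of the variable metric idea, here is one widely used update of the inverse-Hessian approximation, the BFGS form, chosen as an example since the text does not single out a particular formula. The updated matrix satisfies the secant condition H·y = s, which is the mechanism by which the quasi-Newton direction −H·g comes to approximate the Newton direction:

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H.

    s = x_new - x_old (the last step), y = g_new - g_old (the change
    in gradient). The returned matrix satisfies the secant condition
    H_new @ y = s and remains symmetric; it requires y @ s > 0.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
        + rho * np.outer(s, s)
```

Unlike the conjugate gradient recursion, which stores only the last direction, this update maintains a full n-by-n matrix, so its storage cost grows quadratically with the number of variables.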

For many practical problems it is very difficult, if not impossible, to specify the partial derivatives as functions. The sensitivity of most conjugate gradient methods to imprecise
