

It is inconvenient when dealing with minimization problems that the derivatives of the function are required. If the slopes are obtained from function values by a difference method, difficulties can arise from the finite accuracy of such a process. For this reason Brent (1973) combines regula falsi iteration with division according to the golden section. Further variations can be found in Schmidt and Trinkaus (1966), Dowell and Jarratt (1972), King (1973), and Anderson and Björck (1973).
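
As a side illustration (not part of the original text), the finite-accuracy problem with difference-based slopes can be seen in a short sketch; the test function, evaluation point, and step sizes below are arbitrary choices for the example.

```python
# Sketch: forward-difference slope estimates of f(x) = sin(x) at x = 1.
# Large steps h suffer from truncation error, very small steps from
# rounding error, so the achievable accuracy of the slope is limited.
import math

def forward_difference(f, x, h):
    """Approximate f'(x) by the forward difference (f(x + h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

x0 = 1.0
exact = math.cos(x0)  # exact derivative of sin at x0
for h in (1e-1, 1e-4, 1e-8, 1e-12):
    approx = forward_difference(math.sin, x0, h)
    print(f"h = {h:.0e}   slope = {approx:.12f}   error = {abs(approx - exact):.2e}")
```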

3.1.2.3.2 Newton-Raphson Iteration. Newton's interpolation formula for improving an approximate solution $x^{(k)}$ to the equation $F(x) = 0$ (see for example Madsen, 1973)

$$x^{(k+1)} = x^{(k)} - \frac{F(x^{(k)})}{F_x(x^{(k)})}$$

uses only one argument value, but requires the value of the derivative of the function as well as the function itself. If $F(x)$ is linear in $x$, the zero is correctly predicted here; otherwise an improved approximation is obtained at best, and the process must be repeated.
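
A minimal sketch of this recursion, assuming an illustrative test function and tolerance that are not taken from the text, might look as follows; since the example is not linear, the step has to be repeated until successive iterates agree.

```python
# Sketch of the Newton recursion x_{k+1} = x_k - F(x_k) / F_x(x_k)
# for a root of F(x) = 0; test function, tolerance, and iteration cap
# are illustrative assumptions.
def newton_root(F, Fx, x0, eps=1e-10, max_iter=50):
    """Iterate until successive approximations differ by less than eps."""
    x = x0
    for _ in range(max_iter):
        x_new = x - F(x) / Fx(x)
        if abs(x_new - x) < eps:
            return x_new
        x = x_new
    return x

# Example: F(x) = x^2 - 2 has the root sqrt(2); F is not linear in x,
# so several repetitions of the step are required.
print(newton_root(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))
```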

Like regula falsi, Newton's recursion formula can also be applied to determining $F_x(x) = 0$, with of course the reservations already stated. The so-called Newton-Raphson rule is then

$$x^{(k+1)} = x^{(k)} - \frac{F_x(x^{(k)})}{F_{xx}(x^{(k)})} \qquad (3.15)$$

If $F(x)$ is not quadratic, the necessary number of iterations must be made until a termination criterion is satisfied. Dixon (1972a) for example uses the condition $|x^{(k+1)} - x^{(k)}| < \varepsilon$ for some small $\varepsilon > 0$.
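
A minimal sketch of rule (3.15) with this kind of termination criterion, using an illustrative non-quadratic objective and hand-coded derivatives (assumptions made for the example, not taken from the text):

```python
# Sketch of rule (3.15): x_{k+1} = x_k - F_x(x_k) / F_xx(x_k),
# applied to the stationarity condition F_x(x) = 0 and terminated
# by the criterion |x_{k+1} - x_k| < eps.
def newton_raphson_min(Fx, Fxx, x0, eps=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        x_new = x - Fx(x) / Fxx(x)
        if abs(x_new - x) < eps:   # termination criterion
            return x_new
        x = x_new
    return x

# Example: F(x) = x^4 - 3 x^2 is not quadratic, so the step must be repeated.
Fx = lambda x: 4.0 * x**3 - 6.0 * x     # first derivative
Fxx = lambda x: 12.0 * x**2 - 6.0       # second derivative
print(newton_raphson_min(Fx, Fxx, x0=1.0))  # converges to sqrt(1.5)
```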
