

5.4.1. Summary of Fletcher-Reeves Strategy. The unconstrained, nonlinear programming problem is:

$$\min Q(\mathbf{x}) = F(x_1, x_2, \ldots, x_N), \tag{5.76}$$

where $\mathbf{x}$ is a vector composed of $N$ variables. The process is easily visualized by inspection of Figure 5.3. This objective function and its gradient $\nabla Q$ must be added to the BASIC language computer code provided. The gradient is:

$$\nabla Q = \mathbf{g}(\mathbf{x}) = \left( \frac{\partial F}{\partial x_1}, \frac{\partial F}{\partial x_2}, \ldots, \frac{\partial F}{\partial x_N} \right)^{T}. \tag{5.77}$$
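For concreteness, an objective and its analytic gradient per (5.77) might be coded as in the sketch below. This is Python rather than the book's BASIC, and the two-variable Rosenbrock test function is an illustrative assumption, not the book's example.

```python
import numpy as np

# Illustrative objective (an assumption, not the book's example):
# the two-variable Rosenbrock function, a standard nonlinear test case.
def F(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Its analytic gradient g(x), written out per (5.77).
def grad_F(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
```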

The gradient may be described analytically, if available, or found numerically by 0.01% finite differences. The user should heed an "awful warning" concerning excessive numerical noise, which can occur if a named variable is inadvertently declared an integer rather than a floating-point number. The resulting discontinuous behavior of the objective function will have a disastrous effect on partial derivatives obtained by finite differences. Almost all gradient optimizer programs will appear unacceptably sluggish under these circumstances.
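A minimal sketch of such a 0.01% finite-difference gradient, again in Python for illustration (the function and parameter names are assumptions):

```python
import numpy as np

def gradient_fd(F, x, rel_step=1e-4):
    """Approximate the gradient of F at x by forward differences.

    rel_step = 1e-4 is the 0.01% perturbation mentioned in the
    text; an absolute floor protects components that are zero.
    """
    # Force floating point: the "awful warning" above is precisely
    # that an integer-typed variable truncates the perturbation,
    # making the differenced objective discontinuous and noisy.
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    F0 = F(x)
    for i in range(x.size):
        h = rel_step * max(abs(x[i]), 1.0)
        xp = x.copy()
        xp[i] += h
        g[i] = (F(xp) - F0) / h
    return g
```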

Given an initial starting vector, $\mathbf{x}_0$, a sequence of linear (line) searches,

$$\mathbf{x}_{i+1} = \mathbf{x}_i + a_i \mathbf{s}_i, \tag{5.78}$$

is performed in a calculated direction $\mathbf{s}_i$ in the variable $a_i$. Each search terminates when a minimum is approximated, so that the directional derivative is nearly zero:

$$\mathbf{s}_i^{T} \mathbf{g}(\mathbf{x}_{i+1}) \approx 0. \tag{5.79}$$

The comprehensive procedure to accomplish reasonably accurate line searches on arbitrary functions of $a_i$ was discussed in Section 5.3.
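The Section 5.3 procedure is not reproduced in this excerpt; the sketch below substitutes a plain bracket-and-bisection on the sign of the directional derivative, which enforces the same stopping rule (5.79). All names, bounds, and tolerances are assumptions.

```python
import numpy as np

def line_search(grad, x, s, a_max=1.0, tol=1e-6):
    """Choose a step a > 0 along direction s so that the
    directional derivative s . g(x + a*s) is nearly zero (5.79)."""
    def dQ(a):
        return float(np.dot(s, grad(x + a * s)))

    lo, hi = 0.0, a_max
    while dQ(hi) < 0.0 and hi < 1e6:   # still descending: widen bracket
        lo, hi = hi, 2.0 * hi
    a = 0.5 * (lo + hi)
    for _ in range(60):                # bisect on the sign of dQ
        d = dQ(a)
        if abs(d) < tol:               # directional derivative ~ 0
            break
        if d < 0.0:
            lo = a
        else:
            hi = a
        a = 0.5 * (lo + hi)
    return a
```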

The first linear search direction is the negative gradient (steepest descent), i.e., with $\beta_1 = 0$ in the direction formula

$$\mathbf{s}_i = -\mathbf{g}_i + \beta_i \mathbf{s}_{i-1}, \qquad i = 1, 2, \ldots, N. \tag{5.80}$$

This describes a sequence of directions calculated after estimating each linear search minimum. The new search direction is simply the negative gradient plus a fraction of the just-used search direction. The fraction is:

$$\beta_1 = 0; \qquad \beta_i = \frac{\|\mathbf{g}_i\|^2}{\|\mathbf{g}_{i-1}\|^2}, \qquad i = 2, 3, \ldots, N, \tag{5.81}$$

where the squared-norm notation

$$\|\mathbf{g}_i\|^2 = \mathbf{g}_i^{T} \mathbf{g}_i \tag{5.82}$$

defines an inner product. It is seen from (5.80) that certain curvature information is accumulated for influencing the choice of subsequent search directions.
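Putting (5.78) through (5.81) together, a compact Fletcher-Reeves driver might look like the following sketch, which reuses the hypothetical gradient_fd and line_search helpers above; the restart to steepest descent after $N$ inner searches follows the $i = 1, 2, \ldots, N$ indexing of (5.80) and (5.81).

```python
import numpy as np

def fletcher_reeves(grad, x0, n_cycles=20, gtol=1e-10):
    """Minimize the objective whose gradient is grad(x), starting
    from x0, using the Fletcher-Reeves strategy (5.78)-(5.81)."""
    x = np.asarray(x0, dtype=float)
    N = x.size
    for _ in range(n_cycles):
        g = grad(x)
        s = -g                                 # (5.80) with beta_1 = 0
        for _ in range(N):                     # at most N directions,
            a = line_search(grad, x, s)        # then restart; step (5.78)
            x = x + a * s
            g_new = grad(x)
            if np.dot(g_new, g_new) < gtol:    # converged: ||g||^2 small
                return x
            beta = np.dot(g_new, g_new) / np.dot(g, g)   # (5.81), (5.82)
            s = -g_new + beta * s              # (5.80)
            g = g_new
    return x
```

With the illustrative Rosenbrock objective above, a call such as fletcher_reeves(grad_F, [-1.2, 1.0]), or fletcher_reeves(lambda x: gradient_fd(F, x), [-1.2, 1.0]) when only finite differences are available, should progress toward the minimum at (1, 1). The periodic restart matters because the $\beta_i$ recurrence is exact only under the quadratic assumption noted next.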

This strategy was developed on the assumption of quadratic functions where
