

function is known to be quadratic, as (5.9) for example, then y'''(x) in (5.28) will be zero anyhow. The reader should understand this single-variable case before proceeding. The multivariable case is formulated in exactly the same way.

The multivariable function in (5.6) can be expanded by a Taylor series about point p, where the displacement from p is

$$ \Delta x = x - p. \qquad (5.29) $$

The vector p might be the location of the blind man standing at p = (10, 10)^T in Figure 5.5. Then the Taylor series for a real function of the vector x is

$$ F(\Delta x) = F(p) + g(p)^T \Delta x + \tfrac{1}{2}\,\Delta x^T H(p)\,\Delta x + \text{h.o.t.}, \qquad (5.30) $$

where higher-order terms (h.o.t.) are presumed to be insignificant. Matrix H is known as the Hessian:

$$ H = \begin{pmatrix} \dfrac{\partial^2 F}{\partial x_1^2} & \dfrac{\partial^2 F}{\partial x_1\,\partial x_2} \\[2.5ex] \dfrac{\partial^2 F}{\partial x_2\,\partial x_1} & \dfrac{\partial^2 F}{\partial x_2^2} \end{pmatrix}. \qquad (5.31) $$
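
As a concrete illustration of (5.30) and (5.31), the sketch below estimates the gradient and Hessian of a two-variable function by central differences and assembles the quadratic model about a point. The function F, the point p, and the step sizes h are arbitrary illustrative choices, not the sample function of (5.7); for a quadratic F the model reproduces F(p + Δx) up to finite-difference rounding.

```python
# Central-difference gradient and Hessian, and the quadratic model of (5.30).
import numpy as np

def F(x):
    # An arbitrary quadratic in two variables, used only for illustration.
    return x[0]**2 + 3.0*x[1]**2 + x[0]*x[1] + 2.0*x[0] - x[1]

def gradient(F, p, h=1e-5):
    # Central-difference estimate of g(p).
    g = np.zeros(2)
    for i in range(2):
        e = np.zeros(2)
        e[i] = h
        g[i] = (F(p + e) - F(p - e)) / (2.0 * h)
    return g

def hessian(F, p, h=1e-4):
    # Central-difference estimate of the Hessian (5.31).
    H = np.zeros((2, 2))
    for i in range(2):
        for j in range(2):
            ei = np.zeros(2); ei[i] = h
            ej = np.zeros(2); ej[j] = h
            H[i, j] = (F(p + ei + ej) - F(p + ei - ej)
                       - F(p - ei + ej) + F(p - ei - ej)) / (4.0 * h * h)
    return H

def quadratic_model(F, p, dx):
    # F(p) + g(p)^T dx + 0.5 dx^T H(p) dx, with h.o.t. dropped as in (5.30).
    g, H = gradient(F, p), hessian(F, p)
    return F(p) + g @ dx + 0.5 * dx @ H @ dx

p = np.array([10.0, 10.0])
dx = np.array([-1.0, 2.0])
print(quadratic_model(F, p, dx))  # 627.0 (to rounding), the exact F(p + dx)
print(F(p + dx))                  # 627.0
```

Because the illustrative F is quadratic, the model is exact; for a general function the agreement degrades as the step grows and the higher-order terms become significant.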

By differentiating (5.13), it is seen that H = A for a quadratic function. It is thus possible to expand the quadratic sample function in (5.7) about an arbitrary point, say p = (10, 10)^T. The result in terms of (5.29) is

$$ F(\Delta x) = 292 + (100,\,28)\,\Delta x + \tfrac{1}{2}\,\Delta x^T \begin{pmatrix} 26 & -10 \\ -10 & 26 \end{pmatrix} \Delta x, \qquad (5.32) $$

where Δx₁ = x₁ − 10 and Δx₂ = x₂ − 10. This describes the function in Figure 5.5 with respect to point p. For quadratic functions, this is the same as shifting the origin; the reader should replace x₁ by x₁ + 10 and x₂ by x₂ + 10 in (5.8) and confirm that it is equivalent to (5.32).
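
Since (5.8) itself does not appear on this page, the following sympy sketch works from the expansion (5.32) alone: substituting Δx = x − p recovers a stand-alone quadratic in x₁ and x₂ that is consistent with the sample function, and shifting the origin back to p (replacing x₁ by x₁ + 10 and x₂ by x₂ + 10) returns exactly (5.32), mirroring the check the text asks the reader to make against (5.8).

```python
# Origin-shift check built only from the expansion (5.32); the recovered
# quadratic F_of_x is consistent with the sample function but is inferred
# here, not copied from (5.8).
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
H = sp.Matrix([[26, -10], [-10, 26]])   # Hessian from (5.32)
g = sp.Matrix([100, 28])                # gradient at p from (5.32)
p = sp.Matrix([10, 10])

x = sp.Matrix([x1, x2])
dx = x - p
F_of_x = sp.expand(292 + (g.T * dx)[0] + sp.Rational(1, 2) * (dx.T * H * dx)[0])
print(F_of_x)   # stand-alone quadratic in x1, x2

# Replace x1 by x1 + 10 and x2 by x2 + 10, i.e. shift the origin to p ...
shifted = sp.expand(F_of_x.subs({x1: x1 + 10, x2: x2 + 10}))
# ... and compare against (5.32) written out with dx renamed (x1, x2):
expansion = sp.expand(292 + 100*x1 + 28*x2
                      + sp.Rational(1, 2)*(26*x1**2 - 20*x1*x2 + 26*x2**2))
print(sp.simplify(shifted - expansion))   # prints 0
```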

Analogous to (5.13), the gradient of (5.30) is

$$ \nabla F(\Delta x) = g(p) + H(p)\,\Delta x. \qquad (5.33) $$

So the blind man on a quadratic mountain at point p (Figure 5.5) could calculate where the minimum should be with respect to that point. Setting (5.33) to zero and solving for Δx, in a manner similar to (5.19), the step to the minimum is

$$ \Delta x = -H^{-1}(p)\,g(p). \qquad (5.34) $$

Note that the second derivatives in H must be known. For the central sample function used as an example, the step from point p = (10, 10)^T to the minimum is

$$ \Delta x = -H^{-1} g = \frac{1}{576} \begin{pmatrix} 26 & 10 \\ 10 & 26 \end{pmatrix} \begin{pmatrix} -100 \\ -28 \end{pmatrix} = \begin{pmatrix} -5 \\ -3 \end{pmatrix}. \qquad (5.35) $$

See Figure 5.5 to confirm this step.
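
The arithmetic of (5.34) and (5.35) is easy to check with a few lines of numpy, using only the Hessian from (5.32) and the gradient g(p) = (100, 28)^T given above:

```python
# Newton step at p = (10, 10)^T, checked against (5.35).
import numpy as np

H = np.array([[26.0, -10.0],
              [-10.0, 26.0]])     # Hessian from (5.32)
g = np.array([100.0, 28.0])       # gradient at p from (5.32)
p = np.array([10.0, 10.0])

dx = -np.linalg.solve(H, g)       # same step as -H^{-1} g, without forming H^{-1}
print(dx)                         # [-5. -3.]

# The explicit inverse written in (5.35): H^{-1} = (1/576) [[26, 10], [10, 26]]
print(np.linalg.inv(H) * 576.0)   # [[26. 10.] [10. 26.]]

print(p + dx)                     # predicted minimum at [5. 7.]
```

Solving the linear system is generally preferable to forming H^{-1} explicitly, though for a 2 x 2 example either route gives the step (−5, −3)^T and places the predicted minimum at (5, 7)^T, as Figure 5.5 should confirm.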
