

The first four algorithms are unconstrained minimization algorithms (fmin: Nelder-Mead simplex, fmin_bfgs: BFGS, fmin_ncg: Newton Conjugate Gradient, and leastsq: Levenberg-Marquardt). The last algorithm actually finds the roots of a general function of possibly many variables. It is included in the optimization package because at the (non-boundary) extreme points of a function, the gradient is equal to zero.

1.5.1 Nelder-Mead Simplex algorithm (fmin)

The simplex algorithm is probably the simplest way to minimize a fairly well-behaved function. The simplex algorithm requires only function evaluations and is a good choice for simple minimization problems. However, because it does not use any gradient evaluations, it may take longer to find the minimum. To demonstrate the minimization function, consider the problem of minimizing the Rosenbrock function of N variables:

f(x) = \sum_{i=1}^{N-1} 100\left(x_i - x_{i-1}^2\right)^2 + \left(1 - x_{i-1}\right)^2.

The minimum value of this function is 0, which is achieved when x_i = 1. This minimum can be found using the fmin routine as shown in the example below:

>>> from scipy.optimize import fmin
>>> def rosen(x):
...     """The Rosenbrock function"""
...     return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)
>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> xopt = fmin(rosen, x0, xtol=1e-8)
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 339
         Function evaluations: 571
>>> print xopt
[ 1.  1.  1.  1.  1.]

Another optimization algorithm that needs only function calls to find the minimum is Powell's method, available as fmin_powell.
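As a sketch, and assuming the rosen function from the example above is still defined, the same problem could be handed to fmin_powell in much the same way:

>>> from scipy.optimize import fmin_powell
>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> xopt = fmin_powell(rosen, x0, xtol=1e-8)  # rosen as defined above
>>> # xopt should again be close to [ 1.  1.  1.  1.  1.]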

1.5.2 Broyden-Fletcher-Goldfarb-Shanno algorithm (fmin_bfgs)

In order to converge more quickly to the solution, this routine uses the gradient of the objective function. If the gradient is not given by the user, then it is estimated using first-differences. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) method typically requires fewer function calls than the simplex algorithm even when the gradient must be estimated. To demonstrate this algorithm, the Rosenbrock function is again used. The gradient of the Rosenbrock function is the vector:

\frac{\partial f}{\partial x_j} = \sum_{i=1}^{N} 200\left(x_i - x_{i-1}^2\right)\left(\delta_{i,j} - 2x_{i-1}\delta_{i-1,j}\right) - 2\left(1 - x_{i-1}\right)\delta_{i-1,j}
= 200\left(x_j - x_{j-1}^2\right) - 400x_j\left(x_{j+1} - x_j^2\right) - 2\left(1 - x_j\right).

This expression is valid for the interior derivatives. Special cases are

\frac{\partial f}{\partial x_0} = -400x_0\left(x_1 - x_0^2\right) - 2\left(1 - x_0\right),
\frac{\partial f}{\partial x_{N-1}} = 200\left(x_{N-1} - x_{N-2}^2\right).
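These formulas can be coded by hand. Below is a sketch (not the exact listing from this guide) that assumes NumPy and the rosen function from the previous example are available; the helper name rosen_der is chosen purely for illustration. The resulting gradient function can then be passed to fmin_bfgs through its fprime argument:

>>> from numpy import zeros_like
>>> from scipy.optimize import fmin_bfgs
>>> def rosen_der(x):
...     # x arrives as a NumPy array, so slicing and elementwise operations work
...     der = zeros_like(x)
...     # interior derivatives: 200(x_j - x_{j-1}^2) - 400 x_j (x_{j+1} - x_j^2) - 2(1 - x_j)
...     der[1:-1] = (200*(x[1:-1] - x[:-2]**2)
...                  - 400*x[1:-1]*(x[2:] - x[1:-1]**2) - 2*(1 - x[1:-1]))
...     # special cases at the two ends
...     der[0] = -400*x[0]*(x[1] - x[0]**2) - 2*(1 - x[0])
...     der[-1] = 200*(x[-1] - x[-2]**2)
...     return der
>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> xopt = fmin_bfgs(rosen, x0, fprime=rosen_der)

Supplying the analytic gradient in this way avoids the first-difference estimation described above and typically reduces the number of function evaluations.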
