
SciPy Reference Guide, Release 0.8.dev

Other Parameters:

gcalls
    [int] Number of gradient calls made.
hcalls
    [int] Number of hessian calls made.
warnflag
    [int] Warnings generated by the algorithm. 1 : Maximum number of iterations
    exceeded.
allvecs
    [list] The result at each iteration, if retall is True (see below).
avextol
    [float] Convergence is assumed when the average relative error in the minimizer
    falls below this amount.
maxiter
    [int] Maximum number of iterations to perform.
full_output
    [bool] If True, return the optional outputs.
disp
    [bool] If True, print convergence message.
retall
    [bool] If True, return a list of results at each iteration.
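As an illustrative sketch (not part of the original text), the optional outputs above can be seen by calling fmin_ncg with full_output and retall enabled, here on the Rosenbrock test function that ships with scipy.optimize:

```python
import numpy as np
from scipy.optimize import fmin_ncg, rosen, rosen_der

x0 = np.array([-1.2, 1.0])

# With full_output=True the solver returns xopt, fopt, fcalls, gcalls,
# hcalls and warnflag; retall=True appends allvecs, the iterate at each step.
xopt, fopt, fcalls, gcalls, hcalls, warnflag, allvecs = fmin_ncg(
    rosen, x0, fprime=rosen_der, full_output=True, retall=True, disp=False)
```

A warnflag of 0 indicates normal convergence; allvecs[-1] is the final iterate xopt.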

Notes

1. Only one of fhess_p or fhess needs to be given. If fhess is provided, then fhess_p
will be ignored. If neither fhess nor fhess_p is provided, then the hessian product
will be approximated using finite differences on fprime. fhess_p must compute the
hessian times an arbitrary vector. If it is not given, finite-differences on fprime
are used to compute it. See Wright and Nocedal, 'Numerical Optimization', 1999,
pg. 140.

2. Check OpenOpt - a tool which offers a unified syntax to call this and other
solvers with the possibility of automatic differentiation.
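Note 1 can be sketched in code (an illustration, not from the original text): scipy.optimize ships rosen_hess_prod, a Hessian-times-vector routine for the Rosenbrock function, which matches the fhess_p calling convention:

```python
import numpy as np
from scipy.optimize import fmin_ncg, rosen, rosen_der, rosen_hess_prod

x0 = np.array([-1.2, 1.0])

# fhess_p(x, p) returns the Hessian at x multiplied by the vector p, so the
# full Hessian never has to be formed or inverted explicitly.
xopt = fmin_ncg(rosen, x0, fprime=rosen_der, fhess_p=rosen_hess_prod,
                avextol=1e-8, disp=False)
```

Omitting fhess_p entirely would also work; the solver would then approximate the Hessian product by finite differences on fprime, as the note describes.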

leastsq(func, x0, args=(), Dfun=None, full_output=0, col_deriv=0, ftol=1.49012e-08,
        xtol=1.49012e-08, gtol=0.0, maxfev=0, epsfcn=0.0, factor=100, diag=None,
        warning=True)

Minimize the sum of squares of a set of equations.

    x = arg min(sum(func(y)**2, axis=0))
           y

Parameters

func : callable
    Should take at least one (possibly length N vector) argument and return M
    floating point numbers.
x0 :
    The starting estimate for the minimization.
args :
    Any extra arguments to func are placed in this tuple.
Dfun : callable
    A function or method to compute the Jacobian of func with derivatives across
    the rows. If this is None, the Jacobian will be estimated.
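A minimal sketch of these parameters in use (the residual function and data below are illustrative, not from the original text), fitting a straight line y = a*x + b by least squares:

```python
import numpy as np
from scipy.optimize import leastsq

def residuals(p, x, y):
    # func: takes the parameter vector p (length N=2) plus the extra
    # arguments passed via args, and returns M residuals.
    a, b = p
    return y - (a * x + b)

x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + 1.0          # noise-free data generated from a=2, b=1
p0 = [0.0, 0.0]            # x0: the starting estimate

# With Dfun=None the Jacobian is estimated by finite differences.
popt, ier = leastsq(residuals, p0, args=(x, y))
```

With full_output=0 (the default), leastsq returns the solution and an integer flag ier; a value of ier between 1 and 4 indicates that a solution was found.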

3.12. Optimization and root finding (scipy.optimize)
