
SciPy Reference Guide, Release 0.8.dev

model.beginlogging(filename[, freq]): Enable logging params for each fn evaluation to files named ‘filename.freq.pickle’, ‘filename.(2*freq).pickle’, ...
model.endlogging(): Stop logging param values whenever setparams() is called.
model.clearcache(): Clears the interim results of computations depending on the parameters and the sample.
model.crossentropy(fx[, log_prior_x, base]): Returns the cross entropy H(q, p) of the empirical distribution q of the data with respect to the model p.
model.dual([params, ignorepenalty, ignoretest]): Computes the Lagrangian dual L(theta) of the entropy of the model.
model.fit(K[, algorithm]): Fit the maxent model p whose feature expectations are given by the vector K.
model.grad([params, ignorepenalty]): Computes or estimates the gradient of the entropy dual.
model.log(params): This method is called every iteration during the optimization process.
model.logparams(): Saves the model parameters if logging has been enabled.
model.normconst(): Returns the normalization constant, or partition function, for the current model.
model.reset([numfeatures]): Resets the parameters self.params to zero, clearing the cache variables dependent on them.
model.setcallback([callback, callback_dual, ...]): Sets callback functions to be called every iteration, every function evaluation, or every gradient evaluation.
model.setparams(params): Set the parameter vector to params, replacing the existing parameters.
model.setsmooth(sigma): Specifies that the entropy dual and gradient should be computed with a quadratic penalty term on the magnitude of the parameters.
model.expectations(): The vector E_p[f(X)] under the model p_params of the vector of feature functions f.
model.lognormconst(): Compute the log of the normalization constant (partition function).
model.logpmf(): Returns an array indexed by integers representing the log of the probability mass function over the sample space.
model.pmf_function([f]): Returns the pmf p_theta(x) as a function taking values on the model’s sample space.
model.setfeaturesandsamplespace(f, samplespace): Creates a new matrix self.F of features f of all points in the sample space.
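
The methods summarized above combine into a complete fit on a small discrete problem. Below is a minimal sketch, loosely modelled on the machine-translation example that accompanies the package: the sample space, the feature functions f0, f1, f2 and the target expectations K are illustrative values, not part of this reference. Note that scipy.maxentropy shipped with SciPy 0.8 but was removed from later SciPy releases, so the sketch assumes an old installation.

import numpy as np
from scipy import maxentropy   # shipped with SciPy 0.8; removed from later releases

# Illustrative discrete sample space and feature functions (hypothetical values).
samplespace = ['dans', 'en', 'a', 'au cours de', 'pendant']

def f0(x):
    return x in samplespace            # normalization feature: E f0(X) = 1

def f1(x):
    return x == 'dans' or x == 'en'

def f2(x):
    return x == 'dans' or x == 'a'

f = [f0, f1, f2]
model = maxentropy.model(f, samplespace)

# Target feature expectations K; fit() adjusts model.params so that the
# model expectations E_p[f(X)] match K.
K = [1.0, 0.3, 0.5]
model.fit(K)

print(model.params)                    # fitted parameter vector theta
print(model.expectations())            # should be close to K
p = np.exp(model.logpmf())             # pmf over the sample space
for x, px in zip(samplespace, p):
    print("%-12s %.4f" % (x, px))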

beginlogging(filename, freq=10)
Enable logging params for each fn evaluation to files named ‘filename.freq.pickle’, ‘filename.(2*freq).pickle’, ... each ‘freq’ iterations.

endlogging()
Stop logging param values whenever setparams() is called.
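
As a sketch of how the logging pair might be used in practice, the snippet below brackets a call to fit(); the filename stem ‘mymodel’ and the frequency are arbitrary choices, and model and K are assumed to be set up as in the earlier sketch.

# Write the parameter vector every 5 function evaluations to
# 'mymodel.5.pickle', 'mymodel.10.pickle', ... while fitting.
model.beginlogging('mymodel', freq=5)
model.fit(K)
model.endlogging()    # stop dumping snapshots on later setparams() calls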

clearcache()
Clears the interim results of computations depending on the parameters and the sample.

crossentropy(fx, log_prior_x=None, base=2.7182818284590451)
Returns the cross entropy H(q, p) of the empirical distribution q of the data (with the given feature matrix fx) with respect to the model p. For discrete distributions this is defined as:

H(q, p) = - n^{-1} sum_{j=1}^n log p(x_j)

where x_j are the data elements assumed drawn from q whose features are given by the matrix fx = {f(x_j)}, j=1,...,n.

The ‘base’ argument specifies the base of the logarithm, which defaults to e.

For continuous distributions this makes no sense!
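
To illustrate the signature above, the following sketch evaluates the cross entropy of some hypothetical held-out observations under the fitted model from the earlier sketch. The data list is invented, and the orientation of fx (one row per feature function, one column per observation) is an assumption about the layout of the feature matrix; the text only says that fx holds the features f(x_j) of the data points.

import numpy as np

# Hypothetical held-out observations drawn from the same sample space.
data = ['dans', 'en', 'dans', 'pendant', 'a']

# Assumed layout of fx = {f(x_j)}: rows are feature functions, columns are
# observations, using the feature list f from the earlier sketch.
fx = np.array([[fi(x) for x in data] for fi in f], dtype=float)

H = model.crossentropy(fx, base=2)     # cross entropy in bits
print(H)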

dual(params=None, ignorepenalty=False, ignoretest=False)
Computes the Lagrangian dual L(theta) of the entropy of the model, for the given vector theta=params. Minimizing this function (without constraints) should fit the maximum entropy model subject to the given constraints. These constraints are specified as the desired (target) values self.K for the expectations of the feature statistic.
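
A quick sanity check, assuming the model has already been fitted as in the earlier sketch (so that the targets self.K are set): fit() works by minimizing this dual, so at the fitted parameters the gradient returned by grad() should be close to zero.

import numpy as np

# fit(K) minimizes the entropy dual, so at the fitted parameters the
# objective is at a minimum and its gradient is (numerically) close to zero.
print(model.dual())                    # L(theta) at the current parameters
print(np.linalg.norm(model.grad()))    # should be near zero after fitting
print(model.normconst())               # partition function Z for the current model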

