\frac{\partial f}{\partial \ln M} = |M_0| \frac{\partial f}{\partial M}    (4.52)

which, using the steepest descent method, gives a new value of M according to:

\ln |M| = \ln |M_0| - h |M_0| \frac{\partial f}{\partial M}    (4.53)

Expanding \ln |M| into a Taylor series around M_0 and retaining only the first-order term gives:

M = M_0 - h M_0^2 \frac{\partial f}{\partial M}    (4.54)

which defines the modified direction of the search. When the gradient + derivative method is used, both vectors must be combined according to equation 4.52 and multiplied by |M_0| to obtain the transformed direction of the search.

A scale change of the matrix elements is, in principle, justified mainly if the logarithmic scale is simultaneously used for the dependent variables (γ-ray yields etc.). However, even if the dependent variables are not transformed, the change of scale for the matrix elements, resulting in relative rather than absolute variations, can improve the efficiency of the minimization. A typical situation in which fitting the relative changes is efficient is one in which a strong dependence on a small matrix element determines the stepsize h, common to the whole set, thus inhibiting the modification of much larger matrix elements if absolute changes are used. Using relative changes, however, brings the sensitivity to all matrix elements into a common range, thus improving the simultaneous fit.

The minimum of a logarithmically transformed S function does not, in general, coincide with the minimum of the original least-squares statistic. The minimization procedure therefore uses only the direction of search resulting from the transformation of the dependent variables, if requested, while still monitoring the original S statistic. The transformation of the dependent variables should be switched off when the current solution is close to the minimum of S.
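To make the effect of the scale change concrete, here is a minimal numerical sketch, written in Python rather than GOSIA's own code, comparing the absolute-scale steepest-descent step with the relative (log-scale) step of equation 4.54. The statistic chi2 and all numerical values are hypothetical, chosen only so that a small matrix element dominates the raw gradient.

```python
import numpy as np

# Hypothetical least-squares-like statistic S(M): the small matrix element M[0]
# dominates the raw gradient, as in the situation described above.  The function
# and the numbers are illustrative only and are not taken from GOSIA.
def chi2(m):
    return 1.0e4 * (m[0] - 0.05) ** 2 + (m[1] - 2.0) ** 2

def grad(m, eps=1.0e-6):
    """Central-difference gradient dS/dM."""
    g = np.zeros_like(m)
    for i in range(m.size):
        d = np.zeros_like(m)
        d[i] = eps
        g[i] = (chi2(m + d) - chi2(m - d)) / (2.0 * eps)
    return g

m0 = np.array([0.06, 0.50])   # current matrix elements (illustrative)
h = 1.0e-4                    # stepsize common to the whole set

g = grad(m0)

# Absolute-scale steepest descent:  M = M0 - h * dS/dM
step_abs = -h * g

# Relative (log-scale) steepest descent, Eq. 4.54:  M = M0 - h * M0^2 * dS/dM
step_rel = -h * m0 ** 2 * g

print("gradient components       :", g)
print("absolute-scale step       :", step_abs)
print("relative (log)-scale step :", step_rel)
```

With these illustrative numbers the raw gradient components differ by a factor of about 70, while the M_0^2-scaled components are of comparable size, which is the "common range" effect described above.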
4.5.7 Selection of the Parameters for Minimization

The gradient-type minimization, as used in GOSIA, tends to vary the parameters according to their influence on the least-squares statistic S. This is easily understandable, since the most efficient decrease of S is obtained primarily by varying the parameters displaying the strongest influence, measured by the current magnitudes of the respective components of the gradient. With the stepsize h being common to the whole set of parameters, it is clear that unless the strong dependences are already fitted (which reduces their derivatives), the weak dependences will practically not be activated. This is a serious concern in Coulomb excitation analysis, since the sensitivity of the S function to different matrix elements can vary by orders of magnitude. Attempting to perform the minimization using the full set of matrix elements usually means that most of them come into play only after some number of minimization steps, yet all the necessary derivatives must be calculated from the very beginning, enormously increasing the computation time without any significant improvement over the much faster minimization performed initially for only a subset of matrix elements. To speed up the fitting process, GOSIA offers a wide range of both user-defined and automatic ways of reducing the number of parameters according to the current status of the minimization. The user may decide first to fix some of the matrix elements included in the initial setup but found to have no influence on the processes analyzed. Secondly, the user may specify a subset of the matrix elements to be varied during the current run, overriding the selection made initially. The selection of the free variables for the current run can also be made by the minimization procedure itself, based on the magnitudes of the absolute values of the gradient components evaluated during the first step of the minimization, compared to a user-specified limit. The direction of the search vector, being either a gradient or a gradient + derivative vector, is always normalized to unity, allowing the user to define the limit below which the matrix elements will be locked for the current run if the absolute values of the respective derivatives fall below it. In addition, some precautions are taken against purely numerical effects, most notably against the situation in which a numerical deficiency in evaluating a derivative causes a spurious result. The minimization procedure in GOSIA stops if any of three user-defined conditions is fulfilled: first, the value of the S function has dropped below the user-specified limit; second, the user-specified number of steps has
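The automatic locking based on the normalized search direction, described above, can be pictured with the following minimal sketch. The function name select_free_parameters, the threshold name lock_limit, and the gradient values are all hypothetical and are not GOSIA input options.

```python
import numpy as np

# Minimal sketch of locking weakly contributing matrix elements for the current
# run: the search direction is normalized to unit length and components whose
# absolute value falls below a user-chosen limit are frozen.  All names and
# numbers here are illustrative only.
def select_free_parameters(gradient, lock_limit):
    direction = np.array(gradient, dtype=float)
    direction /= np.linalg.norm(direction)     # search vector normalized to unity
    free = np.abs(direction) >= lock_limit     # True = varied, False = locked
    return free

gradient = [12.0, 0.004, -3.5, 1.0e-6, 0.9]    # illustrative first-step gradient
free = select_free_parameters(gradient, lock_limit=1.0e-3)

print("varied this run :", np.nonzero(free)[0])
print("locked this run :", np.nonzero(~free)[0])
```

With this illustrative gradient, components 1 and 3 fall below the limit after normalization and would be frozen for the current run, while the remaining matrix elements stay free.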
