Proceedings of GO 2005, pp. 91–96.

Neutral Data Fitting from an Optimisation Viewpoint

Mirjam Dür 1 and Chris Tofallis 2

1 Department of Mathematics, Darmstadt University of Technology, Schloßgartenstraße 7, D-64289 Darmstadt, Germany, duer@mathematik.tu-darmstadt.de
2 Department of Management Systems, University of Hertfordshire Business School, Hatfield, Hertfordshire AL10 9AB, United Kingdom, c.tofallis@herts.ac.uk

Abstract

Suppose you are interested in estimating the slope of a line fitted to data points. How should you fit the line if you want to treat each variable on the same basis? Least squares regression is inappropriate here, since its purpose is the prediction of one of the variables: if the variables are switched, it provides a different slope estimate. We present a method which gives a unique slope and line, and which is invariant to changes in units of measurement. We attempt to extend the approach to fitting a linear model to multiple variables. In contrast to least squares fitting, our method requires the solution of a global optimisation problem.

Keywords: Data fitting, regression models, fractional programming.

1. Introduction

Multiple Neutral Data Fitting is a method for analysing the relationship between a number of variables, offered as an alternative to the well-known least squares estimation.

Least squares is popular largely because it is so easily computed. From an optimisation standpoint, an unconstrained convex quadratic optimisation problem has to be solved, which is done by setting the partial derivatives of the objective equal to zero. This gives the famous normal equations, which have a nice analytic solution.

Least squares regression does, however, suffer from several shortcomings: it does not possess several properties, listed in [7], that seem desirable for a data fitting method. For example, in order to apply least squares, the user needs to specify which variables are the independent variables and which one is the dependent variable. A change in this setting leads to a completely different least squares estimate. In practice, however, the distinction between explanatory and response variables is not always easy to make.

Another deficiency of least squares fitting is an assumption in the underlying model that may be unrealistic in many situations: the model assumes that only the dependent variable is subject to measurement errors. The independent variables are assumed to be known exactly, a premise that is often not fulfilled.

Multiple Neutral Data Fitting is an approach that avoids these shortcomings. The basic idea is that a different criterion is chosen as the objective of the optimisation problem: instead of minimising the sum of the squares of the residuals, we consider the deviations in each variable and multiply them. As a result, we get a slope estimate different from the least squares solution. The new estimate possesses nice theoretical properties, but comes with a higher computational cost when fitting a model to multiple variables. In contrast to least squares regression, our method requires the solution of a global optimisation problem.
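To make the asymmetry of least squares concrete, the following minimal sketch (Python with NumPy; the simulated data, the function names, and the closed-form "neutral" slope used here are illustrative assumptions, not the authors' implementation) compares the two ordinary least squares slopes obtained by swapping the roles of the variables with a slope estimate that treats both variables alike. For two variables, a well-known estimate of this kind is the geometric mean of the two least squares slopes, i.e. sign(r) · s_y / s_x.

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least squares slope of y regressed on x."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

def neutral_slope(x, y):
    """Illustrative symmetric slope: geometric mean of the two OLS slopes,
    sign(r) * std(y) / std(x).  This treats both variables on the same basis;
    it is shown only as an example of such an estimator, not as the paper's
    multiple-variable criterion."""
    r = np.corrcoef(x, y)[0, 1]
    return np.sign(r) * np.std(y) / np.std(x)

# hypothetical data: both variables observed with noise
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 * x + rng.normal(scale=0.5, size=50)

b_yx = ols_slope(x, y)        # slope from regressing y on x
b_xy = 1.0 / ols_slope(y, x)  # slope implied by regressing x on y
print(b_yx, b_xy)             # the two least squares fits disagree
print(neutral_slope(x, y))    # a single slope, whichever variable is "dependent"
```

Swapping the arguments of neutral_slope returns exactly the reciprocal slope, so the fitted line does not depend on which variable is labelled dependent, whereas the two least squares fits generally describe different lines.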
