1 Introduction
1.4. The Curse of Dimensionality

Figure 1.22 Plot of the fraction of the volume of a sphere lying in the range r = 1 − ɛ to r = 1 for various values of the dimensionality D. [Figure: volume fraction versus ɛ, with curves for D = 1, 2, 5, 20.]

Although the curse of dimensionality certainly raises important issues for pattern recognition applications, it does not prevent us from finding effective techniques applicable to high-dimensional spaces. The reasons for this are twofold. First, real data will often be confined to a region of the space having lower effective dimensionality, and in particular the directions over which important variations in the target variables occur may be so confined. Second, real data will typically exhibit some smoothness properties (at least locally) so that for the most part small changes in the input variables will produce small changes in the target variables, and so we can exploit local interpolation-like techniques to allow us to make predictions of the target variables for new values of the input variables. Successful pattern recognition techniques exploit one or both of these properties. Consider, for example, an application in manufacturing in which images are captured of identical planar objects on a conveyor belt, in which the goal is to determine their orientation. Each image is a point

Figure 1.23 Plot of the probability density with respect to radius r of a Gaussian distribution for various values of the dimensionality D. In a high-dimensional space, most of the probability mass of a Gaussian is located within a thin shell at a specific radius. [Figure: p(r) versus r, with curves for D = 1, 2, 20.]
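The behaviour plotted in Figure 1.22 follows directly from the fact that the volume of a D-dimensional sphere of radius r scales as r^D, so the fraction of the volume in the shell between r = 1 − ɛ and r = 1 is 1 − (1 − ɛ)^D. A minimal sketch (the function name is ours, not from the text):

```python
def shell_volume_fraction(eps: float, D: int) -> float:
    """Fraction of a unit D-dimensional sphere's volume lying within
    distance eps of the surface, i.e. in the range r = 1 - eps to r = 1.
    Since volume scales as r**D, this is 1 - (1 - eps)**D."""
    return 1.0 - (1.0 - eps) ** D

# Even a thin shell (eps = 0.1) captures most of the volume once D is large,
# reproducing the trend of the curves for D = 1, 2, 5, 20 in Figure 1.22.
for D in (1, 2, 5, 20):
    print(D, shell_volume_fraction(0.1, D))
```

For D = 20, a shell of thickness ɛ = 0.1 already contains about 88% of the sphere's volume, which is why in high dimensions most of the volume concentrates near the surface.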
