Machine Learning 3. Nearest Neighbor and Kernel Methods - ISMLL
<strong>Machine</strong> <strong>Learning</strong> / 2. k-<strong>Nearest</strong> <strong>Neighbor</strong> Method<br />
Decision Boundaries<br />
For 1-nearest neighbor, the predictor space is partitioned into<br />
regions of points that are closest to a given data point:<br />
region_D(x_1), region_D(x_2), . . . , region_D(x_n)<br />
with<br />
region_D(x) := {x′ ∈ X | d(x′, x) ≤ d(x′, x′′) ∀(x′′, y′′) ∈ D}<br />
These regions are often called cells, and the whole partition a<br />
Voronoi tessellation.<br />
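The cell membership test follows directly from the definition above: a query point x′ lies in region_D(x_i) exactly when x_i is its nearest training point. A minimal sketch in Python (the toy data points D and the Euclidean choice of d are illustrative assumptions, not from the slides):<br />

```python
import numpy as np

# Hypothetical training data D: three points in X = R^2.
D = np.array([[0.2, 0.3],
              [0.7, 0.8],
              [0.5, 0.1]])

def region_index(x, D):
    """Return the index i such that x lies in the Voronoi cell
    region_D(x_i), i.e. d(x, x_i) <= d(x, x_j) for all j.
    Here d is the Euclidean distance (an assumption)."""
    dists = np.linalg.norm(D - x, axis=1)  # d(x, x_i) for every i
    return int(np.argmin(dists))

# Each query falls into the cell of its nearest training point:
print(region_index(np.array([0.25, 0.25]), D))  # -> 0
print(region_index(np.array([0.65, 0.75]), D))  # -> 1
```

For 1-nearest-neighbor classification, the prediction for x is then simply the label y_i stored with the training point x_i whose cell contains x.<br />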
Lars Schmidt-Thieme, Information Systems <strong>and</strong> <strong>Machine</strong> <strong>Learning</strong> Lab (<strong>ISMLL</strong>), Institute BW/WI & Institute for Computer Science, University of Hildesheim<br />
Course on <strong>Machine</strong> <strong>Learning</strong>, winter term 2007 16/48<br />
[Figure: Voronoi tessellation of example data in the predictor space, axes x1 and x2 ranging from 0.0 to 1.0.]<br />