Applied Statistics Using SPSS, STATISTICA, MATLAB and R

6 Statistical Classification

The decision function d(x) divides (categorises) ℜ² into two decision regions: the upper half plane, corresponding to d(x) > 0, where feature vectors are assigned to ω1; and the lower half plane, corresponding to d(x) < 0, where feature vectors are assigned to ω2. The classification is arbitrary for d(x) = 0.
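As a concrete illustration of this two-region rule, the following is a minimal Python sketch (the book's own examples use SPSS, STATISTICA, MATLAB and R; the weight values below are invented for illustration, not taken from the text):

```python
import numpy as np

# Illustrative linear decision function d(x) = w1*x1 + w2*x2 + w0.
# The weights and bias here are made-up values for this sketch.
w = np.array([1.0, -1.0])   # weight vector (w1, w2)
w0 = 0.5                    # bias term

def classify(x):
    """Assign a 2-D feature vector to omega1 (d(x) > 0) or omega2 (d(x) < 0)."""
    d = w @ x + w0
    if d > 0:
        return "omega1"
    elif d < 0:
        return "omega2"
    return "arbitrary"      # points exactly on the boundary d(x) = 0

print(classify(np.array([2.0, 0.0])))   # d = 2.5 > 0  -> omega1
print(classify(np.array([0.0, 2.0])))   # d = -1.5 < 0 -> omega2
```

Any point for which d(x) is exactly zero lies on the decision boundary itself and, as noted above, its classification is arbitrary.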

[Figure 6.1: scatter plot in the (x1, x2) plane; class ω1 cases are plotted as crosses and class ω2 cases as circles, separated by a linear decision boundary with a + side (ω1) and a − side (ω2).]

Figure 6.1. Two classes of cases described by two-dimensional feature vectors (random variables X1 and X2). The black dots are class means.

The generalisation of the linear decision function for a d-dimensional feature space in ℜ^d is straightforward:

    d(x) = w’x + w0,    6.2

where w’x represents the dot product¹ of the weight vector and the d-dimensional feature vector.

The root set of d(x) = 0, the decision surface, or discriminant, is now a linear surface in ℜ^d, called a linear discriminant or hyperplane.
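Evaluating equation 6.2 for a d-dimensional feature vector is a single dot product plus a bias term. A Python sketch (the weight vector and feature vector below are arbitrary placeholders chosen for illustration):

```python
import numpy as np

# Sketch of the d-dimensional linear discriminant d(x) = w'x + w0
# of equation 6.2; the numeric values are illustrative only.
def linear_discriminant(x, w, w0):
    """Evaluate d(x) = w'x + w0 for a d-dimensional feature vector x."""
    return np.dot(w, x) + w0   # w'x is the dot product of w and x

w = np.array([0.5, -1.0, 2.0])   # a weight vector in R^3 (illustrative)
w0 = -0.25
x = np.array([1.0, 1.0, 0.5])

print(linear_discriminant(x, w, w0))   # 0.5 - 1.0 + 1.0 - 0.25 = 0.25
```

The sign of the returned value determines the class assignment exactly as in the two-dimensional case.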

Besides the simple linear discriminants, one can also consider using more complex decision functions. For instance, Figure 6.2 illustrates an example of two-dimensional classes separated by a decision boundary obtained with a quadratic decision function:

    d(x) = w5 x1² + w4 x2² + w3 x1 x2 + w2 x2 + w1 x1 + w0.    6.3
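Equation 6.3 can be evaluated directly once the six coefficients are fixed. A short Python sketch (the coefficients below are invented for illustration and do not come from the text):

```python
# Sketch of the quadratic decision function of equation 6.3:
#   d(x) = w5*x1^2 + w4*x2^2 + w3*x1*x2 + w2*x2 + w1*x1 + w0.
# The coefficient values are illustrative placeholders.
def quadratic_discriminant(x1, x2, w):
    w0, w1, w2, w3, w4, w5 = w
    return w5 * x1**2 + w4 * x2**2 + w3 * x1 * x2 + w2 * x2 + w1 * x1 + w0

w = (1.0, -2.0, 0.5, 1.0, 1.0, 1.0)   # (w0, w1, w2, w3, w4, w5), illustrative
print(quadratic_discriminant(1.0, 2.0, w))   # 1 + 4 + 2 + 1 - 2 + 1 = 7.0
```

As with the linear case, the sign of d(x) determines the decision region, but the boundary d(x) = 0 is now a conic curve rather than a straight line.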

Linear decision functions are quite popular, as they are easier to compute and lend themselves to a simpler statistical analysis. For this reason, in what follows we only deal with linear discriminants.

¹ The dot product x’y is obtained by adding the products of the corresponding elements of the two vectors x and y.

