
Combining Pattern Classifiers


FUNDAMENTALS OF PATTERN RECOGNITION

Fig. 1.10 Rotated check-board data (100,000 points in each plot).

The Matlab code for N data points is:

    function [d,labd] = gendatcb(N,a,alpha)
    % Generate N points uniformly distributed on the unit square
    d = rand(N,2);
    % Rotate the points by angle alpha
    d_transformed = [d(:,1)*cos(alpha)-d(:,2)*sin(alpha), ...
        d(:,1)*sin(alpha)+d(:,2)*cos(alpha)];
    % Index of the checkerboard cell (of side a) containing each rotated point
    s = ceil(d_transformed(:,1)/a)+floor(d_transformed(:,2)/a);
    % Assign label 1 or 2 according to the parity of the cell index
    labd = 2-mod(s,2);
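An equivalent sketch in Python/NumPy (the function name and `rng` seeding argument are ours, not from the book) follows the same steps: draw uniform points, rotate them, and label by the parity of the checkerboard cell index:

```python
import numpy as np

def gendatcb(N, a, alpha, rng=None):
    """Rotated check-board data: N uniform points on the unit square,
    labeled 1 or 2 by the cell (of side a) they fall into after a
    rotation by angle alpha."""
    rng = np.random.default_rng(rng)
    d = rng.random((N, 2))
    # Rotate the points by angle alpha
    xt = d[:, 0] * np.cos(alpha) - d[:, 1] * np.sin(alpha)
    yt = d[:, 0] * np.sin(alpha) + d[:, 1] * np.cos(alpha)
    # Cell index and its parity determine the label (1 or 2)
    s = np.ceil(xt / a) + np.floor(yt / a)
    labd = 2 - np.mod(s, 2)
    return d, labd.astype(int)
```

With `alpha = 0` the cells align with the axes; a nonzero `alpha` rotates the checkerboard pattern as in Figure 1.10.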

1.5.4 Discriminant Functions and Decision Boundaries

The class with the highest posterior probability is the most natural choice for a given $\mathbf{x}$. Therefore the posterior probabilities can be used directly as the discriminant functions, that is,

$g_i(\mathbf{x}) = P(\omega_i \mid \mathbf{x}), \quad i = 1, \ldots, c$  (1.40)

Hence we rewrite the maximum membership rule (1.3) as

$D(\mathbf{x}) = \omega_{i^*} \in \Omega \iff P(\omega_{i^*} \mid \mathbf{x}) = \max_{i=1,\ldots,c} \{ P(\omega_i \mid \mathbf{x}) \}$  (1.41)

In fact, a set of discriminant functions leading to the same classification regions would be

$g_i(\mathbf{x}) = P(\omega_i)\, p(\mathbf{x} \mid \omega_i), \quad i = 1, \ldots, c$  (1.42)
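The equivalence of (1.40) and (1.42) under the rule (1.41) can be checked numerically: the posterior $P(\omega_i \mid \mathbf{x})$ differs from $P(\omega_i)\,p(\mathbf{x} \mid \omega_i)$ only by the normalizing factor $p(\mathbf{x})$, which does not depend on $i$, so the argmax is the same. A minimal sketch with two classes and Gaussian class-conditional densities (the priors and parameters below are illustrative assumptions, not from the book):

```python
from math import exp, sqrt, pi

def gauss(x, mu, sigma):
    """One-dimensional Gaussian density p(x | omega_i)."""
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

# Assumed priors and class-conditional parameters (illustration only)
priors = [0.6, 0.4]
mus = [0.0, 2.0]
sigmas = [1.0, 1.0]

x = 1.4
# Eq. (1.42): g_i(x) = P(omega_i) * p(x | omega_i)
g = [priors[i] * gauss(x, mus[i], sigmas[i]) for i in range(2)]
# Eq. (1.40): posteriors, obtained by normalizing with p(x) = sum_i g_i(x)
post = [gi / sum(g) for gi in g]

# The maximum membership rule (1.41) picks the same class either way
assert g.index(max(g)) == post.index(max(post))
```

Dropping the common factor $p(\mathbf{x})$ is convenient in practice because the priors and class-conditional densities are usually what a model estimates directly.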
