Thesis - Instituto de Telecomunicações

CHAPTER 5. FEATURE SELECTION AND CLASSIFICATION

Algorithm 3: Uncertainty Based Classifier Fusion.

Input: observations $(x, y)$; claim: $(x, y) \in w_i$; training data for bootstrap.

1. Let $g_i(x) = p_{c1}(w_i|x)$, and let $\hat{g}_i(x)$, $\mathrm{std}(\hat{g}_i(x))$ denote the mean and standard deviation of $\hat{g}_i(x)$ at point $x$, estimated by bootstrapping.
2. Let $g_i(y) = p_{c1}(w_i|y)$, and let $\hat{g}_i(y)$, $\mathrm{std}(\hat{g}_i(y))$ denote the mean and standard deviation of $\hat{g}_i(y)$ at point $y$, estimated by bootstrapping.
3. Define the modified discriminant function $g_i^u(z) = k\,[\hat{g}_i(z) + w \cdot \mathrm{std}(\hat{g}_i(z))]$, $z = x, y$, with
   $w$: a weighting parameter;
   $k$: a normalizing constant, such that $\sum_{i=1}^{n_c} g_i^u(z) = 1$.
4. Fusion decision rule:
   $\mathrm{Accept}(x, y \in w_i) = \begin{cases} \text{true} & \text{if } g_i^u(x)\, g_i^u(y)\, \frac{1}{p(w_i)} > \lambda \\ \text{false} & \text{otherwise} \end{cases}$
   where $\lambda$ is tuned for a particular value of FRR and FAR.

This reject option was introduced by Chow [40], and is extended in [53] by assuming that more classes can exist than those present in the training phase, or that some of the classes did not have enough information in the training phase. Another extension of this work introduced the concepts of ambiguity rejection (the same as defined by Chow) and distance rejection [167], creating a metric to detect whether a sample lies too far from the classes' sample space to be classified. In [8] the reject option was studied for the K-Nearest-Neighbor (K-NN) classifier. An interesting approach was followed in [95], where instead of rejecting or accepting a sample, the concept of group classification is introduced: for a particular $\lambda$ there is a set of classes to which the sample can be assigned with error probability lower than $1 - \lambda$.

The uncertainty term has been used in classification tasks mainly through the Dempster-Shafer theory of evidence [212].
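The fusion rule of Algorithm 3 above can be sketched in code. The following Python fragment is a minimal illustration, not the thesis implementation: a toy Gaussian class-likelihood score stands in for the posterior estimate $p_{c1}(w_i|z)$, and the function names (`bootstrap_posterior`, `fused_decision`) are hypothetical. Only the bootstrap mean/std estimation, the modified discriminant $g_i^u$, and the decision rule follow the algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_posterior(train_scores, z, n_boot=200):
    """Bootstrap estimate of the mean and std of a class discriminant at z.

    A toy Gaussian likelihood score stands in for the posterior
    p_c1(w_i | z) used in the thesis (assumption for illustration)."""
    n = len(train_scores)
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(train_scores, size=n, replace=True)
        mu, sigma = sample.mean(), sample.std() + 1e-9
        estimates[b] = np.exp(-0.5 * ((z - mu) / sigma) ** 2)
    return estimates.mean(), estimates.std()

def fused_decision(train_x, train_y, x, y, i, prior, lam, w=1.0):
    """Uncertainty-based fusion rule of Algorithm 3 (sketch).

    train_x[j], train_y[j] hold the training scores of class j for each
    of the two observations; returns the Accept(x, y in w_i) decision."""
    def g_u(train, z):
        raw = np.array([bootstrap_posterior(t, z) for t in train])
        g = raw[:, 0] + w * raw[:, 1]      # ghat_i(z) + w * std(ghat_i(z))
        return g / (g.sum() + 1e-12)       # k normalizes: sum_i g_i^u(z) = 1
    gx, gy = g_u(train_x, x), g_u(train_y, y)
    # Accept iff g_i^u(x) * g_i^u(y) * (1 / p(w_i)) > lambda
    return gx[i] * gy[i] / prior[i] > lam
```

Setting $w = 0$ recovers the plain bootstrap-mean discriminant; larger $w$ gives the uncertainty of the estimate more weight in the fused score before normalization.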
In [5] the Dempster-Shafer theory is used to combine evidence from several classifiers, using a belief metric computed for each classifier. The term confidence measures [185] is also related to uncertainty computation, as used in Bayesian Networks. None of these approaches focuses on the study of the estimated error-probability uncertainty for a particular classifier, as proposed here.

5.5 Conclusions

We have presented the set of classification techniques that support our approach for biometric classification. They have been studied, proposed and developed for the complex problem
