The classification error of the sequential classifier, using a sequence of length N, is expressed as:

p_e(N) = \frac{1}{2}\left(1 + \operatorname{erf}\left(\frac{-d\sqrt{2N}}{4\sigma}\right)\right)    (5.9)

In figure 5.7 we depict three error evolution cases, where the distance d = |µ_1 − µ_2| is always equal to 2 and σ varies (a numerical sketch of these cases is given at the end of this section):

1. σ = 1 — This is the case depicted in figure 5.5, with a considerable overlap and an initial error of ≈ 15%; the error rapidly decreases with the sequential samples.

2. σ = 2 — In this case the overlap of the two distributions is large, producing an error higher than 30% when using a single sample, which is reduced to 5% with 10 sequential samples.

3. σ = 3 — The overlap creates a situation where the two classes are almost equal and the one-sample classifier makes an almost random decision (error near 40%). In this case the sequential classifier would need many samples to provide adequate performance.

Figure 5.7: Classification error evolution of the example binary sequential classifier for three standard deviation values. The distance between means is d = 2.

To create the sequential classifier we consider each sample at a time, and assume statistical independence between features, therefore
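To make the three cases above concrete, the short sketch below evaluates the reconstructed form of equation (5.9) for d = 2 and σ ∈ {1, 2, 3}; it reproduces the quoted error levels (≈ 15% for σ = 1, above 30% dropping to ≈ 5% for σ = 2, and near 40% for σ = 3). The function name sequential_error and the choice of sample counts are illustrative, not taken from the thesis.

```python
# Minimal numerical sketch of the reconstructed equation (5.9):
#   p_e(N) = 0.5 * (1 + erf(-d * sqrt(2N) / (4*sigma)))
# Names below (sequential_error, d, sigma, N) are illustrative assumptions.
from math import erf, sqrt


def sequential_error(N: int, d: float = 2.0, sigma: float = 1.0) -> float:
    """Error probability of the binary sequential classifier after N samples."""
    return 0.5 * (1.0 + erf(-d * sqrt(2.0 * N) / (4.0 * sigma)))


if __name__ == "__main__":
    # Evaluate the three cases discussed in the text (d = 2, sigma = 1, 2, 3).
    for sigma in (1.0, 2.0, 3.0):
        line = ", ".join(
            f"N={N}: {sequential_error(N, d=2.0, sigma=sigma):.1%}"
            for N in (1, 5, 10)
        )
        print(f"sigma={sigma:.0f} -> {line}")
```

Running the sketch shows the same trend as figure 5.7: the larger the overlap (larger σ for fixed d), the more sequential samples are needed before the error falls to an acceptable level.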
