
In order to acquire statistical confidence in the obtained results, permutation testing was applied to the standalone and integrated datasets of case study 2 using RBF SVMs and PLS-DA. According to Figure 5-9, for all experimental data under study, whether linear or nonlinear classifiers are employed, the initial non-permuted overall accuracies lie well above the 95% confidence values; more specifically, the percentages of correctly classified samples exceed even the 99% confidence intervals. In all instances, the p-values are equal to 0.01, the smallest value attainable with the 100 permutations performed.

Similar to the permutation results of case study 1, RBF SVMs and PLS-DA show marked differences in their permutation distributions. Based on Figure 5-10 and Figure 5-11, PLS-DA covers wider ranges (larger spread) and greater variability than SVMs; once more, the lowest and highest permuted values are recorded for PLS-DA. By contrast, the SVM distributions demonstrate a smaller spread and hence greater consistency in the results. In addition, based on the entries of Table 9 and Table 10, the permutations of the nonlinear SVMs present a higher mean and median (higher centre) than PLS-DA. We can therefore conclude that the RBF SVMs constitute more powerful classifiers than the PLS-DA models, since they consistently generate higher classification accuracies. Currently, permutation testing (100 independent permutation tests) for a single dataset is completed within a few hours, as illustrated in Figure 5-12.
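For illustration, the sketch below outlines the kind of label-permutation test described above, written against a generic scikit-learn workflow. The data placeholders, cross-validation settings and function names are assumptions for demonstration only and do not represent the implementation or the case-study 2 data used in this thesis.

```python
# Minimal sketch of a permutation test for a classifier's overall accuracy,
# assuming hypothetical feature matrix X and class labels y.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def permutation_test(X, y, n_permutations=100, cv=5):
    clf = SVC(kernel="rbf")  # nonlinear RBF SVM classifier
    # Non-permuted (observed) cross-validated overall accuracy
    observed = cross_val_score(clf, X, y, cv=cv).mean()

    permuted = np.empty(n_permutations)
    for i in range(n_permutations):
        y_perm = rng.permutation(y)                     # destroy class structure
        permuted[i] = cross_val_score(clf, X, y_perm, cv=cv).mean()

    # Empirical p-value; with 100 permutations the smallest attainable
    # value is approximately 0.01
    p_value = (np.sum(permuted >= observed) + 1) / (n_permutations + 1)
    return observed, permuted, p_value

# Example usage with random data (illustration only):
# X = rng.normal(size=(60, 200)); y = np.repeat([0, 1], 30)
# acc, null_dist, p = permutation_test(X, y)
# print(f"observed accuracy = {acc:.3f}, p = {p:.3f}")
```

The spread (e.g. `null_dist.std()`) and centre (`null_dist.mean()`, `np.median(null_dist)`) of the permuted-accuracy distribution can then be compared across classifiers, in the same spirit as the summaries reported in Table 9 and Table 10.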

