
[Figure: seven notched boxplots of φ(NMI), one per consensus function (CSPA, EAC, HGPA, MCLA, ALSAD, KMSAD, SLSAD); each panel compares the cluster ensemble E with the RHCA, DHCA and flat consensus architectures on a φ(NMI) axis ranging from 0 to 1.]

Figure 3.21: φ(NMI) of the consensus solutions yielded by the computationally optimal RHCA, DHCA and flat consensus architectures on the Zoo data collection for the diversity scenario corresponding to a cluster ensemble of size l = 57.

The largest inter-consensus architecture deviations are found when consensus clustering is based on the HGPA consensus function, as the notches of the corresponding φ(NMI) boxes are far from overlapping. If the consensus functions are compared in terms of the quality of the clustering solutions they yield, the best results are obtained by the EAC, ALSAD, KMSAD and SLSAD consensus functions. In these cases, the medians of the consensus solutions output by the consensus architectures are better than 75% of the components of the cluster ensemble E, which denotes a notable degree of robustness to clustering indeterminacies.
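For reference, φ(NMI) can be computed between a consensus labeling and the ground-truth classes with scikit-learn's normalized_mutual_info_score. The following is a minimal sketch under that assumption; the variable names and toy label vectors are illustrative, not taken from the thesis.

```python
from sklearn.metrics import normalized_mutual_info_score

# Hypothetical labelings: a consensus partition and the ground-truth
# classes of the data set (both are illustrative toy vectors).
consensus_labels = [0, 0, 1, 1, 2, 2, 2]
true_labels = [0, 0, 1, 1, 1, 2, 2]

# phi(NMI) lies in [0, 1]: 1 means the two partitions coincide up to
# cluster relabeling, 0 means they share no mutual information.
phi_nmi = normalized_mutual_info_score(true_labels, consensus_labels)
print(f"phi(NMI) = {phi_nmi:.3f}")
```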

Diversity scenario |dfA| = 10

Secondly, figure 3.22 presents the quality of the consensus clustering solutions output by the consensus architectures in the experiments conducted on the diversity scenario corresponding to cluster ensembles of size l = 570. The trends detected in the previous diversity scenario are largely confirmed in this case. That is, the consensus functions based on evidence accumulation (EAC) and on object similarity as data (ALSAD, KMSAD and SLSAD) yield the best quality consensus clustering solutions, and they show a high degree of independence with respect to the topology of the consensus architecture. In fact, EAC and SLSAD based consensus architectures give rise to consensus clusterings that are better than 80% of the cluster ensemble components, which again reveals the ability of these consensus functions to attain clustering solutions robust to the uncertainties inherent to clustering. In contrast, the consensus labelings obtained by the hypergraph-based consensus functions (CSPA, HGPA and MCLA) attain lower φ(NMI) values, while showing a larger quality variability (the latter applying only to the HGPA and MCLA consensus functions).
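To make the evidence accumulation idea concrete: EAC counts, for every pair of objects, the fraction of ensemble components that place them in the same cluster, and then extracts the consensus partition by clustering the resulting co-association matrix, classically with single-link hierarchical clustering. Below is a minimal sketch assuming NumPy/SciPy; the function name and the toy ensemble are hypothetical, and the thesis's own implementation may differ in its details.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def eac_consensus(ensemble, k):
    """Evidence accumulation consensus (EAC) sketch.

    ensemble: list of integer label arrays, one per ensemble component.
    k: number of clusters in the consensus partition.
    """
    n = len(ensemble[0])
    coassoc = np.zeros((n, n))
    for labels in ensemble:
        labels = np.asarray(labels)
        # Count the pairs of objects co-clustered by this component.
        coassoc += (labels[:, None] == labels[None, :]).astype(float)
    coassoc /= len(ensemble)

    # Co-association is a similarity; convert it into a distance and
    # apply single-link hierarchical clustering, as in classic EAC.
    dist = 1.0 - coassoc
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="single")
    return fcluster(Z, t=k, criterion="maxclust")

# Toy usage: three labelings of five objects, consensus with k = 2.
ensemble = [
    np.array([0, 0, 1, 1, 1]),
    np.array([0, 0, 0, 1, 1]),
    np.array([1, 1, 0, 0, 0]),
]
print(eac_consensus(ensemble, k=2))
```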

Diversity scenario |dfA| = 19

The results corresponding to the experiments conducted in the third diversity scenario (i.e. cluster ensembles generated by compiling the clusterings output by |dfA| = 19 randomly selected clustering algorithms, giving rise to cluster ensembles of size l = 1083) are presented in figure 3.23. The behaviour detected in the previous diversity scenarios is also found in this case. Again, the largest inter-consensus architecture variations are

