Estimators based in adaptively trimming cells in the mixture model

Theorem 2.10.20 in [21] leads to this statement, because this class of functions is constituted by sums of functions verifying the uniform entropy condition.
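For reference, the uniform entropy condition appealed to here is, in its standard formulation (the precise statement used in [21] should be consulted there), the requirement that a class $\mathcal{F}$ with measurable envelope $F$ satisfy:

```latex
% Uniform entropy condition for a class \mathcal{F} with measurable envelope F:
\int_0^\infty \sup_{Q}
  \sqrt{\log N\!\left(\varepsilon \|F\|_{Q,2},\, \mathcal{F},\, L_2(Q)\right)}
  \, d\varepsilon < \infty
% where N(\cdot, \mathcal{F}, L_2(Q)) denotes the covering number and the
% supremum is taken over all finitely discrete probability measures Q.
```

Classes verifying this condition (together with suitable measurability requirements) are Donsker, which is the property being transferred to the sums above.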

- The class of the functions
$$\left\{\, I_{A_\gamma}(x)\,\frac{\partial}{\partial\theta}\log\big(f_\theta(x)\big) \;+\; I_{A_\gamma^{c}}(x)\,\frac{\partial}{\partial\theta}\log\big(\mathrm{IP}_\theta(A_\gamma^{c})\big) \;:\; \theta \in \Theta_\delta,\ \gamma \in K \,\right\}$$
is a Donsker class.

This statement can be proved by a chain of arguments similar to the above, beginning with the fact that the class of functions $\{\, I_{B_0}(x)\,\frac{\partial}{\partial\theta}\log(f_\theta(x)) : \theta \in \Theta_\delta \,\}$ is a Donsker class of functions. But this follows from the fact that the components of these functions are products of $\mathrm{IP}_\theta(i/x)\,I_{B_0}$ with functions of the types
$$\frac{1}{\pi_i}, \qquad \Sigma_i^{-1}(x-\mu_i) \qquad \text{and} \qquad -\frac{1}{2}\,\Sigma_i^{-1} \;+\; \frac{1}{2}\,\Sigma_i^{-1}(x-\mu_i)(x-\mu_i)'\,\Sigma_i^{-1},$$
these being the factors that arise when $\log f_\theta$ is differentiated with respect to $\pi_i$, $\mu_i$ and $\Sigma_i$, respectively. $\bullet$

References

[1] Banfield, J. D. and Raftery, A. E. (1993). Model-based Gaussian and non-Gaussian clustering. Biometrics 49: 803-821.

[2] Campbell, N. A. and Mahon, R. J. (1974). A multivariate study of variation in two species of rock crab of genus Leptograpsus. Australian J. Zoology 22: 417-425.

[3] Cuesta-Albertos, J. A.; Gordaliza, A. and Matrán, C. (1997). Trimmed k-means: An attempt to robustify quantizers. Ann. Statist. 25: 553-576.

[4] Cuesta-Albertos, J. A.; Matrán, C. and Mayo-Iscar, A. (2006). Trimming and likelihood: Robust location and dispersion estimation in the multivariate model. Submitted.

[5] Fraley, C. and Raftery, A. E. (1998). How many clusters? Which clustering method? Answers via model-based cluster analysis. The Computer J. 41: 578-588.

[6] Gallegos, M. T. (2003). Robust clustering under general normal assumptions. Available at: www.fmi.uni-passau.de/forschung/mip-berichte/MIP-0103.ps.

[7] García-Escudero, L. A. and Gordaliza, A. (1999). Robustness properties of k-means and trimmed k-means. J. Amer. Statist. Assoc. 94: 956-969.

[8] García-Escudero, L. A. and Gordaliza, A. (2006). The importance of the scales in heterogeneous robust clustering. To appear in Comput. Statist. Data Anal.

[9] Hampel, F. (2002). Some thoughts about classification. In Classification, Clustering and Data Analysis, eds. Jajuga, K., Sokolowski, A. and Bock, H. H. Springer, New York.

[10] Hardin, J. and Rocke, D. M. (2004). Outlier detection in the multiple cluster setting using the minimum covariance determinant estimator. Comput. Statist. Data Anal. 44: 625-638.

[11] Hathaway, R. J. (1985). A constrained formulation of maximum likelihood estimation for normal mixture distributions. Ann. Statist. 13: 795-800.

[12] Hennig, C. (2004). Breakdown point for maximum likelihood estimators of location-scale mixtures. Ann. Statist. 32: 1313-1340.

