P.J. Bickel

make. This was first perceived in statistics by Brad Efron, who introduced Monte Carlo in the service of inference by inventing the bootstrap. David Freedman and I produced some of the first papers validating the use of the bootstrap in a general context.

6.4.1 Semiparametric models

In the 1980s new types of data, arising mainly from complex clinical trials, but also astronomy and econometrics, began to appear. These were called semiparametric because they needed both finite- and infinite-dimensional parameters for adequate description.

Semiparametric models had been around for some time in the form of classical location and regression models as well as in survival analysis and quality control, survey sampling, economics, and, to some extent, astronomy. There had been theoretical treatments of various aspects by Ibragimov and Khasminskii, Pfanzagl, and Lucien LeCam at a high level of generality. The key idea for their analysis was due to Charles Stein. Chris Klaassen, Ya’acov Ritov, Jon Wellner and I were able to present a unified viewpoint on these models, make a key connection to robustness, and develop methods both for semiparametric performance lower bounds and actual estimation. Our work, of which I was and am still proud, was published in book form in 1993. Much development has gone on since then through the efforts of some of my coauthors and others such as Aad van der Vaart and Jamie Robins.
I worked with Ritov on various aspects of semiparametrics throughout the years, and mention some of that work below.

6.4.2 Nonparametric estimation of functions

In order to achieve the semiparametric lower bounds that we derived, it became clear that restrictions had to be placed on the class of infinite-dimensional “nuisance parameters.” In fact, Ritov and I were able to show in a particular situation, the estimation of the integral of the square of a density, that even though the formal lower bounds could be calculated for all densities, efficient estimation of this parameter was possible if and only if the density obeyed a Lipschitz condition of order larger than 1/4, and √n estimation was possible if and only if the condition had an exponent greater than or equal to 1/4.

In the mid-’80s and ’90s David Donoho, Iain Johnstone and their collaborators introduced wavelets to function estimation in statistics; see, e.g., Donoho et al. (1996). With this motivation they then exploited the Gaussian white noise model. This is a generalization of the canonical Gaussian linear model, introduced by Ibragimov and Khasminskii, in which one can quantitatively study minimax analysis of estimation in complex function spaces whose definition qualitatively mimics properties of functions encountered in the real world. Their analysis led rather naturally to regularization by thresholding, a technique which had appeared in some work of mine on procedures
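The thresholding idea just mentioned can be sketched in a few lines: in the Gaussian white noise model one observes noisy coefficients and shrinks each toward zero, killing the small ones. This is a minimal illustration, not the authors' procedure; the sparse signal, noise level, and the universal threshold σ√(2 log n) are standard textbook choices assumed here for concreteness.

```python
import numpy as np

def soft_threshold(y, lam):
    """Shrink each coefficient toward zero by lam, setting small ones to 0."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

rng = np.random.default_rng(0)
n = 1024

# Sparse "true" coefficient sequence: a few large coefficients, rest zero.
theta = np.zeros(n)
theta[:20] = 5.0

# Gaussian white noise model: observe y_i = theta_i + sigma * z_i.
sigma = 1.0
y = theta + sigma * rng.standard_normal(n)

# Universal threshold sigma * sqrt(2 log n), then soft-threshold.
lam = sigma * np.sqrt(2.0 * np.log(n))
est = soft_threshold(y, lam)

# For sparse signals, thresholding improves on the raw observations
# in mean squared error, since most noise coefficients are zeroed out.
mse_raw = np.mean((y - theta) ** 2)
mse_thr = np.mean((est - theta) ** 2)
```

The shrink-and-kill behavior of soft thresholding is what makes the estimate adapt to sparsity: coefficients below the threshold (almost all pure noise here) are set exactly to zero rather than merely reduced.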
