…a nonparametric function estimator without also estimating an appropriate smoothing parameter, for example a bandwidth, from the data. But in the 1970s, and indeed for part of the 80s, that was challenging to do without using a mainframe computer in another building and waiting until the next day to see the results. So theory played a critical role.

For example, Mike Woodroofe's paper (Woodroofe, 1970) on asymptotic properties of an early plug-in rule for bandwidth choice was seminal, and was representative of related theoretical contributions over the next decade or so. Methods for smoothing parameter choice for density estimation, using cross-validation and suggested by Habbema et al. (1974) in a Kullback–Leibler setting, and by Rudemo (1982) and Bowman (1984) for least squares, were challenging to implement numerically at the time they were introduced, especially in Monte Carlo analysis. However, they were explored enthusiastically and in detail using theoretical arguments; see, e.g., Hall (1983, 1987) and Stone (1984).

Indeed, when applied to a sample of size n, cross-validation requires O(n²) computations, and even for moderate sample sizes that could be difficult in a simulation study. We avoided using the Gaussian kernel because of the sheer computational labour required to compute an exponential. Kernels based on truncated polynomials, for example the Bartlett–Epanechnikov kernel and the biweight, were therefore popular.

In important respects the development of bootstrap methods was no different. For example, the double bootstrap was out of reach, computationally, for most of us when it was first discussed (Hall, 1986; Beran, 1987, 1988). Hall (1986, p. 1439) remarked of the iterated bootstrap that "it could not be regarded as a general practical tool." Likewise, the computational challenges posed by even single bootstrap methods motivated a variety of techniques that aimed to provide greater efficiency to the operation of sampling from a sample, and appeared in print from the mid 1980s until at least the early 1990s. However, efficient methods for bootstrap simulation are seldom used today, so plentiful is the computing power that we have at our disposal.

Thus, for the bootstrap, as for problems in function estimation, theory played a role that computation really couldn't. Asymptotic arguments pointed authoritatively to the advantages of some bootstrap techniques, and to the drawbacks associated with others, at a time when reliable numerical corroboration was hard to come by. The literature of the day contains muted versions of some of the exciting discussions that took place in the mid to late 1980s on this topic. It was an extraordinary time; I feel so fortunate to have been working on these problems.

I should make the perhaps obvious remark that, even if it had been possible to address these issues in 1985 using today's computing resources, theory still would have provided a substantial and unique degree of authority to the development of nonparametric methods. In one sweep it enabled us to address issues in depth in an extraordinarily broad range of settings. It allowed us to diagnose and profoundly understand many complex problems, such as the high…
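To make the cost concrete, here is a minimal sketch (my illustration, not part of the original essay) of the least-squares cross-validation criterion of Rudemo (1982) and Bowman (1984) for a kernel density estimator, using the Bartlett–Epanechnikov kernel. The n × n difference matrix in the leave-one-out term is the O(n²) computation the text refers to; the grid size, sample size, and bandwidth grid are arbitrary choices made for the example.

```python
import numpy as np

def epanechnikov(u):
    """Bartlett-Epanechnikov kernel: a cheap truncated polynomial, no exponential."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def kde(x, data, h):
    """Kernel density estimate at the points x, bandwidth h."""
    u = (x[:, None] - data[None, :]) / h
    return epanechnikov(u).sum(axis=1) / (len(data) * h)

def lscv(data, h, grid_size=512):
    """Least-squares cross-validation criterion:
    integral of fhat_h^2 (approximated on a grid) minus twice the average
    leave-one-out density at the observations (the O(n^2) part)."""
    n = len(data)
    grid = np.linspace(data.min() - h, data.max() + h, grid_size)
    fhat = kde(grid, data, h)
    int_f2 = np.sum(fhat**2) * (grid[1] - grid[0])
    u = (data[:, None] - data[None, :]) / h   # n x n matrix: the O(n^2) cost
    k = epanechnikov(u)
    np.fill_diagonal(k, 0.0)                  # drop the i == j terms
    loo = k.sum(axis=1) / ((n - 1) * h)       # leave-one-out estimates at each X_i
    return int_f2 - 2.0 * loo.mean()

rng = np.random.default_rng(0)
x = rng.normal(size=200)
hs = np.linspace(0.1, 1.5, 40)
h_star = hs[np.argmin([lscv(x, h) for h in hs])]  # minimise the CV criterion
print(f"cross-validation bandwidth: {h_star:.3f}")
```

On 1980s hardware the double loop hidden in that matrix, repeated for every candidate bandwidth and every Monte Carlo replication, was precisely the bottleneck; the truncated-polynomial kernel at least avoided evaluating an exponential at each of the n² differences.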

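A similar back-of-the-envelope sketch shows why the double bootstrap was out of reach in 1986. The fragment below is again my illustration: it implements one standard form of iterated-bootstrap calibration, adjusting the nominal level of a percentile interval for a mean so that its bootstrap-estimated coverage matches a target. All sample sizes and replication counts are arbitrary; the point is that the nested loop draws n_outer × n_inner resamples where a single bootstrap draws only n_outer.

```python
import numpy as np

rng = np.random.default_rng(1)

def percentile_ci(sample, level, n_boot):
    """Single-bootstrap percentile interval for the mean."""
    means = np.array([rng.choice(sample, size=len(sample)).mean()
                      for _ in range(n_boot)])
    a = (1.0 - level) / 2.0
    return np.quantile(means, [a, 1.0 - a])

def calibrated_level(sample, target=0.90, n_outer=200, n_inner=200):
    """Double (iterated) bootstrap calibration: treat the sample as the
    population, draw n_outer outer resamples, and for each estimate the
    coverage of percentile intervals at a grid of nominal levels using
    n_inner inner resamples.  Return the nominal level whose estimated
    coverage is closest to the target."""
    theta = sample.mean()              # the 'true' value in the bootstrap world
    levels = np.linspace(0.80, 0.99, 20)
    hits = np.zeros_like(levels)
    for _ in range(n_outer):
        outer = rng.choice(sample, size=len(sample))
        inner_means = np.array([rng.choice(outer, size=len(outer)).mean()
                                for _ in range(n_inner)])
        for j, lv in enumerate(levels):
            a = (1.0 - lv) / 2.0
            lo, hi = np.quantile(inner_means, [a, 1.0 - a])
            hits[j] += (lo <= theta <= hi)
    coverage = hits / n_outer          # estimated true coverage per nominal level
    return levels[np.argmin(np.abs(coverage - target))]

x = rng.exponential(size=50)
lv = calibrated_level(x)
print(f"nominal level calibrated to 90% coverage: {lv:.2f}")
print("calibrated interval:", percentile_ci(x, lv, 2000))
```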