Nonparametric Bayes

Prior (25.1) is intuitively appealing in including infinitely many Gaussian kernels having stochastically decreasing weights. In practice, there will tend to be a small number of kernels having large weights, with the remainder having vanishingly small weights. Only a modest number of kernels will be occupied by the subjects in a sample, so the effective number of parameters may actually be quite small and is certainly not infinite, making posterior computation and inference tractable. Of course, prior (25.1) is only one particularly simple example (to quote Andrew Gelman, "No one is really interested in density estimation"), and there has been an explosion of literature in recent years proposing an amazing variety of NP Bayes models for broad applications and data structures.

Section 25.2 contains an (absurdly) incomplete timeline of the history of NP Bayes up through the present. Section 25.3 comments briefly on interesting future directions.

25.2 A brief history of NP Bayes

Although there were many important earlier developments, the modern view of nonparametric Bayes statistics was essentially introduced in the papers of Ferguson (1973, 1974), which proposed the Dirichlet process (DP) along with several ideal criteria for a nonparametric Bayes approach, including large support, interpretability and computational tractability. The DP provides a prior for a discrete probability measure with infinitely many atoms, and is broadly employed within Bayesian models as a prior for mixing distributions and for clustering. An equally popular prior is the Gaussian process (GP), which is instead used for random functions or surfaces. A non-negligible proportion of the nonparametric Bayes literature continues to focus on theoretical properties, computational algorithms and applications of DPs and GPs in various contexts.

In the 1970s and 1980s, NP Bayes research was primarily theoretical and conducted by a narrow community, with applications focused mainly on jointly conjugate priors, such as simple cases of the gamma process, DP and GP. Most research did not consider applications or data analysis at all, but instead delved into characterizations and probabilistic properties of stochastic processes that could be employed as priors in NP Bayes models. These developments later had substantial applied implications in facilitating computation and the development of richer model classes.

With the rise in computing power, the development of Gibbs sampling, and the explosion in use of Markov chain Monte Carlo (MCMC) algorithms in the early 1990s, nonparametric Bayes methods started to become computationally tractable. By the late 1990s and early 2000s, there was a rich variety of inferential algorithms available for general DP mixtures and GP-based models.
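Prior (25.1) itself is not reproduced in this excerpt, but the behaviour described in the opening paragraph, infinitely many Gaussian kernels with stochastically decreasing weights of which only a few are occupied by a finite sample, can be illustrated with a small simulation. The sketch below assumes the standard stick-breaking (Sethuraman) construction of a DP mixture of Gaussians, truncated at 50 atoms purely for illustration; the concentration parameter alpha = 1, the N(0, 4) base measure and the unit-variance kernels are assumed values, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, n_atoms, rng):
    """Truncated stick-breaking weights w_h = v_h * prod_{l<h} (1 - v_l),
    with v_h ~ Beta(1, alpha); the weights decrease stochastically in h."""
    v = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

alpha = 1.0     # DP concentration parameter (assumed value)
n_atoms = 50    # truncation level, standing in for the infinite sum
w = stick_breaking_weights(alpha, n_atoms, rng)

# Kernel locations drawn from an assumed N(0, 4) base measure.
theta = rng.normal(0.0, 2.0, size=n_atoms)

# Each of n subjects picks kernel h with probability w_h, then draws from N(theta_h, 1).
n = 200
labels = rng.choice(n_atoms, size=n, p=w / w.sum())
y = rng.normal(theta[labels], 1.0)

# Although 50 kernels are available, only a handful carry appreciable weight
# and are actually occupied, so the effective number of parameters is small.
print("weights of first 5 atoms:", np.round(w[:5], 3))
print("number of occupied kernels:", len(np.unique(labels)))
```

Running the sketch typically shows the first few atoms absorbing most of the weight and only a modest number of distinct kernels used by the 200 simulated subjects, which is the sense in which posterior computation for such a prior remains tractable despite the nominally infinite parameterization.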
