Book contents
- Frontmatter
- Contents
- List of contributors
- An invitation to Bayesian nonparametrics
- 1 Bayesian nonparametric methods: motivation and ideas
- 2 The Dirichlet process, related priors and posterior asymptotics
- 3 Models beyond the Dirichlet process
- 4 Further models and applications
- 5 Hierarchical Bayesian nonparametric models with applications
- 6 Computational issues arising in Bayesian nonparametric hierarchical models
- 7 Nonparametric Bayes applications to biostatistics
- 8 More nonparametric Bayesian models for biostatistics
- Author index
- Subject index
1 - Bayesian nonparametric methods: motivation and ideas
Published online by Cambridge University Press: 06 January 2011
Summary
It is now possible to demonstrate many applications of Bayesian nonparametric methods. They work. It is clear, however, that nonparametric methods are more complicated to understand, use, and draw conclusions from than their parametric counterparts. For this reason it is imperative to provide specific and comprehensive motivation for using nonparametric methods. This chapter aims to do so; the discussion in this part is restricted to the case of independent and identically distributed (i.i.d.) observations. Although this type of observation is quite specific, the arguments and ideas laid out in this chapter can be extended to cover more complicated types of observation. The advantage of discussing i.i.d. observations is that the mathematics is simpler.
Introduction
Even though there is no physical connection between observations, there is a real and obvious reason for creating a dependence between them from a modeling perspective: the first observation, say X1, provides information about the unknown density f from which it came, which in turn provides information about the second observation X2, and so on. How a Bayesian learns is her choice, but it is clear that with i.i.d. observations the order of learning should not matter, and hence we enter the realm of exchangeable learning models. The mathematics is by now well known (de Finetti, 1937; Hewitt and Savage, 1955) and involves the construction of a prior distribution Π(d f) on a suitable space of density functions.
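The exchangeable representation referred to here can be sketched explicitly. In the notation of this chapter, with f the unknown density and Π(df) the prior on a suitable space of densities, de Finetti's theorem gives the joint law of the observations as a mixture of i.i.d. models:

```latex
p(x_1,\dots,x_n) \;=\; \int \prod_{i=1}^{n} f(x_i)\, \Pi(\mathrm{d}f),
```

so that, by Bayes's theorem, learning proceeds through the posterior

```latex
\Pi(\mathrm{d}f \mid x_1,\dots,x_n) \;\propto\; \prod_{i=1}^{n} f(x_i)\, \Pi(\mathrm{d}f),
```

which is invariant under permutations of x_1, …, x_n, making precise the claim that the order of learning does not matter.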
Bayesian Nonparametrics, pp. 22–34. Publisher: Cambridge University Press. Print publication year: 2010.