10 - Predictive inference
Published online by Cambridge University Press: 06 July 2010
Summary
The focus of our discussion so far has been inference for the unknown parameter of the probability distribution assumed to have generated the sample data. Sometimes, interest lies instead in predicting future, unobserved values from the same probability distribution, typically the next observation. We saw in Section 3.9 that in a Bayesian approach such prediction is easily accommodated, since there all unknowns are regarded as random variables, so that the distinction between an unknown constant (a parameter) and a future observation (a random variable) disappears. However, a variety of other approaches to prediction have been proposed.
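As a concrete illustration of the Bayesian route, here is a minimal sketch for a Bernoulli model with a conjugate Beta prior (the model, prior, and function name are illustrative choices, not taken from the chapter): the unknown parameter is integrated out of the model, and what remains is a predictive probability for the next observation.

```python
def beta_binomial_predictive(successes, n, a=1.0, b=1.0):
    """Posterior predictive P(Z = 1 | x) for Bernoulli data with a
    Beta(a, b) prior on theta.  The posterior is
    Beta(a + successes, b + n - successes), and integrating theta out
    gives the posterior mean as the predictive success probability,
    so parameter and future observation are handled uniformly as
    random quantities."""
    return (a + successes) / (a + b + n)

# e.g. 7 successes in 10 trials under a uniform (Beta(1, 1)) prior
p_next = beta_binomial_predictive(7, 10)  # (1 + 7) / (2 + 10) = 2/3
```

The point of the sketch is only that the prediction is a single distribution for Z, with no separate step for estimating theta.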
The prediction problem is as follows. The data x are the observed value of a random variable X with density f(x; θ), and we wish to predict the value of a random variable Z, which, conditionally on X = x, has distribution function G(z | x; θ), depending on θ.
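One naive way to act on this setup is the plug-in (estimative) approach: estimate θ from x and evaluate G(z | x; θ̂) as if the estimate were exact. The following sketch assumes, purely for illustration, that X1, …, Xn are iid N(θ, 1) and Z is a new draw from the same distribution, so that the plug-in predictive distribution function is Φ(z − x̄):

```python
import math
import statistics

def plugin_predictive_cdf(z, x):
    """Plug-in predictive distribution function G(z | x; theta-hat) for
    a new observation Z, assuming X1,...,Xn iid N(theta, 1).  The MLE
    x-bar is substituted for theta, which ignores the uncertainty in
    the estimate -- the defect that more careful predictive methods
    try to correct."""
    theta_hat = statistics.fmean(x)  # MLE of theta under the normal model
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf((z - theta_hat) / math.sqrt(2.0)))

x = [1.2, 0.7, 1.9, 1.1]
p = plugin_predictive_cdf(2.0, x)  # fitted-model value of P(Z <= 2)
```

Treating θ̂ as known typically makes such predictive statements overconfident, which is one motivation for the alternative approaches the chapter goes on to compare.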
As a simple case, we might have X formed from independent and identically distributed random variables X1, …, Xn, with Z a further, independent observation from the same distribution. A more complicated example is that of time series prediction, where the observations are correlated and prediction of a future value depends directly on the observed values as well as on any unknown parameters that have to be estimated. Example 10.2 is a simple case of time series prediction.
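The time-series case can be sketched with a first-order autoregression (a hypothetical example, not necessarily the model of Example 10.2): the one-step-ahead forecast depends both on the estimated coefficient and, directly, on the last observed value.

```python
def ar1_one_step_forecast(x):
    """One-step-ahead prediction for an AR(1) series
    x_t = rho * x_{t-1} + eps_t (zero-mean noise): estimate rho by
    least squares on the observed consecutive pairs, then predict the
    next value as rho_hat * x_n.  Unlike the iid case, the forecast
    uses the data twice -- through rho_hat and through x_n itself."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    rho_hat = num / den
    return rho_hat * x[-1], rho_hat

series = [1.0, 0.5, 0.25, 0.125]        # exactly geometric, so rho_hat = 0.5
forecast, rho_hat = ar1_one_step_forecast(series)  # forecast = 0.0625
```

This is the sense in which prediction for correlated data "depends directly on the observed value": changing x_n changes the forecast even with the parameter estimate held fixed.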
Essentials of Statistical Inference, pp. 169–189. Publisher: Cambridge University Press. Print publication year: 2005.