Book contents
- Frontmatter
- Contents
- Preface
- 1 Applications and motivations
- 2 Haar spaces and multivariate polynomials
- 3 Local polynomial reproduction
- 4 Moving least squares
- 5 Auxiliary tools from analysis and measure theory
- 6 Positive definite functions
- 7 Completely monotone functions
- 8 Conditionally positive definite functions
- 9 Compactly supported functions
- 10 Native spaces
- 11 Error estimates for radial basis function interpolation
- 12 Stability
- 13 Optimal recovery
- 14 Data structures
- 15 Numerical methods
- 16 Generalized interpolation
- 17 Interpolation on spheres and other manifolds
- References
- Index
13 - Optimal recovery
Published online by Cambridge University Press: 22 February 2010
Summary
So far, we have dealt with the following simple interpolation or approximation problem. A function f, unknown in general, is specified only at certain points X = {x1, …, xN}, and we are interested in recovering f on a region Ω that is well covered by the centers X. In a later chapter we will concentrate on more general problems, but let us stick to this particular one a little longer. Why should we use (conditionally) positive definite kernels for recovering f?
We have already learnt that recovering f is a difficult task and that radial basis functions are a powerful tool for doing this. In particular, they can be used (at least theoretically – we come back to the numerical treatment in a later chapter) with truly scattered data and in every dimension. Moreover, positive definite functions appeared quite naturally in the context of reproducing-kernel Hilbert spaces.
But this is not the end of the story. Interpolants based on (conditionally) positive definite kernels are optimal in several other respects, and the present chapter is devoted to this subject.
Minimal properties of radial basis functions
Let us start with best approximation. We have seen that the native space N_Φ(Ω) corresponding to a (conditionally) positive definite kernel Φ is an adequate function space. The interpolant s_{f,X} is one candidate that uses the given information about f on X, but of course not the only one.
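To make the object s_{f,X} concrete, here is a minimal sketch (not from the book) of how the interpolant is assembled for a positive definite kernel: one solves the linear system A c = f, where A_{jk} = Φ(‖x_j − x_k‖), and then evaluates s_{f,X}(x) = Σ_j c_j Φ(‖x − x_j‖). The Gaussian kernel and the shape parameter `eps` are illustrative choices, not the book's prescription.

```python
import numpy as np

def gaussian_kernel(r, eps=1.0):
    # Gaussian radial basis function Phi(r) = exp(-(eps*r)^2),
    # positive definite on R^d for every d (illustrative choice).
    return np.exp(-(eps * r) ** 2)

def rbf_interpolant(centers, values, eps=1.0):
    """Return s_{f,X} for data (centers, values), centers of shape (N, d)."""
    # Interpolation matrix A_{jk} = Phi(||x_j - x_k||); positive definite
    # for the Gaussian, so the system A c = f is uniquely solvable.
    dists = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    A = gaussian_kernel(dists, eps)
    coeffs = np.linalg.solve(A, values)

    def s(x):
        # Evaluate s_{f,X}(x) = sum_j c_j * Phi(||x - x_j||) at a point x in R^d.
        r = np.linalg.norm(x[None, :] - centers, axis=-1)
        return gaussian_kernel(r, eps) @ coeffs

    return s

# Usage: recover a 1-d function from scattered samples.
centers = np.linspace(0.0, 1.0, 7).reshape(-1, 1)
values = np.sin(2 * np.pi * centers[:, 0])
s = rbf_interpolant(centers, values, eps=5.0)
```

By construction s_{f,X} reproduces the data exactly at the centers; in between, its quality is governed by the error estimates of Chapter 11 and the stability results of Chapter 12.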
Scattered Data Approximation, pp. 223–229. Publisher: Cambridge University Press. Print publication year: 2004.