Book contents
- Frontmatter
- Contents
- Preface
- 1 Applications and motivations
- 2 Haar spaces and multivariate polynomials
- 3 Local polynomial reproduction
- 4 Moving least squares
- 5 Auxiliary tools from analysis and measure theory
- 6 Positive definite functions
- 7 Completely monotone functions
- 8 Conditionally positive definite functions
- 9 Compactly supported functions
- 10 Native spaces
- 11 Error estimates for radial basis function interpolation
- 12 Stability
- 13 Optimal recovery
- 14 Data structures
- 15 Numerical methods
- 16 Generalized interpolation
- 17 Interpolation on spheres and other manifolds
- References
- Index
4 - Moving least squares
Published online by Cambridge University Press: 22 February 2010
Summary
The crucial point in local polynomial reproduction is the compact support of the basis functions u_j. To be more precise, all supports have to be of comparable size. The local support of the u_j means that data sites far away from the current point of interest x have no influence on the function value at x. This is often a reasonable assumption.
The last chapter left open the question of how to construct families with local polynomial reproduction efficiently. The moving least squares method presented in this chapter provides one such construction.
Definition and characterization
Suppose again that discrete values of a function f are given at certain data sites X = {x_1, …, x_N} ⊆ Ω ⊆ ℝ^d. Throughout this chapter Ω is supposed to satisfy an interior cone condition with angle θ and radius r.
The idea of the moving least squares approximation is to solve for every point x a locally weighted least squares problem. This appears to be quite expensive at first sight, but it will turn out to be a very efficient method. Moreover, in many applications one is only interested in a few evaluations. For such applications the moving least squares approximation is even more attractive, because it is not necessary to set up and solve a large system.
The influence of the data points is governed by a weight function w : Ω × Ω → ℝ, which becomes smaller the further away its arguments are from each other.
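The idea described above can be sketched in a few lines: at each evaluation point x one solves a small weighted least squares problem over the polynomials of a fixed degree, using a compactly supported weight so that only nearby data sites contribute. The following one-dimensional sketch is not the book's implementation; the Wendland-type weight, the support radius `delta`, and the names `mls_eval` and `wendland_weight` are illustrative choices.

```python
import numpy as np

def wendland_weight(r, delta):
    """Compactly supported weight (1 - r/delta)_+^4 (4 r/delta + 1).
    An illustrative choice; any weight that decays with distance and
    vanishes beyond the support radius delta would serve."""
    s = np.clip(1.0 - r / delta, 0.0, None)
    return s**4 * (4.0 * r / delta + 1.0)

def mls_eval(x, X, f, delta, degree=1):
    """Moving least squares value at a single point x (1-d sketch).
    Solves the locally weighted least squares fit by polynomials of
    the given degree and evaluates the fitted polynomial at x."""
    r = np.abs(X - x)
    w = wendland_weight(r, delta)
    active = w > 0                      # only nearby sites contribute
    # Polynomial basis 1, t, t^2, ... at the active sites and at x.
    P = np.vander(X[active], degree + 1, increasing=True)
    px = np.vander(np.array([x]), degree + 1, increasing=True)[0]
    W = np.diag(w[active])
    # Normal equations of the weighted least squares problem.
    G = P.T @ W @ P
    b = P.T @ W @ f[active]
    c = np.linalg.solve(G, b)
    return px @ c
```

Because the weighted fit reproduces polynomials of the chosen degree exactly, evaluating the sketch on data sampled from a linear function returns that function's value, which is a quick sanity check for any implementation.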
*Scattered Data Approximation*, pp. 35-45. Publisher: Cambridge University Press. Print publication year: 2004.