Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgements
- PART I GENESIS OF DATA ASSIMILATION
- PART II DATA ASSIMILATION: DETERMINISTIC/STATIC MODELS
- PART III COMPUTATIONAL TECHNIQUES
- 9 Matrix methods
- 10 Optimization: steepest descent method
- 11 Conjugate direction/gradient methods
- 12 Newton and quasi-Newton methods
- PART IV STATISTICAL ESTIMATION
- PART V DATA ASSIMILATION: STOCHASTIC/STATIC MODELS
- PART VI DATA ASSIMILATION: DETERMINISTIC/DYNAMIC MODELS
- PART VII DATA ASSIMILATION: STOCHASTIC/DYNAMIC MODELS
- PART VIII PREDICTABILITY
- Epilogue
- References
- Index
10 - Optimization: steepest descent method
from PART III - COMPUTATIONAL TECHNIQUES
Published online by Cambridge University Press: 18 December 2009
Summary
In Chapters 5 and 7 we formulated the least squares problem: the minimization, with respect to the state variable x, of the residual norm f(x) = ∥r(x)∥, where r(x) = (z − Hx) in (5.1.11) and (7.1.3). There are essentially two mathematically equivalent approaches to this minimization. In the first, we compute the gradient ∇f(x) and obtain the minimizer x by solving ∇f(x) = 0; we then check that the Hessian ∇²f(x) is positive definite to guarantee that x is indeed a local minimum. In the linear least squares problem of Chapter 5, f(x) is a quadratic function of x, and hence ∇f(x) = 0 leads to a linear system of the type Ax = b with A a symmetric and positive definite matrix (refer to (5.1.17)), which can be solved by the methods described in Chapter 9. In the nonlinear least squares problem, f(x) may be highly nonlinear (far beyond quadratic). In this case we can compute x by solving the nonlinear algebraic system ∇f(x) = 0 and then checking the positive definiteness of the Hessian ∇²f(x). Alternatively, we can approximate f(x) locally around a current operating point, say xc, by a quadratic form Q(y) (using either the first-order or the second-order method described in Chapter 7), where y = (x − xc).
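As a minimal sketch of the chapter's topic (not the book's code; the matrix A, vector b, tolerance, and function names below are illustrative placeholders), the following applies steepest descent with exact line search to the quadratic f(x) = ½xᵀAx − bᵀx that arises in the linear case, whose gradient is ∇f(x) = Ax − b and whose Hessian is the symmetric positive definite matrix A:

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-8, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A by steepest descent with exact line search.
    Since grad f(x) = A x - b, the descent direction is the
    negative gradient r = b - A x (the residual)."""
    x = x0.astype(float)
    for k in range(max_iter):
        r = b - A @ x                    # r = -grad f(x)
        if np.linalg.norm(r) < tol:      # grad f(x) = 0 at the minimizer
            break
        alpha = (r @ r) / (r @ (A @ r))  # exact step length along r
        x = x + alpha * r
    return x, k

# Hypothetical 2x2 SPD example; the minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star, iters = steepest_descent(A, b, np.zeros(2))
print(x_star, iters)  # agrees with np.linalg.solve(A, b)
```

Because A is symmetric positive definite, the iteration converges to the unique solution of Ax = b, which is exactly the stationary point ∇f(x) = 0 discussed above.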
Dynamic Data Assimilation: A Least Squares Approach, pp. 169–189. Publisher: Cambridge University Press. Print publication year: 2006.