Book contents
- Frontmatter
- Contents
- Foreword by Roger Brockett
- Foreword by Matthew Mason
- Preface
- 1 Preview
- 2 Configuration Space
- 3 Rigid-Body Motions
- 4 Forward Kinematics
- 5 Velocity Kinematics and Statics
- 6 Inverse Kinematics
- 7 Kinematics of Closed Chains
- 8 Dynamics of Open Chains
- 9 Trajectory Generation
- 10 Motion Planning
- 11 Robot Control
- 12 Grasping and Manipulation
- 13 Wheeled Mobile Robots
- A Summary of Useful Formulas
- B Other Representations of Rotations
- C Denavit–Hartenberg Parameters
- D Optimization and Lagrange Multipliers
- Bibliography
- Index
D - Optimization and Lagrange Multipliers
Published online by Cambridge University Press: 04 June 2024
Summary
Suppose that x* ∈ R is a local minimum of a twice-differentiable objective function f(x), f : R → R, in the sense that for all x near x*, we have f(x) ≥ f(x*). We can then expect that the slope of f(x) at x* is zero, i.e.,

    (df/dx)(x*) = 0,

and also that

    (d²f/dx²)(x*) ≥ 0.

If f is multi-dimensional, i.e., f : R^n → R, and all partial derivatives of f exist up to second order, then a necessary condition for x* ∈ R^n to be a local minimum is that its gradient be zero:

    ∇f(x*) = [∂f/∂x_1(x*) ⋯ ∂f/∂x_n(x*)]^T = 0.
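As a quick numerical illustration of the gradient condition (my own example, not from the text), consider the convex objective f(x) = (x_1 − 1)² + (x_2 + 2)², whose gradient vanishes exactly at the minimum x* = (1, −2):

```python
import numpy as np

def f(x):
    # Simple convex objective with known minimum at x* = (1, -2).
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def grad_f(x):
    # Analytic gradient of f.
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)])

x_star = np.array([1.0, -2.0])
assert np.allclose(grad_f(x_star), 0.0)   # first-order necessary condition
assert f(x_star) <= f(x_star + 0.01)      # nearby points are no better
```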
For example, consider the linear equation Ax = b, where A ∈ R^{m×n} and b ∈ R^m (m > n) are given. Because there are more constraints (m) than variables (n), in general a solution to Ax = b will not exist. Suppose we seek the x that best approximates a solution, in the sense of satisfying

    min_x ‖Ax − b‖² = min_x (Ax − b)^T (Ax − b).

The first-order necessary condition is given by

    A^T A x* − A^T b = 0.    (D.1)

If rank A = n then A^T A ∈ R^{n×n} is invertible, and the solution to (D.1) is

    x* = (A^T A)^{−1} A^T b.
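The normal-equation solution x* = (A^T A)^{−1} A^T b can be checked numerically against NumPy's built-in least-squares solver; the matrix A and vector b below are arbitrary example data of mine, with m = 4 constraints and n = 2 unknowns:

```python
import numpy as np

# Overdetermined system Ax = b with m = 4, n = 2 (illustrative data).
A = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
b = np.array([1.0, 2.0, 2.5, -0.5])

# Normal-equation solution x* = (A^T A)^{-1} A^T b, valid when rank A = n.
x_star = np.linalg.inv(A.T @ A) @ (A.T @ b)

# Cross-check against NumPy's least-squares routine.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_star, x_lstsq)

# x* satisfies the first-order condition (D.1): A^T A x* - A^T b = 0.
assert np.allclose(A.T @ A @ x_star - A.T @ b, 0.0)
```

In practice one solves the normal equations (or, better, uses a QR or SVD-based routine such as `np.linalg.lstsq`) rather than forming the explicit inverse, which is less accurate numerically.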
Now suppose that we wish to find, among all x ∈ R^n that satisfy g(x) = 0 for some differentiable g : R^n → R^m (typically m ≤ n to ensure that there exists an infinity of solutions to g(x) = 0), the x* that minimizes the objective function f(x). Suppose that x* is a local minimum of f that is also a regular point of the surface parametrized implicitly by g(x) = 0, i.e., x* satisfies g(x*) = 0 and

    rank (∂g/∂x)(x*) = m.

Then, from the fundamental theorem of linear algebra, it can be shown that there exists some λ* ∈ R^m (called the Lagrange multiplier) that satisfies

    ∇f(x*) + (∂g/∂x(x*))^T λ* = 0.    (D.2)

Equation (D.2) together with g(x*) = 0 constitute the first-order necessary conditions for x* to be a feasible local minimum of f(x). Note that these two equations represent n + m equations in the n + m unknowns x and λ.
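As a sketch of how these n + m conditions are used (my own illustration, not the book's example): for the quadratic objective f(x) = x^T x and a single linear constraint g(x) = a^T x − b = 0, conditions (D.2) and g(x*) = 0 become a linear system in (x, λ) that can be solved directly:

```python
import numpy as np

# Minimize f(x) = x^T x subject to g(x) = a^T x - b = 0 (illustrative data).
a = np.array([1.0, 2.0])
b = 5.0
n, m = 2, 1

# Conditions (D.2) and g(x*) = 0 read: 2x + a*lam = 0 and a^T x = b,
# i.e. the (n + m) x (n + m) linear system
#   [ 2I   a ] [ x  ]   [ 0 ]
#   [ a^T  0 ] [ lam] = [ b ]
K = np.block([[2.0 * np.eye(n), a.reshape(n, 1)],
              [a.reshape(1, n), np.zeros((1, 1))]])
rhs = np.concatenate([np.zeros(n), [b]])
sol = np.linalg.solve(K, rhs)
x_star, lam_star = sol[:n], sol[n:]
# x* = [1, 2] (the closest point to the origin on the line a^T x = b),
# lam* = -2.
```

Geometrically, the solution is the projection of the origin onto the constraint line, consistent with the closed form x* = b a / (a^T a).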
As an example, consider the quadratic objective function f(x) such that
Modern Robotics: Mechanics, Planning, and Control, pp. 516–518. Publisher: Cambridge University Press. Print publication year: 2017.