Book contents
- Frontmatter
- Dedication
- Contents
- List of Figures
- List of Tables
- Preface
- Acknowledgments
- 1 Model and Analysis
- 2 Basics of Probability and Tail Inequalities
- 3 Warm-up Problems
- 4 Optimization I: Brute Force and Greedy Strategy
- 5 Optimization II: Dynamic Programming
- 6 Searching
- 7 Multidimensional Searching and Geometric Algorithms
- 8 String Matching and Finger Printing
- 9 Fast Fourier Transform and Applications
- 10 Graph Algorithms
- 11 Maximum Flow and Applications
- 12 NP Completeness and Approximation Algorithms
- 13 Dimensionality Reduction*
- 14 Parallel Algorithms
- 15 Memory Hierarchy and Caching
- 16 Streaming Data Model
- Appendix A Recurrences and Generating Functions
- Bibliography
- Index
4 - Optimization I: Brute Force and Greedy Strategy
Published online by Cambridge University Press: 27 April 2019
Summary
Optimization problems are used to model many real-life problems; solving them is therefore one of the most important goals of algorithm design. A general optimization problem is defined by specifying a set of constraints, which carves out a subset of some underlying space (such as Euclidean space) called the feasible set, and an objective function that we are trying to maximize or minimize, as the case may be, over the feasible set. The difficulty of solving such a problem typically depends on how ‘complex’ the feasible set and the objective function are.
For example, a very important class of optimization problems is linear programming. Here the feasible set is specified by a set of linear inequalities (in Euclidean space), and the objective function is also linear. A more general class is convex programming, where the feasible set is a convex subset of a Euclidean space and the objective function is also convex. Convex programs (and hence linear programs) have the nice property that any local optimum is also a global optimum of the objective function. There is a variety of techniques for solving such problems; all of them try to approach a local optimum, which, as just noted, is also a global optimum. These notions are discussed in greater detail in a later section of this chapter. The most general problems, the so-called non-convex programs, in which the objective function and the feasible set can be arbitrary, can be very challenging to solve. In particular, discrete optimization problems, where the feasible set may be a (large) discrete set of points, fall into this category.
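As a concrete illustration of the linear programming setting just described (an example of our own, not one taken from the chapter), the following sketch solves a tiny two-variable linear program with SciPy's linprog routine; the particular objective and constraints are invented for the illustration.

```python
# A minimal sketch of a linear program, assuming SciPy is installed.
# The LP itself is invented for illustration:
#   maximize   3x + 2y
#   subject to x + y <= 4,  x + 3y <= 6,  x >= 0,  y >= 0.
from scipy.optimize import linprog

c = [-3, -2]               # linprog minimizes, so we negate the objective
A_ub = [[1, 1], [1, 3]]    # coefficients of the <= constraints
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)     # optimal point (4, 0) with maximum value 12
```

Because the feasible region here is convex, the local optimum the solver reaches is also the global one, exactly as stated above.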
In this chapter, we first discuss some of the most intuitive approaches to solving such problems. We begin with heuristic search approaches, which look for an optimal solution by exploring the feasible set in some principled manner. Subsequently, we introduce the idea of designing algorithms based on the greedy heuristic.
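As a preview of the greedy style of design (the concrete problem is our choice for illustration, not necessarily the one treated in the chapter), here is the classic interval scheduling algorithm: among the requests compatible with the choices made so far, always commit to the one that finishes earliest.

```python
def max_compatible_intervals(intervals):
    """Greedy interval scheduling: repeatedly pick, among the requests
    compatible with those already chosen, the one finishing earliest.
    `intervals` is a list of (start, finish) pairs."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:       # compatible with all chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Expected output: [(1, 3), (3, 6), (6, 8)]
print(max_compatible_intervals([(1, 3), (2, 5), (3, 6), (6, 8)]))
```

This particular greedy rule is provably optimal, which is what makes it a good first example; not every greedy heuristic comes with such a guarantee.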
Heuristic Search Approaches
In heuristic search, we explore the search space in a structured manner. Observe that, in general, the feasible set (also called the set of feasible solutions) can be infinite.
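To make this structured exploration concrete, the following sketch (an illustration under assumptions of our own, not code from the book) performs a backtracking search for the 0-1 knapsack problem: the search tree branches on whether each item is taken, and any branch that already violates the capacity constraint is pruned.

```python
def knapsack_search(weights, values, capacity):
    """Backtracking search for 0-1 knapsack: at depth i the search
    tree branches on taking or skipping item i, and any branch that
    already exceeds the capacity is pruned."""
    n = len(weights)
    best = 0

    def explore(i, weight, value):
        nonlocal best
        if weight > capacity:      # infeasible partial solution: prune
            return
        if i == n:                 # leaf: a complete feasible solution
            best = max(best, value)
            return
        explore(i + 1, weight + weights[i], value + values[i])  # take item i
        explore(i + 1, weight, value)                           # skip item i

    explore(0, 0, 0)
    return best

# Example: taking the items of weight 3 and 4 gives the optimum value 9.
print(knapsack_search([3, 4, 5], [4, 5, 6], 7))
```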
Design and Analysis of Algorithms: A Contemporary Perspective, pp. 54–91. Cambridge University Press. Print publication year: 2019.