Algorithms for Solving Nonlinear Constrained and Optimization Problems:
The State of the Art


The goal of this document is to summarize the state of the art of algorithms for solving nonlinear constrained and optimization problems. These problems have received attention in different research areas, and as a result different approaches exist for solving them. Each of the chapters below summarizes the techniques developed in one particular area.

In Chapter 2 a summary of nonlinear local optimization techniques is given. It covers the state of the art in the area of traditional numerical analysis. Of the techniques described in this document, this area is the oldest and most established. The algorithms are able to handle large-scale nonlinear optimization problems. However, as the title indicates, these algorithms perform local optimization: the solutions they produce are locally optimal, but not necessarily globally optimal. In certain problem classes local optimality implies global optimality; this is the case, for example, for convex problems. For these problems nonlinear local optimization algorithms can provide good approximations to global optima. Note, however, that these solutions remain approximations: local optimization algorithms do not provide conservative bounds on rounding errors.
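To illustrate the distinction between local and global optimality, the following sketch runs plain gradient descent on a one-dimensional function with two minima. The function, step size, and starting points are assumptions chosen for illustration only; they do not represent the algorithms of Chapter 2.

```python
def f(x):
    """A function with two local minima: a global one near x = -1.30
    and a non-global one near x = 1.13."""
    return x**4 - 3*x**2 + x

def df(x):
    """Exact derivative of f."""
    return 4*x**3 - 6*x + 1

def gradient_descent(x, step=0.05, iters=500):
    """Fixed-step gradient descent: converges to a nearby local minimum."""
    for _ in range(iters):
        x -= step * df(x)
    return x

# Two starting points converge to two different local minima;
# only the left one is the global minimum of f.
x_left = gradient_descent(-1.0)    # near -1.30
x_right = gradient_descent(+1.0)   # near +1.13
```

The point of the example is that a local method delivers whichever minimum its starting point leads to; nothing in the iteration certifies that a better minimum does not exist elsewhere.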

Many techniques for solving nonlinear problems use derivative information. For a long time derivative information was obtained by numerical approximation; for example, finite differences were used to approximate gradients. About 20 years ago a different technique, called automatic differentiation, was developed. It computes derivatives that are exact up to machine precision, provided the computer code that defines the functions is available. Automatic differentiation has since found its way into the area of numerical analysis. It is also used extensively by interval arithmetic and constraint satisfaction techniques. In that setting the derivatives are evaluated not at a given point but over a given box; by taking proper care of rounding errors, this process can produce conservative enclosures of the derivatives over the box. In this project we focus on problems that can be stated in the form of arithmetic expressions. In Chapter 3 we review the state of the art in automatic differentiation for this type of problem.
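As a flavour of the idea behind automatic differentiation, the following sketch implements forward-mode differentiation with dual numbers: every arithmetic operation propagates the derivative alongside the value, so the result is exact up to machine precision. This is a toy supporting only +, - and *, not the tools reviewed in Chapter 3.

```python
class Dual:
    """Dual number a + b*eps with eps**2 = 0; `der` carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.der - o.der)

    def __rsub__(self, o):
        return Dual(o) - self

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule, applied once per multiplication.
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x with derivative seed 1 and read off f'(x)."""
    return f(Dual(x, 1.0)).der

# d/dx (x^3 - 2x) at x = 2 is 3*4 - 2 = 10, exactly.
print(derivative(lambda x: x*x*x - 2*x, 2.0))
```

The same overloading idea, applied to intervals instead of machine numbers, yields the conservative derivative enclosures mentioned above.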

Over the last decade a number of researchers have become interested in the area of nonlinear global optimization. Chapter 4 provides an overview of the techniques that have been developed so far for solving these problems and puts a number of methods, including heuristic ones, in perspective. The chapter summarizes a number of results on optimality conditions and certificates, which most global optimization methods exploit in one form or another. It discusses the various branching methods that can be used and also describes techniques related to linear programming and mixed integer linear programming.
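The branch-and-bound principle underlying many global optimization methods can be sketched as follows. This illustration uses a simple Lipschitz-constant lower bound rather than the more sophisticated bounds of Chapter 4; the objective, the Lipschitz constant, and the tolerance are assumptions chosen for illustration.

```python
import heapq

def f(x):
    """Objective with a global minimum near x = -1.30 (value ~ -3.51)."""
    return x**4 - 3*x**2 + x

def minimize(a, b, lip, tol=1e-4):
    """Branch-and-bound minimization of f on [a, b], assuming f is
    lip-Lipschitz there.  Lower bound on a box: f(mid) - lip*width/2."""
    mid = (a + b) / 2
    best_x, best_val = mid, f(mid)          # incumbent solution
    heap = [(best_val - lip * (b - a) / 2, a, b)]
    while heap:
        lb, lo, hi = heapq.heappop(heap)
        if lb > best_val - tol:
            continue                        # box cannot improve: prune it
        m = (lo + hi) / 2                   # branch: split the box in two
        for c_lo, c_hi in ((lo, m), (m, hi)):
            c_mid = (c_lo + c_hi) / 2
            c_val = f(c_mid)
            if c_val < best_val:
                best_x, best_val = c_mid, c_val   # better incumbent found
            c_lb = c_val - lip * (c_hi - c_lo) / 2
            if c_lb < best_val - tol:
                heapq.heappush(heap, (c_lb, c_lo, c_hi))
    return best_x, best_val

x_star, f_star = minimize(-2.0, 2.0, lip=21.0)
```

Unlike the local method of Chapter 2, the final value is guaranteed to be within the tolerance of the global minimum, because every discarded box carried a proof that it could not contain a better point.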

The area of constraint programming solves optimization problems by exploiting the constraints of the problem to eliminate infeasible or suboptimal instantiations. This area has long focused mainly on discrete problems; the interest in continuous-domain problems is relatively recent. In contrast to interval arithmetic and global optimization, this area has concentrated mainly on algorithms for propagating nonlinear constraints. These constraint propagation techniques are reviewed in Chapter 5.
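To give the flavour of propagation over continuous domains, the following sketch contracts interval domains under the single constraint x + y = z, iterating the three interval projections to a fixed point. Real propagators, reviewed in Chapter 5, handle arbitrary nonlinear constraints and take directed rounding into account, which this toy omits.

```python
def intersect(a, b):
    """Intersection of two intervals given as (lo, hi) pairs."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("constraint is infeasible on these domains")
    return (lo, hi)

def contract_sum(x, y, z):
    """Contract interval domains under x + y = z (hull consistency).
    Each projection removes values of one variable that are consistent
    with no values of the other two; repeat until nothing shrinks."""
    while True:
        x2 = intersect(x, (z[0] - y[1], z[1] - y[0]))   # x = z - y
        y2 = intersect(y, (z[0] - x2[1], z[1] - x2[0])) # y = z - x
        z2 = intersect(z, (x2[0] + y2[0], x2[1] + y2[1]))  # z = x + y
        if (x2, y2, z2) == (x, y, z):
            return x, y, z
        x, y, z = x2, y2, z2

# x in [0,10], y in [2,4], x + y = 5  ==>  x is narrowed to [1,3].
print(contract_sum((0.0, 10.0), (2.0, 4.0), (5.0, 5.0)))
```

Note that propagation only shrinks domains; it never excludes a feasible point, which is what makes it safe to combine with the search techniques of the other chapters.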

The various techniques described in this document are frequently used in isolation. However, no single technique outperforms all the others or is more general than they are; the techniques turn out to be complementary in performance and applicability. For example, approximations by linear interval systems usually perform well close to a solution, whereas propagation techniques are often most effective far from a solution. Also, some techniques may not be applicable to some problems: linear programming techniques cannot solve nonlinear problems, but they can be used for the linear part of a nonlinear problem. The state of the art in solver cooperation is presented in Chapter 6.

In a number of engineering applications, it is not possible or desirable to find an optimal solution. Instead one wishes to explore the set of solutions of a problem and incrementally refine the problem formulation. This is often the case in the area of engineering design, for example with non-routine design problems. The constraint propagation techniques described in Chapter 5 are well adapted to this type of problem. However, the enclosure of the solution set that they provide may not be very accurate. To address this problem a number of complete enumeration techniques have been developed. They are reviewed in Chapter 7.
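The idea of enclosing a whole solution set can be sketched as follows: the feasible region of x² + y² ≤ 1 is paved with boxes by interval evaluation and bisection, separating boxes proven feasible from small undecided boxes along the boundary. This is an illustrative toy, not a technique from Chapter 7, and it ignores directed rounding.

```python
def range_sq_sum(box):
    """Natural interval evaluation of x^2 + y^2 over a box
    ((xl, xh), (yl, yh)); rounding is ignored for brevity."""
    (xl, xh), (yl, yh) = box
    def sq(lo, hi):  # interval square: [lo, hi]^2
        if lo <= 0.0 <= hi:
            return (0.0, max(lo*lo, hi*hi))
        return (min(lo*lo, hi*hi), max(lo*lo, hi*hi))
    sx, sy = sq(xl, xh), sq(yl, yh)
    return (sx[0] + sy[0], sx[1] + sy[1])

def pave(box, eps):
    """Cover the solution set of x^2 + y^2 <= 1 with boxes: 'inner'
    boxes are proven entirely feasible, 'boundary' boxes (width <= eps)
    may straddle the boundary; infeasible boxes are discarded."""
    inner, boundary, stack = [], [], [box]
    while stack:
        b = stack.pop()
        (xl, xh), (yl, yh) = b
        lo, hi = range_sq_sum(b)
        if hi <= 1.0:
            inner.append(b)                    # whole box is feasible
        elif lo > 1.0:
            continue                           # whole box is infeasible
        elif max(xh - xl, yh - yl) <= eps:
            boundary.append(b)                 # undecided, but small enough
        elif xh - xl >= yh - yl:               # bisect along the widest side
            m = (xl + xh) / 2
            stack += [((xl, m), (yl, yh)), ((m, xh), (yl, yh))]
        else:
            m = (yl + yh) / 2
            stack += [((xl, xh), (yl, m)), ((xl, xh), (m, yh))]
    return inner, boundary

inner, boundary = pave(((-2.0, 2.0), (-2.0, 2.0)), 0.1)
```

Shrinking eps refines the enclosure: the inner boxes approach the solution set from inside while inner plus boundary boxes approach it from outside.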