COCONUT Environment

An open source platform for interval global optimization

Overview

Introduction

The COCONUT Environment is a modular environment for developing solvers for nonlinear continuous global optimization problems. It has an open-source kernel that can be extended with commercial and open-source solver components (inference engines). The COCONUT Environment provides rigorous computational tools based on interval arithmetic to promote the development of reliable computational methods.

The first test version of the environment was released in 2004 by the COCONUT consortium. Since then it has been under continuous improvement.

The full source of the current development version is available via a Subversion repository. In addition, we provide precompiled binaries of the converters and of the two available global optimization solvers. For more information on obtaining the COCONUT sources and binaries, please visit our Download section.

Top-level tools

We offer two types of software developed within the COCONUT Environment: solvers and converters. These tools can be used even by those who are not interested in developing optimization algorithms.

At present we have two solvers, both based on interval methods and providing fully rigorous solutions. coco_gop_ex is a solver for bound-constrained (continuous) global optimization. cocos is a demo solver for generally constrained problems. See our Solvers page for more information on these solvers.

The other executables we provide convert optimization models between various modelling languages (AMPL, GAMS, etc.), software-specific model formats (LGO, GloptLab, GlobSol), and the internal DAG format of COCONUT. The complete list of converters and additional information on using them can be found on the Converters page of this site.

Core components

We designed several utility libraries to support the efficient implementation of problem representation and solving: graphs are implemented using the Vienna Graph Template Library (VGTL), and the search database is based on the Vienna DataBase Library (VDBL). Vector and matrix operations, with a focus on efficient interval operations and on support for advanced linear algebra tools such as LAPACK, are implemented in the Vienna Matrix Template Library (VMTL).

The central component of COCONUT is the application programmer's interface (API), which is designed to make the development of the various module types independent of each other and independent of the internal model representation. It is a collection of open-source C++ classes protected by the LGPL and GPL license models, so that most of it can be used as part of commercial software (special license regulations are contained in the distribution and can be read here). It uses the FILIB++ library for interval computations. The solution algorithm is an advanced branch-and-bound scheme which proceeds by working on the search graph, a directed acyclic graph (DAG) of search nodes, each representing an optimization problem (a model). The search nodes come in two flavors: full nodes, which record the complete description of a model, and delta nodes, which only contain the difference between the model represented by the node and its (then only) parent.
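
To illustrate the two node flavors (the names below are invented for this sketch and are not taken from the actual COCONUT classes), a minimal C++ outline might look as follows:

    #include <memory>
    #include <vector>

    // Hypothetical sketch only: the real search-graph classes in the
    // COCONUT API are considerably richer.
    struct model { /* complete problem description, stored as a DAG */ };
    struct model_delta { /* difference between a model and its parent model */ };

    struct search_node {
        std::vector<search_node*> parents;  // the search graph is a DAG,
                                            // so a node may gain several parents
        virtual ~search_node() = default;
    };

    // A full node records the complete description of a model.
    struct full_node : search_node {
        std::shared_ptr<model> m;
    };

    // A delta node stores only the change relative to its (then only) parent.
    struct delta_node : search_node {
        std::shared_ptr<model_delta> change;
    };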

Figure: the search graph

The optimization problems stored in the work nodes, which are passed to the various inference engines, are kept as directed acyclic graphs (DAGs) as well. This representation has significant advantages: a complete optimization problem is always represented by a single DAG. The vertices of the graph represent operators, similar to computational trees; constants and variables are the sources, and the objective and constraints are the sinks of the DAG.

Every vertex represents a real-valued function of n variables. Predefined functions include sum, product, max, min, elementary real functions (exp, log, pow, sqrt, ...), and also some discrete operators like all_diff and count.
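
As a rough illustration (the type and field names below are invented for this sketch and are not those of the COCONUT DAG classes), a vertex of such an expression DAG could be modelled like this:

    #include <vector>

    // Sketch of an expression-DAG vertex: constants and variables have no
    // arguments (sources), while objective and constraint roots are sinks.
    // (COCONUT also provides discrete operators such as all_diff and count.)
    enum class op_kind { constant, variable, sum, product,
                         max_op, min_op, exp_op, log_op, pow_op, sqrt_op };

    struct dag_node {
        op_kind kind;
        double value = 0.0;                 // used when kind == constant
        int variable_index = -1;            // used when kind == variable
        std::vector<const dag_node*> args;  // shared subexpressions may appear
                                            // as arguments of several vertices
    };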

For expression graphs (DAG or tree), special forward and backward evaluators are provided. Currently implemented are real function values, function ranges, gradients (real, interval), Hessians (real, interval), and slopes. In the near future, evaluators for second-order slopes will be provided as well.
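
For example, evaluating the range of a function by replacing each operation with its interval extension (forward evaluation) works roughly as in the following self-contained sketch; the tiny interval type here is purely illustrative, whereas COCONUT relies on FILIB++, which additionally controls the rounding mode so that the enclosures are rigorous:

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Minimal interval type for illustration; no directed rounding.
    struct interval {
        double lo, hi;
    };

    interval operator+(interval a, interval b) { return {a.lo + b.lo, a.hi + b.hi}; }

    interval operator*(interval a, interval b) {
        double p[4] = {a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi};
        return {*std::min_element(p, p + 4), *std::max_element(p, p + 4)};
    }

    interval exp(interval a) { return {std::exp(a.lo), std::exp(a.hi)}; }  // exp is monotone

    int main() {
        // Forward evaluation of f(x) = x*x + exp(x) over x in [0, 1]:
        // each operation is replaced by its interval extension, which
        // yields an enclosure of the range of f (here roughly [1, 3.72]).
        interval x{0.0, 1.0};
        interval f = x * x + exp(x);
        std::printf("f([0,1]) is enclosed by [%g, %g]\n", f.lo, f.hi);
        return 0;
    }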

The strategy engine consists of the logic core ("search"), which is essentially the main solution loop. It calls the management modules, the report modules, and the inference engines in a sequence defined by programmable search strategies.
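
Schematically, and with purely hypothetical interfaces (the actual logic core is driven by the interpreted strategy language described below), this loop can be pictured as follows:

    #include <vector>

    // Hypothetical interfaces, for illustration only.
    struct work_node { /* the model currently being processed */ };
    struct delta { /* a proposed change to that model */ };

    struct inference_engine {
        virtual std::vector<delta> infer(const work_node&) = 0;
        virtual ~inference_engine() = default;
    };
    struct management_module {
        virtual void apply(work_node&, const delta&) = 0;
        virtual ~management_module() = default;
    };
    struct report_module {
        virtual void report(const work_node&) = 0;
        virtual ~report_module() = default;
    };

    // Skeleton of one step of the solution loop: let the inference engines
    // propose changes, apply the accepted ones through a management module,
    // and emit progress through a report module.  Node selection, acceptance
    // of changes, and termination are decided by the programmable strategy.
    void search_step(work_node& node,
                     const std::vector<inference_engine*>& engines,
                     management_module& manager,
                     report_module& reporter) {
        for (inference_engine* e : engines)
            for (const delta& d : e->infer(node))
                manager.apply(node, d);   // a real strategy would filter here
        reporter.report(node);
    }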

Figure: the component framework

The engine can be programmed using a simple strategy language, an interpreted language based on Python. Since it is interpreted, (semi-)interactive and automatic solution processes are possible, and even debugging and single-stepping of strategies is supported. The language is object-oriented and garbage-collected, and provides dynamically typed objects. These features make the system easily extendable.

For the solution strategy, the most important class of modules is that of the inference engines. They provide the computational base for the algorithm, namely methods for problem structure analysis, local optimization, constraint propagation, interval analysis, linear relaxation, convex optimization, bisection, and so on.

Corresponding to every type of problem change, a class of inference engines is designed: model analysis (e.g., finding convex parts), model reduction (e.g., pruning, fathoming), model relaxation (e.g., linear relaxation), model splitting (e.g., bisection), model gluing (e.g., undoing excessive splitting), and computation of local information (e.g., probing, local optimization).

Inference engines calculate changes to a model that do not change its solution set, but they never modify the model themselves; the decision whether to apply the changes, if they are considered useful, is left to the strategy engine. The inference engines are implemented as subclasses of a single C++ base class. In addition, a fixed documentation structure is defined for them.
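
As an illustration of this contract (all class and member names below are invented for the sketch and are not the actual COCONUT base class), an inference engine only returns proposed changes and leaves the model untouched:

    #include <cstddef>
    #include <vector>

    // Toy model consisting of variable bounds only.
    struct model {
        std::vector<double> lower, upper;
    };

    // A proposed, not yet applied, change: tighten the bounds of one variable.
    struct bound_delta {
        std::size_t variable;
        double new_lower, new_upper;
    };

    // Single (hypothetical) base class for all inference engines.
    class inference_engine {
    public:
        virtual ~inference_engine() = default;
        // Inspect the model and return solution-set-preserving changes;
        // the model is passed as const and never modified here.
        virtual std::vector<bound_delta> infer(const model& m) const = 0;
    };

    // Example: interval constraint propagation for one constraint x0 + x1 <= c.
    // From x0 <= c - x1 and x1 >= lower[1] it follows that x0 <= c - lower[1];
    // the analogous bound holds for x1.  Neither tightening removes a feasible
    // point, so the solution set is unchanged.
    class sum_le_engine : public inference_engine {
    public:
        explicit sum_le_engine(double c) : c_(c) {}
        std::vector<bound_delta> infer(const model& m) const override {
            std::vector<bound_delta> proposals;
            double u0 = c_ - m.lower[1];
            if (u0 < m.upper[0]) proposals.push_back({0, m.lower[0], u0});
            double u1 = c_ - m.lower[0];
            if (u1 < m.upper[1]) proposals.push_back({1, m.lower[1], u1});
            return proposals;
        }
    private:
        double c_;
    };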

Several state-of-the-art techniques are provided: a wrapper for DONLP3, a general-purpose nonlinear local optimizer for continuous variables; STOP, a heuristic starting point generator; a Karush-John condition generator using symbolic differentiation; a Point Verifier for verifying solution points; an Exclusion Box generator, calculating an exclusion region around local optima; interval constraint propagation; a Linear Relaxation generator using slopes; the CPLEX engine, a wrapper for the state-of-the-art commercial linear programming solver by IBM ILOG; the LP_solve engine, a wrapper for the public domain LP solver; the XPRESS engine, a wrapper for another state-of-the-art commercial linear programming solver by FICO; a Basic Splitter computing difficult variables and split variants; BCS, a box covering solver; convexity detection for simple convexity analysis; and more.

Management modules are the interface between the strategy engine and the internal representation of data and modules, taking care of the management of models, resources, initialization, the search graph, the search database, and so on.

They are provided to make it possible to change the implementation of the search graph and the internal representation of problems without having to change all of the modules. Management modules just perform some of the changes which have been advertised by inference engines; they never calculate anything.

The final class of modules, called report modules, produces output: human- or machine-readable progress indicators, solution reports, and the interfaces to modeling languages (currently only AMPL and GAMS are supported). Among the report modules there is also an AMPL-to-GAMS translator, which supports many but not all AMPL constructs.

Links

The COCONUT Environment uses a number of external libraries. The following links point to the respective sources.
A number of inference modules are based on commercial software or on software which is not in the public domain. Below are the contact addresses for the corresponding software packages.
Complete interfaces to the modeling systems AMPL and GAMS are available.

Acknowledgments

This site contains material provided by other partners in the COCONUT project, notably by Arnold Neumaier, Oleg Shcherbina, Eric Monfroy, and Brice Pajot. I also want to express special thanks to Tuan-Viet Nguyen, Xuan-Ha Vu, Boglarka Toth, Tamas Vinko, Knut Petras, and Christian Keil.