**We have** run the benchmark on the libraries Lib1, Lib2, and Lib3 of the COCONUT Environment benchmark, which together contain more than 1000 test problems. We removed some test problems from the 2003 benchmark that had incompatible DAG formats, leaving a total of 1286 test problems. Below you will find a summary of the benchmark results and the automatically generated performance profile plots comparing all solvers on the test libraries. The performance measure is based on the objective function value.
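As a rough illustration of how such performance profiles are computed (in the style of Dolan and Moré), the sketch below uses hypothetical data; the actual profiles are generated automatically by the Test Environment, and the real performance measure here is based on the objective function value rather than run time.

```python
# Sketch of a performance profile computation (hypothetical data, not the
# benchmark's actual implementation). Lower measure values are better;
# math.inf marks a failed/rejected problem.
import math

def performance_profile(measures, taus):
    """measures: dict solver -> list of per-problem performance values.
    Returns dict solver -> list of rho(tau), the fraction of problems on
    which the solver is within a factor tau of the best solver."""
    solvers = list(measures)
    n = len(next(iter(measures.values())))
    # best (smallest) value per problem over all solvers
    best = [min(measures[s][p] for s in solvers) for p in range(n)]
    profiles = {}
    for s in solvers:
        ratios = [measures[s][p] / best[p] if best[p] > 0 else math.inf
                  for p in range(n)]
        profiles[s] = [sum(r <= tau for r in ratios) / n for tau in taus]
    return profiles

# Toy example: two solvers, three problems; solver "A" fails on problem 3.
data = {"A": [1.0, 2.0, math.inf], "B": [2.0, 2.0, 4.0]}
print(performance_profile(data, [1.0, 2.0]))
```

The profile value at tau = 1 is the fraction of problems on which a solver was the best; as tau grows, the curve approaches the fraction of problems the solver handled at all.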

**The tested** solvers in alphabetical order are: BARON 8.1.5 (global solver), COCOS (global), COIN with Ipopt 3.6/Bonmin 1.0 (local solver), CONOPT 3 (local), KNITRO 5.1.2 (local), Lindoglobal 6.0 (global), MINOS 5.51 (local), Pathnlp 4.7 (local).

**COCOS** and KNITRO accepted (almost) all test problems. The other solvers also accepted the majority of the problems; MINOS accepted the fewest, namely 81%. A typical reason why some solvers reject a problem is that the constraints or the objective function cannot be evaluated at the starting point x=0 because of expressions like 1/x or log(x). Some solvers, such as BARON, also reject problems in which sin or cos occurs in any expression.
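A minimal illustration of this rejection mechanism (a made-up constraint, not one from the benchmark): an expression containing log(x) or 1/x simply cannot be evaluated at x = 0, so a solver that probes the default starting point must give up on the problem.

```python
# Hypothetical constraint expression that is undefined at the default
# starting point x = 0, mimicking why some solvers reject such problems.
import math

def constraint(x):
    return math.log(x) + 1.0 / x  # both terms undefined at x = 0

try:
    constraint(0.0)
except (ValueError, ZeroDivisionError) as e:
    # math.log(0.0) raises ValueError ("math domain error")
    print("evaluation failed at x = 0:", e)
```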

**Lindoglobal** has the best score (79%) for correctly claimed global numerical solutions among the global numerical solutions found; COCOS is second with 76%, and BARON third with 69%. However, note that Lindoglobal made wrong solution claims on 15% of the problems, compared with 8% for BARON. Not surprisingly, the local solvers scored poorly at claiming global numerical solutions, since they are not global solvers; on the other hand, they had a low percentage of wrong solutions, between 3% and 8%. Their score in claiming global numerical solutions is nonzero because for some LP problems they can claim globality of the solution.

**BARON** found the most global numerical solutions among all accepted feasible problems. The local solver COIN also performed very well in this respect, almost at the level of the global solver Lindoglobal, and the other solvers are not far behind. New results with updated versions are uploaded here continuously.

**All results** can be downloaded from the Optimization Test Environment DOWNLOAD page. For more details, see the Optimization Test Environment paper: