A constrained problem difficult for genetic algorithms,
from the sci-opt newsgroup
min -9x_5 - 15x_8 + 6x_1 + 16x_2 + 10(x_6 + x_7)
s.t.
x_1+x_2-x_3-x_4 = 0,
0.03x_1+0.01x_2-x_9(x_3+x_4) = 0,
x_3+x_6-x_5 = 0,
x_4+x_7-x_8 = 0,
x_9x_3+0.02x_6-0.025x_5 <= 0,
x_9x_4+0.02x_7-0.015x_8 <= 0,
(0,0,0,0,0,0,0,0,0.01) <= x <= (300,300,100,200,100,300,100,200,0.03)
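As a quick sanity check on the formulation above, the following Python sketch (the helper names are my own) evaluates the objective and the constraint residuals at a candidate point. Note that the lower-bound corner x = (0,...,0,0.01) already satisfies every constraint, with objective value 0.

```python
# Sketch (my own helper names): evaluate the nonconvex problem stated above.

def objective(x):
    x1, x2, x3, x4, x5, x6, x7, x8, x9 = x
    return -9*x5 - 15*x8 + 6*x1 + 16*x2 + 10*(x6 + x7)

def equalities(x):  # feasible when every entry is 0
    x1, x2, x3, x4, x5, x6, x7, x8, x9 = x
    return [x1 + x2 - x3 - x4,
            0.03*x1 + 0.01*x2 - x9*(x3 + x4),
            x3 + x6 - x5,
            x4 + x7 - x8]

def inequalities(x):  # feasible when every entry is <= 0
    x1, x2, x3, x4, x5, x6, x7, x8, x9 = x
    return [x9*x3 + 0.02*x6 - 0.025*x5,
            x9*x4 + 0.02*x7 - 0.015*x8]

LOWER = (0, 0, 0, 0, 0, 0, 0, 0, 0.01)
UPPER = (300, 300, 100, 200, 100, 300, 100, 200, 0.03)

def is_feasible(x, tol=1e-9):
    in_box = all(lo <= xi <= up for lo, xi, up in zip(LOWER, x, UPPER))
    return (in_box
            and all(abs(h) <= tol for h in equalities(x))
            and all(g <= tol for g in inequalities(x)))

# The lower-bound corner is trivially feasible, with objective value 0.
print(is_feasible(LOWER), objective(LOWER))
```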
--------------------
Date: 26 Feb 1996 15:47:27 GMT
From: umeca88@ps.ic.ac.uk (Quanshi Xia)
Subject: Is it HOPELESS for Global Optimization by GA?
Hi,
Last month, I posted the following question about Global Optimization
by GA on this Newsgroup:
I am planning to do the global optimization by Genetic Algorithm (GA).
But it seems difficult to handle the general constraints.
The problem is a very general one:
minimize f(x)
x
st. h(x)=0
g(x)<=0
where f(x), h(x) and g(x) may be nonconvex.
I know some people use a penalty to transform the constrained
optimization into an unconstrained one. But I find it can only handle
very 'simple' constraints, and it does not work for some complicated
constraints. For nonconvex equality and inequality constraints, it can
even be impossible to find a feasible solution. Any ideas, comments,
or references?
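For reference, the penalty transformation mentioned above is usually written F(x) = f(x) + mu*(sum_i h_i(x)^2 + sum_j max(0, g_j(x))^2); a minimal sketch, assuming a quadratic penalty shape (the function names and the toy example are my own):

```python
# Quadratic-penalty sketch (my own formulation): fold the constraints into
# the objective so an unconstrained method (e.g. a GA) can be applied.

def penalized(f, h_list, g_list, mu):
    """Return F(x) = f(x) + mu*(sum h_i(x)^2 + sum max(0, g_j(x))^2)."""
    def F(x):
        eq = sum(h(x)**2 for h in h_list)
        ineq = sum(max(0.0, g(x))**2 for g in g_list)
        return f(x) + mu * (eq + ineq)
    return F

# Toy example: minimize x^2 subject to x - 1 = 0 (optimum at x = 1).
F = penalized(lambda x: x**2, [lambda x: x - 1], [], mu=1000.0)
print(F(1.0))   # feasible point: no penalty, just f(1) = 1.0
print(F(0.0))   # infeasible point: f(0) = 0 plus penalty 1000.0
```

The weakness the poster describes shows up here too: mu must be large to force feasibility, but a large mu makes the penalized landscape nearly discontinuous and hard for a GA to search.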
So far I received a few responses. The summary is:
From: mchp5ajs writes:
I don't understand your problem, however, more generally, it has been
argued by Nick Radcliffe (2nd author; I forget the first) that
regarding constraint violation as other functions in Pareto
optimization works better than just penalizing one function.
The GA selects solutions that are "not beaten on every front at once"
i.e. a selected solution has no superior in both fitness *and*
constraints unviolated all at the same time. Goldberg's book on GAs
gives refs for Pareto optimization.
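A minimal sketch of the dominance test described above (my own formulation, treating total constraint violation as a second objective alongside fitness, both minimized):

```python
# Pareto-dominance sketch: a candidate is kept only if no other candidate
# beats it on BOTH fitness and constraint violation (both minimized here).

def dominates(a, b):
    """a = (fitness, violation); a dominates b if it is no worse on both
    fronts and strictly better on at least one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def non_dominated(pop):
    return [p for p in pop if not any(dominates(q, p) for q in pop)]

pop = [(5.0, 0.0), (3.0, 2.0), (4.0, 1.0), (6.0, 3.0)]
print(non_dominated(pop))  # (6.0, 3.0) is beaten on both fronts by (5.0, 0.0)
```

Selection then favors the non-dominated set instead of a single penalized score, so feasibility and fitness trade off explicitly rather than through one penalty weight.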
From: gupta@seas.ucla.edu
How do you guarantee the globality of the
solution obtained through genetic algorithms? I will sincerely
appreciate some references on the topic: about the convergence and
globality of GA's.
From: Maria Cristina Riff
I'm working with CSP problems, that is, Constraint Satisfaction
Problems, but this kind of problem doesn't have a real fitness function
to maximize (or minimize). There is an approach using CSOPs, i.e. CSPs
with a fitness function to maximize; I think that is what you need, no?
The reference is:
TSANG Edward,
Applying Genetic Algorithms to Constraint Satisfaction Optimization
Problems.
Proc. ECAI-90, Pitman Publishing, pp. 649-654, 1990.
From: Olivier Chocron
Just a reference that might help you:
A.C. Thornton, MIT
Genetic Algorithms versus Simulated Annealing: Satisfaction of Large
Sets of Algebraic Mechanical Design Constraints,
AI in Design'94
From: "James A. Larson"
Have you tried Zbigniew Michalewicz's Genocop? It handles function
optimization with constraints. Any function (including nonlinear,
stairstep, etc.) and linear or nonlinear constraints. The only
limitation is that the decision variables must be continuous. Feel
free to email me if this sounds interesting -- source code and various
papers are available at ftp.uncc.edu directory coe/evol.
From: Zbigniew Michalewicz
Thank you for your mail. Try to get the
requested paper from ftp.uncc.edu, directory coe/evol, file p17.ps.
You can try also paper p20.ps, which is a journal version of the
previous paper (Journal of Heuristics, Vol.1, No.2, pp. 177-206).
From: Roger Wink
You asked for advice on optimizing a function with constraints using a
genetic algorithm. Constraints can be handled several ways:
1) Transformation of coordinates so that the constraints are always
met (e.g., conformal mapping)
2) True/False testing of constraint conditions and assignment of a
harsh penalty if constraints are not met
3) Assignment of a penalty that depends on the extent to which the
constraints are not met.
In our experience, the third approach is best for most cases.
The second approach is usually the worst. The first approach is
tricky, because it can result in a gene string in which the genes
change meaning from place to place in solution space, so crossovers
don't work well sometimes. If your F(x) is not horribly multimodal,
however, you can almost always scatter initial trial solutions all
over the landscape and hillclimb each one independently to find the
global optimum.
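The scatter-and-hillclimb idea in that last paragraph can be sketched as follows (a toy on a 1-D multimodal function of my own choosing; the step-halving schedule is my assumption, not Wink's recipe):

```python
import math
import random

# Multistart hillclimbing sketch (my own toy setup): scatter starting points
# over the domain, greedily improve each one, keep the best local optimum.

def hillclimb(f, x, lo, hi, step=0.5, shrink=0.5, tol=1e-6):
    while step > tol:
        moved = False
        for cand in (x - step, x + step):
            cand = min(max(cand, lo), hi)   # stay inside the box
            if f(cand) < f(x):
                x, moved = cand, True
        if not moved:
            step *= shrink                  # no improvement at this scale
    return x

def multistart(f, lo, hi, n_starts=20, rng=None):
    rng = rng or random.Random(0)
    starts = [rng.uniform(lo, hi) for _ in range(n_starts)]
    return min((hillclimb(f, s, lo, hi) for s in starts), key=f)

# Multimodal toy: many local minima, global minimum at x = 0 with f(0) = 0.
f = lambda x: x**2 + 3*math.sin(5*x)**2
best = multistart(f, -4.0, 4.0)
print(best, f(best))
```

Each restart can do no worse than its starting point, so with enough scattered starts the best local optimum found tends toward the global one, which is exactly Wink's point about landscapes that are not horribly multimodal.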
From: New Light Industries
Generator is a commercial genetic algorithm package
from New Light Industries, Ltd. You can download a demo version which
works with Excel spreadsheets, from our home page. We're always happy
to discuss specific applications
-- Thank you all for the contribution. However, I still cannot solve
a very simple nonconvex optimization problem:
min_X -9.0*x_{5} - 15.0*x_{8} + 6.0*x_{1} + 16.0*x_{2} + 10.0*(x_{6}+x_{7})
s.t.
x_{1}+x_{2}-x_{3}-x_{4} = 0
0.03x_{1}+0.01x_{2}-x_{9}(x_{3}+x_{4}) = 0
x_{3}+x_{6}-x_{5} = 0
x_{4}+x_{7}-x_{8} = 0
x_{9}x_{3}+0.02x_{6}-0.025x_{5} <= 0
x_{9}x_{4}+0.02x_{7}-0.015x_{8} <= 0
in which
(0,0,0,0,0,0,0,0,0.01) <= X <= (300,300,100,200,100,300,100,200,0.03)
I have tried
(1) multiobjective optimization with the violation as a secondary
objective,
(2) transforming the violation into the objective by an augmented
Lagrangian multiplier,
(3) just seeking a feasible point with the violation as the objective.
But all failed; it even proved impossible to find a feasible solution
satisfying all constraints. I have tried using ranking selection,
tournament selection, roulette-wheel selection, truncation selection
and elitism. It seems GA is HOPELESS on such a problem. Before I give
up on GA for Global Optimization, I hope someone can offer me a final
straw to grasp. So any comments and suggestions are welcome!!
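For what it's worth, attempt (2) above is usually written as L(x; lam, mu) = f(x) + sum_i lam_i*h_i(x) + (mu/2)*sum_i h_i(x)^2 with the multiplier update lam_i <- lam_i + mu*h_i(x). A minimal sketch in my own notation (the standard textbook form, not necessarily what the poster implemented):

```python
# Augmented-Lagrangian sketch (my own formulation) for equality constraints:
# L(x; lam, mu) = f(x) + sum_i lam_i*h_i(x) + (mu/2)*sum_i h_i(x)^2,
# with the classic multiplier update lam_i <- lam_i + mu*h_i(x).

def augmented_lagrangian(f, h_list, lam, mu):
    def L(x):
        hs = [h(x) for h in h_list]
        return (f(x)
                + sum(l*hv for l, hv in zip(lam, hs))
                + 0.5*mu*sum(hv*hv for hv in hs))
    return L

def update_multipliers(h_list, lam, mu, x):
    return [l + mu*h(x) for l, h in zip(lam, h_list)]

# Toy check: f(x) = x^2 subject to h(x) = x - 1 = 0.
f = lambda x: x**2
h = lambda x: x - 1
L = augmented_lagrangian(f, [h], lam=[0.0], mu=10.0)
print(L(1.0))                                     # feasible: just f(1) = 1.0
print(update_multipliers([h], [0.0], 10.0, 0.5))  # lam <- 0 + 10*(-0.5)
```

Unlike the plain quadratic penalty, mu can stay moderate because the multipliers absorb the constraint forces across outer iterations; but each inner minimization is still a nonconvex landscape, which is where a GA may struggle.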
Quanshi XIA (Dr.), Centre for Process Systems Engineering, Imperial College, London SW7 2BY