Multi-objective Optimization

Jürgen Branke
Institute AIFB
University of Karlsruhe, Germany
Karlsruhe Institute of Technology

Inspired by Nature

Darwin's principle of natural evolution: survival of the fittest. In populations of individuals (plants, animals), the better an individual is adapted to the environment, the higher its chance for survival and reproduction.

[Figure: an evolving population in its environment]
2 | Jürgen Branke | 12. April 2008
Appetizer

Evolutionary algorithms
Basic algorithm

INITIALIZE population (set of solutions)
EVALUATE individuals according to goal ("fitness")
REPEAT
    SELECT parents
    RECOMBINE parents (CROSSOVER)
    MUTATE offspring
    EVALUATE offspring
    FORM next population
UNTIL termination-condition

Industrial applications

• Warehouse location problem (Locom)
• Process scheduling (Unilever)
• Job shop scheduling (Deere & Company, SAP, Volvo)
• Turbine design (Rolls-Royce, Honda)
• Portfolio optimization (First Quadrant)
• Cleaning team assignment (Die Bahn)
• Chip design (Texas Instruments)
• Robot movement (Honda)
• Nuclear fuel reloading (Siemens)
• Design of telephone networks (US West)
• Games (Creatures)
• Military pilot training (British Air Force)
• Vehicle routing (Pina Petroli)
• Coating of fluorescent lamps (Philips)
• ...
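The basic loop above can be sketched as a minimal generational EA in Python. Everything here is illustrative, not part of the slides: the problem is the toy "one-max" bit-string task, and operators (binary tournament, one-point crossover, bit-flip mutation) and parameter values are common defaults.

```python
import random

def evolve(fitness, n_bits=20, pop_size=30, generations=100,
           p_crossover=0.9, p_mutation=0.05, seed=0):
    """Minimal generational EA following the INITIALIZE / EVALUATE /
    SELECT / RECOMBINE / MUTATE / FORM-next-population loop above."""
    rng = random.Random(seed)
    # INITIALIZE population (set of solutions)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():  # SELECT parents: binary tournament on fitness
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):  # REPEAT ... UNTIL termination-condition
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            # RECOMBINE parents (one-point CROSSOVER)
            if rng.random() < p_crossover:
                cut = rng.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # MUTATE offspring (independent bit flips)
            child = [1 - g if rng.random() < p_mutation else g for g in child]
            offspring.append(child)
        pop = offspring  # FORM next population (plain generational replacement)
    return max(pop, key=fitness)

best = evolve(sum)  # fitness = number of ones ("one-max")
```

Note the design choice: plain generational replacement keeps the sketch short; practical EAs usually add elitism so the best solution found is never lost.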
Advantages / Disadvantages of EAs

+ No restriction w.r.t. fitness function (e.g., does not have to be differentiable)
+ Universal applicability
+ Easy to integrate heuristic knowledge if available
+ Easy to parallelize
+ Easy to use (usually inexpensive to develop)
+ Anytime algorithms (available time is fully utilized)
+ Can deal with multiple objectives
+ User interaction possible
+ Allow for continuous adaptation
+ Can work with stochastic fitness functions
- Computationally expensive
- No guaranteed solution quality
- Parameter tuning necessary

Design decisions

• Representation
• Genetic operators
• Selection mechanism
• Crossover/mutation probability
• Population size
• Stopping criterion
Simple example: Travelling Salesman Problem

• Permutation encoding: 3-1-4-5-7-2-6-8-9
• Mutation: exchange two cities
• Order crossover (OX): select a partial sequence from one parent, fill up in the order of the other parent:

    Parent 1:  1 2 3 4 5 6 7 8 9
    Parent 2:  9 4 2 8 3 6 1 5 7
    Offspring: 9 2 8 4 5 6 7 3 1

Multiple objectives

• It is not always clear which solution is better. [Figure: solutions A, B, C in the (f1, f2) objective space]
• Let f_i, i = 1…d, be the different optimization criteria. Then a solution x is said to dominate a solution y (x ≻ y) if and only if the following condition is fulfilled (assuming minimization):

    x ≻ y  ⟺  f_i(x) ≤ f_i(y) ∀ i ∈ {1, …, d}  and  ∃ j : f_j(x) < f_j(y)

• Among a set of solutions P, the non-dominated set of solutions P' are those that are not dominated by any member of the set P.
• A solution which is not dominated by any other solution in the search space is called Pareto-optimal.
• User preferences are required to select a single solution.
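The TSP variation operators above (exchange mutation and order crossover) can be sketched as follows; the cut points in the usage example are chosen to reproduce the parents shown on the slide, and this is one common OX variant (fill empty positions left to right):

```python
import random

def swap_mutation(tour, rng):
    """Mutation: exchange two cities."""
    i, j = rng.sample(range(len(tour)), 2)
    t = tour[:]
    t[i], t[j] = t[j], t[i]
    return t

def order_crossover(p1, p2, i, j):
    """OX: copy the partial sequence p1[i:j] into the offspring, then
    fill the remaining positions, left to right, with the missing
    cities in the order they appear in the other parent p2."""
    segment = p1[i:j]
    rest = [c for c in p2 if c not in segment]
    return rest[:i] + segment + rest[i:]

child = order_crossover([1, 2, 3, 4, 5, 6, 7, 8, 9],
                        [9, 4, 2, 8, 3, 6, 1, 5, 7], 3, 7)
# reproduces the slide's example: [9, 2, 8, 4, 5, 6, 7, 3, 1]
```

Both operators preserve the permutation property, so every offspring is again a valid tour.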
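The dominance definition above translates directly into code. A minimal sketch, assuming minimization and objective vectors given as tuples:

```python
def dominates(x, y):
    """x dominates y (minimization): x is no worse in every objective
    and strictly better in at least one."""
    return (all(xi <= yi for xi, yi in zip(x, y))
            and any(xi < yi for xi, yi in zip(x, y)))

def non_dominated(P):
    """The non-dominated set P' of a set of objective vectors P."""
    return [x for x in P if not any(dominates(y, x) for y in P)]

front = non_dominated([(1, 5), (2, 2), (4, 1), (3, 3)])
# (3, 3) is dominated by (2, 2); the other three are mutually non-dominated
```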
Menu

Appetizer: Evolutionary algorithms
Main course:
    • Using evolutionary algorithms instead of exact optimizers for MOPs
    • Multi-objective evolutionary algorithms
    • Including preference information in MOEAs
Dessert:
    • Current research
    • Summary

[Diagram] Interactive use of an EA: Multi-Objective Problem → specify preferences → Single-Objective Problem → optimize (evolutionary algorithm) → Solution → adjust preference information → repeat
Specifying preferences

[Figure: example trade-off for a trip to London, with "days" as one objective]
a priori: Multi-Objective Problem → specify preferences → Single-Objective Problem → optimize (evolutionary algorithm) → Solution

Ways to specify preferences [figures in (f1, f2) space]:
    • Linear weighting (weights w1, w2)
    • Constraints
    • Reference point and achievement scalarizing function

a posteriori (most EMO approaches): Multi-Objective Problem → optimize (multi-objective evolutionary algorithm) → Pareto front → specify preferences → selected solution
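The two a priori scalarizations named above (linear weighting, and a reference point with an achievement scalarizing function) can be sketched as follows. This is a minimal version under the assumption of minimization; the small `rho` augmentation term is a standard addition (to avoid weakly Pareto-optimal solutions) that is not shown on the slide.

```python
def weighted_sum(f, w):
    """Linear weighting: scalarize objective vector f with weights w."""
    return sum(wi * fi for wi, fi in zip(w, f))

def achievement(f, ref, w, rho=1e-4):
    """Achievement scalarizing function for a reference point `ref`
    (minimization): small values mean f is close to, or better than,
    the reference point in the weighted worst objective."""
    terms = [wi * (fi - ri) for fi, ri, wi in zip(f, ref, w)]
    return max(terms) + rho * sum(terms)

s = weighted_sum((1.0, 2.0), (0.5, 0.5))
a = achievement((1.0, 2.0), (0.0, 0.0), (1.0, 1.0))
```

Minimizing either scalar with a single-objective EA yields one Pareto-optimal solution per choice of weights or reference point, which is exactly the a priori workflow in the diagram.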
Multiobjective Evolutionary Algorithms (MOEAs): basic structure

• Since EAs work with a population of solutions, they can search for a whole set of (Pareto-optimal) solutions simultaneously.
• The basic loop is unchanged (initialize, evaluate, select, recombine, mutate, form next population); fitness assignment and selection have to be adapted to multiple objectives.
Non-dominated Sorting GA (NSGA-II) [Deb et al. 2002]

• Non-dominated sorting: partition the population into fronts 1, 2, 3, …, k (front 1 = non-dominated set, front 2 = non-dominated after removing front 1, and so on)
• Overall algorithm: merge the old population with the offspring; form the new population front by front, using diversity sorting within the last front that fits; reject the rest
• Other popular approach: Strength Pareto Evolutionary Algorithm (SPEA) by Zitzler
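The non-dominated sorting step above can be sketched as follows. This simple O(k·n²) version trades NSGA-II's faster bookkeeping for clarity, and the crowding-distance diversity sorting is omitted:

```python
def dominates(x, y):
    # minimization: no worse in every objective, strictly better in one
    return (all(a <= b for a, b in zip(x, y))
            and any(a < b for a, b in zip(x, y)))

def non_dominated_sort(points):
    """Partition objective vectors into fronts 1, 2, ..., k: front 1 is
    the non-dominated set, front 2 is non-dominated once front 1 is
    removed, and so on."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

fronts = non_dominated_sort([(1, 4), (4, 1), (2, 2), (3, 3), (4, 4)])
```

In NSGA-II the front index becomes the primary selection criterion, with crowding distance as the tie-breaker inside a front.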
Advantages of finding the complete front

+ Not necessary to specify preferences a priori
+ Allows the DM to choose a solution after having seen the alternatives
+ Interactive search of the Pareto front
+ Offer different alternatives to different customers (e.g., mean-variance portfolio optimization)
+ Reveal common properties among Pareto-optimal solutions (some variables are always the same)
+ Understand the causes for the trade-off
+ Aid in other optimization tasks (constraints, multi-objectivization)
Disadvantages of finding the complete front

• Computational overhead
• Large set of alternatives, difficult for the DM to search
Identifying knees

• Knee: a region of the Pareto front where a small improvement in one objective requires a large sacrifice in another
• Knees can be identified, e.g., via the marginal utility of a solution
Motivation

• Although a user generally cannot specify his/her preferences exactly before alternatives are known, he/she usually has some rough idea as to what solutions are desired:
    "A solution should have at least x in objective f1."
    "f1 of x would be good, f1 of y would be great."
    "My target solution would look something like this."
    "If a solution is worse by one unit in objective f1, it should be at least x units better in objective f2 to be interesting."
    "Objective f1 is somewhat more important than objective f2."

EMO doesn't need user preferences. Does it?

• All EMO approaches attempt to find a representative set of the Pareto-optimal front
• Usually, "representative" means well distributed
Constraints

• Constraint: f1 > x
• Constraints are easy to integrate into EMO:
    – Lexicographic ordering (a feasible solution always dominates an infeasible solution)
    – Penalty
    – Additional objective
    – ...

[Diagram] Spectrum of approaches:
a priori (exact preferences): Multi-Objective Problem → specify preferences → Single-Objective Problem → optimize → Solution
EMO + partial preferences: in between the two extremes
a posteriori (most EMO approaches): optimize first, then specify preferences
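The lexicographic constraint handling listed above (a feasible solution always dominates an infeasible one) can be sketched as a modified dominance check. A minimal version, assuming minimization; the concrete constraint at the end is only an illustration of "f1 > x":

```python
def dominates(x, y):
    # standard Pareto dominance (minimization)
    return (all(a <= b for a, b in zip(x, y))
            and any(a < b for a, b in zip(x, y)))

def constrained_dominates(x, y, feasible):
    """Lexicographic constraint handling: a feasible solution always
    dominates an infeasible one; among equally (in)feasible solutions,
    fall back to ordinary Pareto dominance."""
    fx, fy = feasible(x), feasible(y)
    if fx != fy:
        return fx  # the feasible one of the two dominates
    return dominates(x, y)

# illustrative constraint "f1 must exceed 2" on objective vectors (f1, f2)
feasible = lambda f: f[0] > 2
```

Dropping this comparison into any dominance-based MOEA steers the population toward the feasible region without separate penalty tuning.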
"... target solution ..."

[Figure: target points T1, T2 and candidate solutions in (f1, f2) space]

Guided dominance criterion [Branke et al. 2001]

Transform the objectives

    Ω1(x) = f1(x) + w12 · f2(x)
    Ω2(x) = f2(x) + w21 · f1(x)

and apply Pareto dominance to the transformed objectives:

    x ≻ y  ⟺  Ωi(x) ≤ Ωi(y) ∀ i ∈ {1, 2}  and  ∃ j : Ωj(x) < Ωj(y)

[Figure: enlarged dominance regions for solutions A, B, C]

Result: faster convergence and better coverage of the interesting area of the Pareto-optimal front.
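The guided dominance criterion above can be sketched directly; minimization and the sample points and weights are illustrative:

```python
def guided_dominates(x, y, w12, w21):
    """Pareto dominance on the transformed objectives
    Omega1 = f1 + w12*f2, Omega2 = f2 + w21*f1 (minimization)."""
    ox = (x[0] + w12 * x[1], x[1] + w21 * x[0])
    oy = (y[0] + w12 * y[1], y[1] + w21 * y[0])
    return (all(a <= b for a, b in zip(ox, oy))
            and any(a < b for a, b in zip(ox, oy)))

# (1, 4) does not Pareto-dominate (3, 3), but with trade-off weights
# w12 = w21 = 0.5 its enlarged dominance region covers (3, 3):
covers = guided_dominates((1, 4), (3, 3), 0.5, 0.5)
```

The weights encode the maximally acceptable trade-off, so solutions outside the interesting region become dominated and the search concentrates where the user wants it.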
Marginal utility with preferences
Biased sharing [Branke & Deb 2004]
• User specifies weights and spread parameter
• The crowding distance calculation is modified accordingly

Light Beam Search [Deb & Kumar 2007]
• Specify aspiration and reservation point
• Determine the projection on the Pareto front
• Identify an interesting local area using outranking approaches
Interactive MOEA
Current research
Summary

• Evolutionary algorithms open new possibilities in multi-objective optimization because
    – they are very general problem solvers
    – they work on a population of solutions and can thus search for a whole set of solutions simultaneously
• Different ways to use EAs in MOO:
    1. As single-objective optimizer in classical MOO techniques
    2. To generate an approximation to the whole Pareto front
    3. With partial user preferences, resulting in a partial front or a biased distribution
    4. Interactively guided by the user

References

[Branke et al. 2001] J. Branke, T. Kaußler, H. Schmeck: "Guidance in evolutionary multi-objective optimization". Advances in Engineering Software, 32:499-507
[Branke et al. 2004] J. Branke, K. Deb, H. Dierolf, M. Osswald: "Finding knees in multi-objective optimization". Parallel Problem Solving from Nature, Springer, pp. 722-731
[Branke and Deb 2004] J. Branke and K. Deb: "Integrating user preferences into evolutionary multi-objective optimization". In Y. Jin, editor, Knowledge Incorporation in Evolutionary Computation, Springer, pp. 461-477
[Deb 1999] K. Deb: "Solving goal programming problems using multi-objective genetic algorithms". Congress on Evolutionary Computation, IEEE, pp. 77-84
[Deb et al. 2002] K. Deb, S. Agrawal, A. Pratap, T. Meyarivan: "A fast and elitist multi-objective genetic algorithm: NSGA-II". IEEE Transactions on Evolutionary Computation 6(2):182-197
[Deb and Kumar 2007] K. Deb and A. Kumar: "Light beam search based multi-objective optimization using evolutionary algorithms". Congress on Evolutionary Computation, IEEE, pp. 2125-2132
[Fonseca and Fleming 1993] C. M. Fonseca and P. J. Fleming: "Genetic algorithms for multiobjective optimization: Formulation, discussion, and generalization". International Conference on Genetic Algorithms, Morgan Kaufmann, pp. 416-423
EMO resources
Books:
• K. Deb: “Multi-objective optimization using evolutionary algorithms”. Wiley,
2001
• C. Coello Coello, D. A. Van Veldhuizen and G. B. Lamont: “Evolutionary
algorithms for solving multi-objective problems”. Kluwer, 2002
• J. Branke, K. Deb, K. Miettinen, R. Slowinski: “Multi-objective optimization -
interactive and evolutionary approaches”. Springer, to appear
Websites:
• http://www.lania.mx/~coello