
Evolutionary Algorithms for Multi-objective Optimization

Jürgen Branke
Institute AIFB
University of Karlsruhe, Germany
Karlsruhe Institute of Technology


Inspired by Nature

Darwin's principle of natural evolution: survival of the fittest.
In populations of individuals (plants, animals), the better the individual is adapted to the environment, the higher its chance for survival and reproduction.

(Figure: an evolving population in its environment)

Menu

Appetizer
  • Evolutionary algorithms
Main course
  • Using evolutionary algorithms instead of exact optimizers for MOPs
  • Multi-objective evolutionary algorithms
  • Including preference information in MOEAs
Dessert
  • Current research
  • Summary


Transfer to optimization

A population of individuals in an environment corresponds to a population of solutions to a problem.

(Figure: a population of candidate solutions, shown as permutations such as 13245, 31542, 13542, 15342, 13254)

Natural evolution → Evolutionary algorithms
• individual → potential solution
• environment → problem
• fitness / how well adapted → cost / quality of solution
• survival of the fittest → good solutions are kept
• mutation → small, random perturbations
• crossover → recombination of partial solutions

Basic algorithm

INITIALIZE population (set of solutions)
EVALUATE individuals according to goal ("fitness")
REPEAT
  SELECT parents
  RECOMBINE parents (CROSSOVER)
  MUTATE offspring
  EVALUATE offspring
  FORM next population
UNTIL termination condition

(Figure: a population of solutions with their fitness values before and after one generation)


Industrial applications

• Warehouse location problem (Locom)
• Process scheduling (Unilever)
• Job shop scheduling (Deere & Company, SAP, Volvo)
• Turbine design (Rolls-Royce, Honda)
• Portfolio optimization (First Quadrant)
• Cleaning team assignment (Die Bahn)
• Chip design (Texas Instruments)
• Robot movement (Honda)
• Nuclear fuel reloading (Siemens)
• Design of telephone networks (US West)
• Games (Creatures)
• Military pilot training (British Air Force)
• Vehicle routing (Pina Petroli)
• Coating of fluorescent lamps (Philips)
• ...
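To make the basic loop above concrete, here is a minimal, generic sketch in Python. Everything in it (the toy bit-string problem, population size, rates, tournament selection, one-point crossover) is an illustrative assumption, not taken from the slides.

```python
import random

# Minimal EA sketch (illustrative only): maximize the number of 1s in a bit
# string ("OneMax"). Parameters and operators are arbitrary but typical choices.
GENOME_LEN, POP_SIZE, GENERATIONS = 30, 40, 100
CROSSOVER_PROB, MUTATION_PROB = 0.9, 1.0 / GENOME_LEN

def fitness(ind):                       # EVALUATE: quality of a solution
    return sum(ind)

def tournament(pop, k=2):               # SELECT: binary tournament selection
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):                    # RECOMBINE: one-point crossover
    if random.random() < CROSSOVER_PROB:
        cut = random.randrange(1, GENOME_LEN)
        return a[:cut] + b[cut:]
    return a[:]

def mutate(ind):                        # MUTATE: independent bit flips
    return [1 - g if random.random() < MUTATION_PROB else g for g in ind]

# INITIALIZE population (set of solutions)
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):   # REPEAT ... UNTIL termination condition
    offspring = []
    while len(offspring) < POP_SIZE:
        p1, p2 = tournament(population), tournament(population)
        offspring.append(mutate(crossover(p1, p2)))
    # FORM next population: keep the best of parents + offspring (elitist)
    population = sorted(population + offspring, key=fitness, reverse=True)[:POP_SIZE]

print("best fitness:", fitness(population[0]))
```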


Advantages / Disadvantages

+ No restriction w.r.t. the fitness function (e.g., it does not have to be differentiable)
+ Universal applicability
+ Easy to integrate heuristic knowledge, if available
+ Easy to parallelize
+ Easy to use (usually inexpensive to develop)
+ Anytime algorithms (available time is fully utilized)
+ Can deal with multiple objectives
+ User interaction possible
+ Allow for continuous adaptation
+ Can work with stochastic fitness functions

- Computationally expensive
- No guaranteed solution quality
- Parameter tuning necessary


Major design decisions

• Representation
• Genetic operators
• Selection mechanism
• Crossover/mutation probability
• Population size
• Stopping criterion

Simple example: Travelling Salesman Problem

• Permutation encoding: 3-1-4-5-7-2-6-8-9
• Mutation: exchange two cities
• Order crossover (OX): select a partial sequence from one parent, fill up the remaining positions in the order of the other parent
  Example: parents 1-2-3-4-5-6-7-8-9 and 9-4-2-8-3-6-1-5-7 with copied segment 4-5-6-7 yield offspring 9-2-8-4-5-6-7-3-1


Multiple objectives

• It is not always clear which solution is better.
• Let f_i, i = 1…d, be the different optimization criteria (to be minimized). A solution x is said to dominate a solution y (x ≻ y) if and only if
    f_i(x) ≤ f_i(y) for all i ∈ {1, …, d}, and
    f_j(x) < f_j(y) for at least one j.
• Among a set of solutions P, the non-dominated set of solutions P' are those that are not dominated by any member of the set P.
• A solution which is not dominated by any other solution in the search space is called Pareto-optimal.
• User preferences are required to choose among Pareto-optimal solutions.

(Figure: solutions A, B, C in objective space (f1, f2) illustrating dominance)
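A small Python sketch of the two TSP operators named above; the function names and the fixed cut points are assumptions for illustration, but the example reproduces the crossover result shown on the slide.

```python
import random

def exchange_mutation(tour):
    """Mutation: swap two randomly chosen cities of a tour (a permutation)."""
    a, b = random.sample(range(len(tour)), 2)
    tour = tour[:]
    tour[a], tour[b] = tour[b], tour[a]
    return tour

def order_crossover(parent1, parent2, start, end):
    """OX: copy parent1[start:end]; fill remaining positions in the order of parent2."""
    segment = parent1[start:end]
    remainder = [city for city in parent2 if city not in segment]
    return remainder[:start] + segment + remainder[start:]

# Slide example: parents 1..9 and 9-4-2-8-3-6-1-5-7, copied segment 4-5-6-7
p1 = [1, 2, 3, 4, 5, 6, 7, 8, 9]
p2 = [9, 4, 2, 8, 3, 6, 1, 5, 7]
print(order_crossover(p1, p2, start=3, end=7))   # [9, 2, 8, 4, 5, 6, 7, 3, 1]
```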


Menu

Appetizer
  • Evolutionary algorithms
Main course
  • Using evolutionary algorithms instead of exact optimizers for MOPs
  • Multi-objective evolutionary algorithms
  • Including preference information in MOEAs
Dessert
  • Current research
  • Summary


A priori approach

Multi-objective problem → specify preferences → single-objective problem → optimize (evolutionary algorithm) → solution.
If the solution is not satisfactory, adjust the preference information and optimize again.

Specifying preferences

Difficult!

Example: Tell me which travel plan you prefer!

(Figure: two travel plans compared by days in London vs. days in Paris)


Advantages / Disadvantages of EAs

+ Allow solving problems for which no exact methods exist

– Metaheuristics do not guarantee (Pareto-)optimality
– Solutions generated in subsequent iterations may dominate each other
– Adjusting the preference information may lead to unexpected results
– Computationally expensive


Specifying preferences

A priori: multi-objective problem → specify preferences → single-objective problem → optimize → solution.
Typical ways to specify preferences: linear weighting (weights w1, w2), constraints, or a reference point with an achievement scalarizing function.

A posteriori (most EMO approaches): multi-objective problem → multi-objective optimization → Pareto front → specify preferences → selected solution.


A posteriori — the power of populations

(Figure: snapshots of the population in objective space (f1, f2) as the evolutionary search approaches the Pareto front)
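For the a priori path, the preference formats listed above are typically turned into a scalar objective. The sketch below shows two common forms, a linear weighting and a Wierzbicki-style achievement scalarizing function; the names, weights, and reference values are illustrative assumptions.

```python
# Scalarizations for the a priori approach (objectives assumed to be minimized).

def weighted_sum(f, weights):
    """Linear weighting: collapse an objective vector f into a single value."""
    return sum(w * fi for w, fi in zip(weights, f))

def achievement_scalarizing(f, reference, weights, rho=1e-6):
    """Augmented achievement scalarizing function for a reference point:
    smaller values mean the solution is closer to (or better than) the reference."""
    terms = [w * (fi - ri) for w, fi, ri in zip(weights, f, reference)]
    return max(terms) + rho * sum(terms)

# Example: two objectives, preferences given either as weights or as a reference point
f = (3.0, 5.0)
print(weighted_sum(f, weights=(0.7, 0.3)))                                   # 3.6
print(achievement_scalarizing(f, reference=(2.0, 4.0), weights=(1.0, 1.0)))  # ~1.0
```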
Multiobjective Evolutionary Algorithms (MOEAs)

• Since EAs work with a population of solutions, they can search for all (or a representative subset of the) Pareto-optimal solutions in one run.
• A single EMO run is usually much more effective than multiple runs with different objectives.


Basic structure

Same loop as the single-objective EA:

INITIALIZE population (set of solutions)
EVALUATE individuals according to goal ("fitness")
REPEAT
  SELECT parents
  RECOMBINE parents (CROSSOVER)
  MUTATE offspring
  EVALUATE offspring
  FORM next population
UNTIL termination condition


Demo


What is a good approximation?

• close to the Pareto front
• wide spread
• equally distributed

(Figure: approximation sets in objective space (f1, f2) illustrating these criteria)

Non-dominated Sorting GA (NSGA-II) [Deb et al. 2002]

Based on two ideas:
1. Pareto ranking: non-dominated sorting of the population into fronts, based on Pareto dominance
2. Crowding distance: mechanism to maintain diversity in the population

Another popular approach: Strength Pareto Evolutionary Algorithm (SPEA) by Zitzler.

(Figure: population sorted into Front 1, Front 2, Front 3, …, Front k)


NSGA-II: Overall algorithm

Merge the old population with its offspring and apply non-dominated sorting; form the new population by accepting whole fronts in order of rank, truncate the first front that does not fit completely by diversity (crowding-distance) sorting, and reject the rest.
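A compact Python sketch of the three NSGA-II ingredients just described (dominance-based front sorting, crowding distance, and survivor selection). It assumes all objectives are minimized and uses a simple O(n²) sorting loop rather than the exact bookkeeping of the published algorithm; all names are illustrative.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objs):
    """Partition solution indices into fronts 1, 2, ... (objs: list of objective vectors)."""
    remaining = set(range(len(objs)))
    fronts = []
    while remaining:
        front = sorted(i for i in remaining
                       if not any(dominates(objs[j], objs[i]) for j in remaining if j != i))
        fronts.append(front)
        remaining -= set(front)
    return fronts

def crowding_distance(objs, front):
    """Larger value = solution lies in a less crowded region of its front."""
    dist = {i: 0.0 for i in front}
    for m in range(len(objs[front[0]])):
        ordered = sorted(front, key=lambda i: objs[i][m])
        span = objs[ordered[-1]][m] - objs[ordered[0]][m] or 1.0
        dist[ordered[0]] = dist[ordered[-1]] = float("inf")   # keep boundary solutions
        for k in range(1, len(ordered) - 1):
            dist[ordered[k]] += (objs[ordered[k + 1]][m] - objs[ordered[k - 1]][m]) / span
    return dist

def next_population(objs, size):
    """Survivor selection: accept whole fronts, truncate the last one by crowding distance."""
    selected = []
    for front in non_dominated_sort(objs):
        if len(selected) + len(front) <= size:
            selected += front
        else:
            dist = crowding_distance(objs, front)
            selected += sorted(front, key=lambda i: -dist[i])[:size - len(selected)]
            break
    return selected

# Toy example: five solutions with two objectives (both minimized)
objs = [(1, 5), (2, 3), (3, 2), (4, 4), (5, 1)]
print(non_dominated_sort(objs))       # [[0, 1, 2, 4], [3]]
print(next_population(objs, size=3))  # boundary points plus one well-spread interior point
```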


Pareto ranking + crowding distance

(Figure: solutions in objective space (Objective 1, Objective 2), ranked 1–4 by Pareto rank and crowding distance)


Advantages of finding the complete front

+ Not necessary to specify preferences a priori
+ Allows the DM to choose a solution after having seen the alternatives
+ Interactive search of the Pareto front:
  – optimization happens prior to the interaction, so the interaction itself is very fast
  – only non-dominated solutions are presented to the user
  – direct navigation by the user is possible
  – additional information on the distribution of solutions along the front may be provided to the user (nadir point, ideal point, ...)

Advantages of finding the complete front (continued)

+ Offer different alternatives to different customers (e.g., mean-variance portfolio optimization)
+ Reveal common properties among Pareto-optimal solutions (some variables are always the same)
+ Understand the causes of the trade-off
+ Aid in other optimization tasks (constraints, multi-objectivization)


Understanding trade-offs

(Figure: example of a trade-off along the Pareto front)


Do we really need the whole front?

• Computational overhead
• Large set of alternatives, difficult for the DM to search

Instead:
• identify the “most interesting” regions
• take partial user preferences into account
• bias the distribution
• restrict the distribution

Identifying knees

• Solutions where an improvement in either objective leads to a significant worsening of the other objective are more likely to be preferred → such a point is called a knee.

(Figure: Pareto front with a knee)


Marginal utilities [Branke et al. 2004]

• Assume the user has a linear utility function.
• Evaluate each solution by the expected loss of utility if that solution were not there (its marginal utility).
• If the crowding distance is replaced by marginal utilities, the algorithm focuses on knees — no preference information is necessary.

(Figure: marginal utility of solutions along the Pareto front)


Menu

Appetizer
  • Evolutionary algorithms
Main course
  • Using evolutionary algorithms instead of exact optimizers for MOPs
  • Multi-objective evolutionary algorithms
  • Including preference information in MOEAs
Dessert
  • Current research
  • Summary

Motivation

• Although a user generally cannot specify his/her preferences exactly before the alternatives are known, he/she usually has some rough idea of what solutions are desired:
  – “A solution should have at least x in objective f1.”
  – “f1 of x would be good, f1 of y would be great.”
  – “My target solution would look something like this.”
  – “If a solution is worse by one unit in objective f1, it should be at least x units better in objective f2 to be interesting.”
  – “Objective f1 is somewhat more important than objective f2.”
• Hope: find a larger variety of more interesting solutions more quickly.


EMO doesn’t need user preferences. Does it?

• All EMO approaches attempt to find a representative set of the Pareto-optimal front.
• Usually, “representative” means well distributed.
• But: the resulting distribution depends on the scaling of the objectives.

(Figure: the same front with objectives scaled to 0–1 vs. 0–10, yielding different “well distributed” sets)


Considering partial preferences

A priori: multi-objective problem → specify exact preferences → single-objective problem → optimize → solution.
A posteriori (most EMO approaches): multi-objective problem → optimize → Pareto front → specify preferences → selected solution.
In between (EMO + partial preferences): multi-objective problem → specify partial preferences → optimize → biased Pareto front → select solution.


“... at least x in objective f1.”

• Constraint: f1 > x
• Constraints are easy to integrate into EMO:
  – lexicographic ordering (a feasible solution always dominates an infeasible solution)
  – penalty
  – additional objective
  – ...
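A short sketch of the first option above, the "feasible always dominates infeasible" rule, assuming minimized objectives and a violation() function returning 0 for feasible solutions; the tie-break for two infeasible solutions (less violation wins) is one common convention, not stated on the slide.

```python
def constrained_dominates(a, b, objs, violation):
    """a, b: solutions; objs(s) -> objective vector; violation(s) -> 0 if feasible, else > 0."""
    va, vb = violation(a), violation(b)
    if va == 0 and vb > 0:           # feasible solution always dominates infeasible one
        return True
    if va > 0 and vb == 0:
        return False
    if va > 0 and vb > 0:            # both infeasible: smaller violation wins (assumption)
        return va < vb
    fa, fb = objs(a), objs(b)        # both feasible: ordinary Pareto dominance
    return all(x <= y for x, y in zip(fa, fb)) and any(x < y for x, y in zip(fa, fb))

# Example for the constraint "f1 > x": the violation could be max(0, x - f1(s))
x = 2.0
violation = lambda s: max(0.0, x - s[0])   # here a solution is just its objective vector
objs = lambda s: s
print(constrained_dominates((2.5, 4.0), (1.0, 1.0), objs, violation))  # True: only the first is feasible
```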

“... my target solution would look something like this.”

1. Minimize the distance to the ideal/target solution (single objective)
2. Minimize the maximal distance in any objective (single objective)
3. Goal attainment [Fonseca & Fleming 1993] / goal programming [Deb 1999]:
   do not reward improvement over the ideal solution, i.e. replace f1 by max{0, f1 − f1*}

(Figure: objective space (f1, f2) with target points T1, T2 and solutions A, B illustrating approaches 1–3)
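A tiny sketch of the goal-attainment style transformation mentioned in point 3, assuming minimized objectives; the function name and the numbers are illustrative.

```python
def goal_transform(f, targets):
    """Replace each objective f_i by max(0, f_i - f_i*): no reward beyond the target."""
    return tuple(max(0.0, fi - ti) for fi, ti in zip(f, targets))

print(goal_transform((3.0, 1.0), targets=(2.5, 2.0)))   # (0.5, 0.0)
```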


“... at least x units better in objective f2 ...”

• Specify maximal and minimal trade-offs between the objectives
• Guided MOEA [Branke et al. 2001]: modify the definition of dominance
• Can be achieved by a simple transformation of the objectives
• Not so easy for more than 2 objectives


Guided dominance criterion

  Ω1(x) = f1(x) + w12 · f2(x)
  Ω2(x) = f2(x) + w21 · f1(x)
  x ≻ y  ⟺  Ωi(x) ≤ Ωi(y) for all i ∈ {1, 2}, and Ωj(x) < Ωj(y) for at least one j

(Figure: objective space with solutions A, B, C; the region dominated by each solution is enlarged according to the specified trade-offs)


The effect of guidance

Standard MOEA vs. guided MOEA: faster convergence and better coverage of the interesting area of the Pareto-optimal front.
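A minimal sketch of the guided dominance check above for two minimized objectives; variable names, the weight values, and the example points are illustrative assumptions.

```python
def guided_dominates(fx, fy, w12, w21):
    """True if x (objective vector fx) dominates y (fy) under the guided criterion."""
    ox = (fx[0] + w12 * fx[1], fx[1] + w21 * fx[0])   # transformed objectives Ω1, Ω2
    oy = (fy[0] + w12 * fy[1], fy[1] + w21 * fy[0])
    return all(a <= b for a, b in zip(ox, oy)) and any(a < b for a, b in zip(ox, oy))

# x is slightly worse in f1 but much better in f2: ordinary Pareto dominance does not
# hold, but with the trade-off weights below the guided criterion accepts it.
print(guided_dominates((1.3, 3.0), (1.0, 4.0), w12=0.5, w21=0.5))   # True
```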

Marginal utility with preferences

• Using a non-uniform distribution of utility functions, the marginal-utility mechanism can be biased towards the preferred region of the front.


Biased sharing [Branke & Deb 2004]

• The user specifies weights and a spread parameter
• The crowding-distance calculation is modified accordingly


Light beam search [Deb & Kumar 2007]

• Specify an aspiration and a reservation point
• Determine the projection onto the Pareto front
• Identify an interesting local area using outranking approaches

Interactive MOEA

• Narrow down / refocus the search during the MOEA run
• Explicitly, by
  – adjusting constraints
  – moving the target
  – modifying the max/min trade-offs
  – ...
• Implicitly, by having the user compare solutions
  – learn the user's preferences from these comparisons


Menu

Appetizer
  • Evolutionary algorithms
Main course
  • Using evolutionary algorithms instead of exact optimizers for MOPs
  • Multi-objective evolutionary algorithms
  • Including preference information in MOEAs
Dessert
  • Current research
  • Summary


Current research

• Many-objective problems (difficulty: almost all solutions non-dominated)


• Multiobjectivization (influence diversity and search space structure)
• Noisy objective functions (e.g., stochastic simulation)
• Worst-case multi-objective optimization
• Using metamodels in case of expensive fitness evaluation
• Individual = Set of solutions

Summary

• Evolutionary algorithms open new possibilities in multi-objective optimization because
  – they are very general problem solvers
  – they work on a population of solutions and can thus search for a whole set of solutions simultaneously
• Different ways to use EAs in MOO:
  1. As single-objective optimizer in classical MOO techniques
  2. To generate an approximation of the whole Pareto front
  3. With partial user preferences, resulting in a partial front or a biased distribution
  4. Interactively, guided by the user


References

[Branke et al. 2001] J. Branke, T. Kaußler, H. Schmeck: “Guidance in evolutionary multi-objective optimization”. Advances in Engineering Software, 32:499-507
[Branke et al. 2004] J. Branke, K. Deb, H. Dierolf, M. Osswald: “Finding knees in multi-objective optimization”. Parallel Problem Solving from Nature, Springer, pp. 722-731
[Branke and Deb 2004] J. Branke and K. Deb: “Integrating user preferences into evolutionary multi-objective optimization”. In Y. Jin, editor, Knowledge Incorporation in Evolutionary Computation, Springer, pp. 461-477
[Deb 1999] K. Deb: “Solving goal programming problems using multi-objective genetic algorithms”. Congress on Evolutionary Computation, IEEE, pp. 77-84
[Deb et al. 2002] K. Deb, S. Agrawal, A. Pratap, T. Meyarivan: “A fast and elitist multi-objective genetic algorithm: NSGA-II”. IEEE Transactions on Evolutionary Computation, 6(2):182-197
[Deb and Kumar 2007] K. Deb and A. Kumar: “Light beam search based multi-objective optimization using evolutionary algorithms”. Congress on Evolutionary Computation, IEEE, pp. 2125-2132
[Fonseca and Fleming 1993] C. M. Fonseca and P. J. Fleming: “Genetic algorithms for multiobjective optimization: Formulation, discussion, and generalization”. International Conference on Genetic Algorithms, Morgan Kaufmann, pp. 416-423

EMO resources

Books:
• K. Deb: “Multi-objective optimization using evolutionary algorithms”. Wiley,
2001
• C. Coello Coello, D. A. Van Veldhuizen and G. B. Lamont: “Evolutionary
algorithms for solving multi-objective problems”. Kluwer, 2002
• J. Branke, K. Deb, K. Miettinen, R. Slowinski: “Multi-objective optimization -
interactive and evolutionary approaches”. Springer, to appear

Websites:
• http://www.lania.mx/~coello

