
UNIT 5 ADVANCES IN SIMULATION

Taxonomy (1)

Stochastic optimization methods:
• Monte Carlo methods
• Simulated Annealing
• Tabu Search
• Evolutionary Algorithms:
  – Genetic Algorithms
  – Evolution Strategies
  – Genetic Programming
  – Evolutionary Programming

Contents of the Lectures
• Taxonomy and History;
• Evolutionary Algorithms basics;
• Theoretical Background;
• Outline of the various techniques: plain genetic algorithms, evolutionary programming, evolution strategies, genetic programming;
• Practical implementation issues;
• Evolutionary algorithms and soft computing;
• Selected applications from the biological and medical area;
• Summary and Conclusions.

Taxonomy (2)

Distinctive features of Evolutionary Algorithms:
• operate on an appropriate encoding of solutions;
• population search;
• no regularity conditions requested;
• probabilistic transitions.

Bibliography
• Th. Bäck. Evolutionary Algorithms in Theory and Practice. Oxford University Press, 1996
• L. Davis. The Handbook of Genetic Algorithms. Van Nostrand Reinhold, 1991
• D.B. Fogel. Evolutionary Computation. IEEE Press, 1995
• D.E. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley, 1989
• J. Holland. Adaptation in Natural and Artificial Systems. MIT Press, 1995
• J. Koza. Genetic Programming. MIT Press, 1992
• Z. Michalewicz. Genetic Algorithms + Data Structures = Evolution Programs. Springer-Verlag, 3rd ed., 1996
• H.-P. Schwefel. Evolution and Optimum Seeking. Wiley & Sons, 1995

History (1)

The founders of the field (figure: portraits):
• I. Rechenberg, H.-P. Schwefel, TU Berlin, '60s
• L. Fogel, UC San Diego, '60s
• John H. Holland, University of Michigan, Ann Arbor, '60s
• John Koza, Stanford University, '80s

History (2)
• 1859 Charles Darwin: inheritance, variation, natural selection
• 1957 G. E. P. Box: random mutation & selection for optimization
• 1958 Fraser, Bremermann: computer simulation of evolution
• 1964 Rechenberg, Schwefel: mutation & selection
• 1966 Fogel et al.: evolving automata ("evolutionary programming")
• 1975 Holland: crossover, mutation & selection ("reproductive plan")
• 1975 De Jong: parameter optimization ("genetic algorithm")
• 1989 Goldberg: first textbook
• 1991 Davis: first handbook
• 1993 Koza: evolving LISP programs ("genetic programming")

Object problem and Fitness

A genotype $\gamma$ is decoded by a growth function $M: \Gamma \to S$ into a solution $s = M(\gamma)$.
The object problem is to minimize an objective function $c: S \to \mathbb{R}$, i.e. to find $\min_{s \in S} c(s)$;
the fitness $f$ of a genotype is derived from the objective value of the solution it encodes.

Evolutionary Algorithms Basics
• what an EA is (the Metaphor)
• object problem and fitness
• the Ingredients
• schemata
• implicit parallelism
• the Schema Theorem
• the building blocks hypothesis
• deception

The Ingredients

(figure: the loop from generation t to generation t+1)
• selection
• recombination
• mutation
• reproduction

The Metaphor

EVOLUTION ↔ PROBLEM SOLVING
Environment ↔ Object problem
Individual ↔ Candidate solution
Fitness ↔ Quality

The Evolutionary Cycle

Population -> Selection -> Parents -> Reproduction (Recombination, Mutation) -> Offspring -> Replacement -> Population

Pseudocode

generation = 0;
SeedPopulation(popSize);            // at random or from a file
while (!TerminationCondition())
{
    generation = generation + 1;
    CalculateFitness();             // ... of new genotypes
    Selection();                    // select genotypes that will reproduce
    Crossover(pcross);              // mate pcross of them on average
    Mutation(pmut);                 // mutate all the offspring with Bernoulli
                                    // probability pmut over genes
}

Fitness Proportionate Selection

Probability of $\gamma$ being selected:
$$P(\gamma) = \frac{f(\gamma)}{\sum_{\gamma'} f(\gamma')}$$

Implementation: "Roulette Wheel", a wheel whose sectors have widths proportional to the $P(\gamma)$.
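A minimal runnable sketch of roulette-wheel selection in Python; the function names are illustrative, not from the slides:

import random

def roulette_select(population, fitness):
    """Pick one genotype with probability proportional to its fitness."""
    total = sum(fitness(g) for g in population)
    spin = random.uniform(0.0, total)      # where the "wheel" stops
    cumulative = 0.0
    for g in population:
        cumulative += fitness(g)           # sector width = f(g)
        if cumulative >= spin:
            return g
    return population[-1]                  # guard against rounding error

# Example: MAXONE fitness = number of ones in the bit string
pop = ["0111011011", "1011011101", "1101100010"]
f = lambda g: g.count("1")
print(roulette_select(pop, f))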


A Sample Genetic Algorithm
• The MAXONE problem
• Genotypes are bit strings
• Fitness-proportionate selection
• One-point crossover
• Flip mutation (transcription error)

One Point Crossover

A crossover point is chosen at random; the parents exchange their tails (here the point falls after the fourth bit):

parents          offspring
0001|111010      0001|001100
1011|001100      1011|111010


The MAXONE Problem

Problem instance: a string of l binary cells, e.g.

γ = 1 0 1 1 0 0 1 1 0 1

Fitness: $f(\gamma) = \sum_{i=1}^{l} \gamma_i$

Objective: maximize the number of ones in the string.

Mutation

Each gene is flipped independently with probability pmut (independent Bernoulli transcription errors):

1 0 1 1 0 0 1 1 0 1  ->  1 0 1 1 1 0 1 1 0 0
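A sketch of one-point crossover and Bernoulli flip mutation for MAXONE bit strings; the function names are illustrative:

import random

def one_point_crossover(parent1, parent2):
    """Cut both parents at the same random point and swap tails."""
    point = random.randint(1, len(parent1) - 1)   # crossover point
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def flip_mutation(genotype, pmut):
    """Flip each gene independently with Bernoulli probability pmut."""
    return "".join(bit if random.random() >= pmut else "10"[int(bit)]
                   for bit in genotype)

child1, child2 = one_point_crossover("0001111010", "1011001100")
print(child1, child2)                 # e.g. 0001001100 1011111010
print(flip_mutation("1011001101", 0.1))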

Example: Selection

Population of 10 strings, with fitness f, cumulative fitness Cf and selection probability P = f/56:

0111011011   f=7   Cf=7    P=0.125
1011011101   f=7   Cf=14   P=0.125
1101100010   f=5   Cf=19   P=0.089
0100101100   f=4   Cf=23   P=0.071
1100110011   f=6   Cf=29   P=0.107
1111001000   f=5   Cf=34   P=0.089
0110001010   f=4   Cf=38   P=0.071
1101011011   f=7   Cf=45   P=0.125
0110110000   f=4   Cf=49   P=0.071
0011111101   f=7   Cf=56   P=0.125

Random sequence (spins of the roulette wheel): 43, 1, 19, 35, 15, 22, 24, 38, 44, 2

Implicit Parallelism

In a population of n individuals of length l,
$$2^l \le \#\text{schemata processed} \le n\,2^l,$$
$n^3$ of which are processed usefully (Holland 1989), i.e. are not disrupted by crossover and mutation.

But see: A. Bertoni, M. Dorigo. "Implicit Parallelism in Genetic Algorithms". Artificial Intelligence 61(2), pp. 307–314, 1993.


Example: Recombination & Mutation

(| marks a crossover point; the middle column shows the offspring after crossover, the right column after mutation)

0111011011   ->  0111011011  ->  0111111011   f=8
0111011011   ->  0111011011  ->  0111011011   f=7
110|1100010  ->  1100101100  ->  1100101100   f=5
010|0101100  ->  0101100010  ->  0101100010   f=4
1|100110011  ->  1100110011  ->  1100110011   f=6
1|100110011  ->  1100110011  ->  1000110011   f=5
0110001010   ->  0110001010  ->  0110001010   f=4
1101011011   ->  1101011011  ->  1101011011   f=7
011000|1010  ->  0110001011  ->  0110001011   f=5
110101|1011  ->  1101011010  ->  1101011010   f=6
                                 TOTAL = 57

Fitness of a schema

f(γ): fitness of string γ
q_x(γ): fraction of strings equal to γ in population x
q_x(S): fraction of strings matched by S in population x

$$f_x(S) = \frac{1}{q_x(S)} \sum_{\gamma \in S} q_x(\gamma)\, f(\gamma)$$


Schemata

Don't care symbol: ∗, e.g.
S = ∗ ∗ 1 0 ∗ 1 ∗ ∗ ∗ ∗

order of a schema: o(S) = # fixed positions
defining length: δ(S) = distance between first and last fixed position
(for the S above: o(S) = 3, δ(S) = 6 − 3 = 3)

A schema S matches $2^{l-o(S)}$ strings;
a string of length l is matched by $2^l$ schemata.

The Schema Theorem

Let {X_t}, t = 0, 1, ..., be the populations at times t, and suppose that
$$\frac{f_{X_t}(S) - f(X_t)}{f(X_t)} \ge c$$
with c constant. Then
$$E[q_{X_t}(S) \mid X_0] \ge q_{X_0}(S)\,(1+c)^t \left(1 - p_{\mathrm{cross}}\,\frac{\delta(S)}{l-1} - o(S)\, p_{\mathrm{mut}}\right)^t,$$
i.e. above-average individuals increase exponentially!
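A small sketch, under the conventions above, computing the order, the defining length, and the schema-theorem survival factor; illustrative code, not from the slides:

def order(schema):
    """o(S): number of fixed (non-*) positions."""
    return sum(1 for c in schema if c != "*")

def defining_length(schema):
    """delta(S): distance between first and last fixed position."""
    fixed = [i for i, c in enumerate(schema) if c != "*"]
    return fixed[-1] - fixed[0] if fixed else 0

def survival_factor(schema, p_cross, p_mut):
    """Lower bound 1 - p_cross*delta(S)/(l-1) - o(S)*p_mut on survival."""
    l = len(schema)
    return (1 - p_cross * defining_length(schema) / (l - 1)
              - order(schema) * p_mut)

S = "**10*1****"
print(order(S), defining_length(S))              # 3 3
print(survival_factor(S, p_cross=0.7, p_mut=0.01))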

The Schema Theorem (proof)

$$E[q_{X_t}(S) \mid X_{t-1}] \ge q_{X_{t-1}}(S)\, \frac{f_{X_{t-1}}(S)}{f(X_{t-1})}\, P_{\mathrm{surv}}[S] \ge q_{X_{t-1}}(S)\,(1+c)\, P_{\mathrm{surv}}[S]$$

where

$$P_{\mathrm{surv}}[S] \ge 1 - p_{\mathrm{cross}}\,\frac{\delta(S)}{l-1} - p_{\mathrm{mut}}\, o(S).$$

Remedies to deception
• Prior knowledge of the objective function
• Non-deceptive encoding
• Inversion: semantics of genes not positional
• Underspecification & overspecification: "Messy Genetic Algorithms"


The Building Blocks Hypothesis

"An evolutionary algorithm seeks near-optimal performance through the juxtaposition of short, low-order, high-performance schemata: the building blocks."

Theoretical Background
• Theory of random processes;
• Convergence in probability;
• Open question: rate of convergence.


Deception

Deception occurs when the building block hypothesis does not hold:
for some schema S containing the optimum γ*, f(S) < f(S') for a competing schema S'.

Example:
S1 = 111*******
S2 = ********11
γ* = 1111111111
S  = 111*****11
S' = 000*****00

Events

Sample space Ω; elementary events ω ∈ Ω; events are subsets A, B ⊆ Ω.

Random Variables

A random variable is a function X: Ω → ℝ, mapping each elementary event ω ∈ Ω to a real value X(ω).

Markov Chains

A stochastic process $\{X_t(\omega)\}_{t=0,1,\dots}$ is a Markov chain iff, for all t,
$$P[X_t = x \mid X_0, X_1, \dots, X_{t-1}] = P[X_t = x \mid X_{t-1}].$$

(figure: a three-state chain over {A, B, C} with transition probabilities 0.6, 0.4, 0.3, 0.7, 0.75, 0.25)


Stochastic Processes

A stochastic process is a sequence of random variables
$$X_1, X_2, \dots, X_t, \dots$$
each with its own probability distribution.

Notation: $\{X_t(\omega)\}_{t=0,1,\dots}$

Abstract Evolutionary Algorithm

Stochastic functions on populations in $\Gamma^{(n)}$, each drawing its randomness from ω ∈ Ω: select, mate, cross, mutate, insert.

Transition function (one generation, composing the five operators):
$$X_{t+1}(\omega) = T_t(\omega)\, X_t(\omega)$$

EAs as Random Processes

(Ω, F, P): probability space supplying the "random numbers";
x ∈ Γ^(n): a sample (population) of size n;
{X_t(ω)}, t = 0, 1, ...: the trajectory of the process, with transition functions T_t(ω).

Convergence to Optimum

Theorem: if {X_t(ω)}, t = 0, 1, ..., is monotone and homogeneous, x_0 is given, and some optimal population y ∈ Γ^(n)_O is in reach(x_0), then
$$\lim_{t \to \infty} P[X_t \in \Gamma^{(n)}_O \mid X_0 = x_0] = 1.$$

Theorem: if select and mutate are generous, the neighborhood structure is connective, and the transition functions T_t(ω), t = 0, 1, ..., are i.i.d., evolutionary and elitist, then
$$\lim_{t \to \infty} P[X_t \in \Gamma^{(n)}_O] = 1.$$

Outline of various techniques
• Plain Genetic Algorithms
• Evolutionary Programming
• Evolution Strategies
• Genetic Programming

Evolutionary Programming: Individuals

Finite-state automaton: (Q, q0, A, δ, w)
• set of states Q;
• initial state q0;
• set of accepting states A;
• alphabet of symbols Σ;
• transition function δ: Q × Σ → Q;
• output mapping function w: Q × Σ → Σ.

Example (entries are "next state / output"):

input \ state    q0        q1        q2
a                q0 / a    q2 / b    q1 / b
b                q1 / c    q1 / a    q0 / c
c                q2 / b    q0 / c    q2 / a


Plain Genetic Algorithms
• Individuals are bit strings
• Mutation as transcription error
• Recombination is crossover
• Fitness proportionate selection

Evolutionary Programming: Fitness

The automaton is fed a sequence of symbols, e.g. a b c a b c a b ...; at each step it must predict the next symbol of the sequence. If the prediction is correct, f(γ) = f(γ) + 1.


Evolutionary Programming
• Individuals are finite-state automata
• Used to solve prediction tasks
• State-transition table modified by uniform random mutation
• No recombination
• Fitness depends on the number of correct predictions
• Truncation selection

Evolutionary Programming: Selection

Variant of stochastic q-tournament selection: each individual γ is compared with q randomly chosen opponents γ_1, ..., γ_q, and
score(γ) = #{ i | f(γ) > f(γ_i) }.
Order the individuals by decreasing score and select the first half (truncation selection).

Evolution Strategies
• Individuals are n-dimensional vectors of reals
• Fitness is the objective function
• Mutation distribution can be part of the genotype (standard deviations and covariances evolve with solutions)
• Multi-parent recombination
• Deterministic selection (truncation selection)

Genetic Programming
• Program induction
• LISP (historically), math expressions, machine language, ...
• Applications:
  – optimal control;
  – planning;
  – sequence induction;
  – symbolic regression;
  – modelling and forecasting;
  – symbolic integration and differentiation;
  – inverse problems.


Evolution Strategies: Individuals

An individual consists of:
• the candidate solution x ∈ ℝⁿ;
• standard deviations σ;
• rotation angles α, with
$$\alpha_{ij} = \frac{1}{2} \arctan\left(\frac{2\,\mathrm{cov}(i,j)}{\sigma_i^2 - \sigma_j^2}\right)$$

Genetic Programming: The Individuals

Individuals are a subset of LISP S-expressions, represented as trees; e.g.

(OR (AND (NOT d0) (NOT d1)) (AND d0 d1))

is the tree with root OR and subtrees (AND (NOT d0) (NOT d1)) and (AND d0 d1).


Evolution Strategies: Mutation

$$\sigma_i' = \sigma_i \exp(\tau' N(0,1) + \tau N_i(0,1)) \quad \text{(self-adaptation)}$$
$$\alpha_j' = \alpha_j + \beta N_j(0,1)$$
$$x' = x + N(0, \sigma', \alpha')$$

Hans-Paul Schwefel suggests:
$$\tau \propto \left(\sqrt{2\sqrt{n}}\right)^{-1}, \qquad \tau' \propto \left(\sqrt{2n}\right)^{-1}, \qquad \beta \approx 0.0873 \;(\approx 5°)$$

Genetic Programming: Initialization

(figure: random trees are grown by repeatedly choosing functions, e.g. OR, AND, NOT, for the internal nodes and terminals, e.g. d0, d1, for the leaves, until every branch ends in a terminal)

Genetic Programming: Crossover

(figure: two parent trees exchange randomly chosen subtrees, producing two offspring trees)

Sample Application: Myoelectric Prosthesis Control
• Control of an upper arm prosthesis
• Genetic Programming application
• Recognize thumb flexion, extension and abduction patterns
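A compact sketch of GP subtree crossover on S-expressions represented as nested Python lists; the representation and names are assumptions for illustration:

import random

def subtrees(tree, path=()):
    """Enumerate (path, subtree) pairs; a path is a tuple of child indices."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace_at(tree, path, new):
    """Return a copy of tree with the subtree at path replaced by new."""
    if not path:
        return new
    copy = list(tree)
    copy[path[0]] = replace_at(copy[path[0]], path[1:], new)
    return copy

def gp_crossover(parent1, parent2):
    """Swap randomly chosen subtrees of the two parents."""
    p1, s1 = random.choice(list(subtrees(parent1)))
    p2, s2 = random.choice(list(subtrees(parent2)))
    return replace_at(parent1, p1, s2), replace_at(parent2, p2, s1)

t1 = ["OR", ["AND", ["NOT", "d0"], ["NOT", "d1"]], ["AND", "d0", "d1"]]
t2 = ["AND", ["OR", "d0", "d1"], ["NOT", "d0"]]
print(gp_crossover(t1, t2))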


Genetic Programming: Other Operators
• Mutation: replace a terminal with a subtree
• Permutation: change the order of arguments to a function
• Editing: simplify S-expressions, e.g. (AND X X) -> X
• Encapsulation: define a new function using a subtree
• Decimation: throw away most of the population

Prosthesis Control: The Context

The human arm is fitted with 2 electrodes; 150 ms of myoelectric signals are measured, producing raw myo-measurements; these are preprocessed into myo-signal features, from which the intended human motion is deduced; the intention is mapped into a goal robot motion and converted into actuator commands for the robot arm.


Genetic Programming: Fitness

Fitness cases: j = 1, ..., Ne

"Raw" fitness: $r(\gamma) = \sum_{j=1}^{N_e} \left| \mathrm{Output}(\gamma, j) - C(j) \right|$

"Standardized" fitness: $s(\gamma) \in [0, +\infty)$

"Adjusted" fitness: $a(\gamma) = \frac{1}{1 + s(\gamma)}$

Prosthesis Control: Terminals

Features for electrodes 1, 2:
• Mean absolute value (MAV)
• Mean absolute value slope (MAVS)
• Number of zero crossings (ZC)
• Number of slope sign changes (SC)
• Waveform length (LEN)
• Average value (AVG)
• Up slope (UP)
• Down slope (DOWN)
• MAV1/MAV2, MAV2/MAV1
• Constants: 0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 0.01, -1.0
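A small sketch of how the three measures compose, assuming the standardized fitness equals the raw error here:

def raw_fitness(outputs, targets):
    """Sum of absolute errors over the fitness cases."""
    return sum(abs(o - c) for o, c in zip(outputs, targets))

def adjusted_fitness(standardized):
    """Map [0, +inf) errors into (0, 1]; 1.0 means a perfect program."""
    return 1.0 / (1.0 + standardized)

outputs, targets = [0.9, 0.1, 0.8], [1.0, 0.0, 1.0]
s = raw_fitness(outputs, targets)    # here s(gamma) = r(gamma)
print(s, adjusted_fitness(s))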

Prosthesis Control: Function Set

Addition             x + y
Subtraction          x - y
Multiplication       x * y
Division             x / y (protected for y = 0)
Square root          sqrt(|x|)
Sine                 sin x
Cosine               cos x
Tangent              tan x (protected for x = π/2)
Natural logarithm    ln |x| (protected for x = 0)
Common logarithm     log |x| (protected for x = 0)
Exponential          exp x
Power function       x^y
Reciprocal           1/x (protected for x = 0)
Absolute value       |x|
Integer or truncate  int(x)
Sign                 sign(x)

Classifier Systems (Michigan approach)

An individual is a single rule: IF X = A AND Y = B THEN Z = D; the population is a set of such IF ... THEN ... rules.

Fitness is updated incrementally as cases n are presented:
$$f_{n+1}(\gamma) = \begin{cases} (1-e)\, f_n(\gamma) + r & \text{if } \gamma(n) = \mathrm{class}(n) \\ (1-p)\, f_n(\gamma) & \text{if } \gamma(n) \ne \mathrm{class}(n) \end{cases}$$
where $r = (1 + g N_\gamma) R$ and $N_\gamma$ is the number of attributes in the antecedent part.


Prosthesis Control: Fitness

(figure: distributions of the program output for the three motion types, with undefined regions between them; 22 signals per motion)

$$r(\gamma) = 100 \cdot \frac{\sigma_{\mathrm{abduction}} + \sigma_{\mathrm{extension}} + \sigma_{\mathrm{flexion}}}{\min\left(\left|\mu_{\mathrm{abduction}} - \mu_{\mathrm{extension}}\right|, \left|\mu_{\mathrm{abduction}} - \mu_{\mathrm{flexion}}\right|, \left|\mu_{\mathrm{extension}} - \mu_{\mathrm{flexion}}\right|\right)}$$

i.e. the spread of the classes divided by the separation between them: lower is better.

Practical Implementation Issues
• from elegant academia to not-so-elegant but robust and efficient real-world applications: evolution programs
• handling constraints
• hybridization
• parallel and distributed algorithms


Myoelectric Prosthesis Control Reference
• J. J. Fernandez, K. A. Farry, J. B. Cheatham. "Waveform Recognition Using Genetic Programming: The Myoelectric Signal Recognition Problem". GP '96, The MIT Press, pp. 63–71.

Evolution Programs

Slogan: Genetic Algorithms + Data Structures = Evolution Programs

Key ideas:
• use a data structure as close as possible to the object problem
• write appropriate genetic operators
• ensure that all genotypes correspond to feasible solutions
• ensure that genetic operators preserve feasibility

Encodings: "Pie" Problems

Each variable is encoded as a slice of a pie, e.g. with genes in the range 0–255:

W = 128, X = 32, Y = 90, Z = 20

The value of a variable is its share of the whole pie:
X = 32 / (128 + 32 + 90 + 20) = 32/270 = 11.85%

Penalty Functions

Infeasible solutions are admitted but penalized:
$$f(\gamma) = \mathrm{Eval}(c(z)) + P(z), \qquad P(z) = w(t) \sum_i w_i \,\phi_i(z)$$
where each $\phi_i$ measures the violation of constraint i, the $w_i$ are constraint weights, and w(t) may vary with time.
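A sketch of a penalized fitness under these definitions; the constraint set and weights are invented for illustration:

def penalized_fitness(z, objective, violations, weights, w_t=1.0):
    """objective(z) plus a weighted sum of constraint violations."""
    penalty = w_t * sum(w * phi(z) for w, phi in zip(weights, violations))
    return objective(z) + penalty

# Example: minimize x^2 subject to x >= 3 (violation measured linearly).
objective = lambda x: x * x
violations = [lambda x: max(0.0, 3.0 - x)]   # phi_1: how far below 3
print(penalized_fitness(2.0, objective, violations, weights=[10.0]))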


Encodings: "Permutation" Problems

For the tour 1-2-4-3-8-5-9-6-7:

Path representation: (1, 2, 4, 3, 8, 5, 9, 6, 7)

Adjacency representation: (2, 4, 8, 3, 9, 7, 1, 5, 6)
(position i holds the city visited after city i)

Ordinal representation: (1, 1, 2, 1, 4, 1, 3, 1, 1)
(position i holds the index of the next city in a shrinking reference list)

Sorting representation: (-23, -6, 2, 0, 19, 32, 85, 11, 25)
(the permutation is obtained by sorting the keys)

Matrix representation: a binary precedence matrix, entry (i, j) = 1 iff city i precedes city j:
0 1 1 1 1 1 1 1 1
0 0 1 1 1 1 1 1 1
0 0 0 0 1 1 1 1 1
0 0 1 0 1 1 1 1 1
0 0 0 0 0 1 1 0 1
0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 1 1 1 0 1
0 0 0 0 0 1 1 0 0

Decoders / Repair Algorithms

(figure: recombination and mutation act on the genotype space; a decoder or repair step maps every genotype back to a feasible solution in S)
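A sketch of decoding the ordinal and sorting representations back to a path; helper names are illustrative:

def decode_ordinal(ordinal, n):
    """Consume a shrinking reference list [1..n] as directed by the genes."""
    reference = list(range(1, n + 1))
    return [reference.pop(i - 1) for i in ordinal]   # genes are 1-based

def decode_sorting(keys):
    """The tour visits positions in ascending order of their keys."""
    return [city for _, city in sorted(zip(keys, range(1, len(keys) + 1)))]

print(decode_ordinal([1, 1, 2, 1, 4, 1, 3, 1, 1], 9))
# [1, 2, 4, 3, 8, 5, 9, 6, 7]
print(decode_sorting([-23, -6, 2, 0, 19, 32, 85, 11, 25]))
# [1, 2, 4, 3, 8, 5, 9, 6, 7]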

Handling Constraints
• Penalty functions
  Risk of spending most of the time evaluating infeasible solutions, sticking with the first feasible solution found, or finding an infeasible solution that scores better than feasible solutions
• Decoders or repair algorithms
  Computationally intensive, tailored to the particular application
• Appropriate data structures and specialized genetic operators
  All possible genotypes encode feasible solutions

Hybridization

1) Seed the population with solutions provided by some heuristics (heuristics -> initial population)

2) Use local optimization algorithms as genetic operators (Lamarckian mutation)

3) Encode the parameters of a heuristics (genotype -> heuristics -> candidate solution)

Sample Application: Unit Commitment
• Multiobjective optimization problem: cost vs. emission
• Many linear and non-linear constraints
• Traditionally approached with dynamic programming
• Hybrid evolutionary/knowledge-based approach
• A flexible decision support system for planners
• Solution time increases linearly with the problem size

Unit Commitment: Constraints
• Power balance requirement
• Spinning reserve requirement
• Unit maximum and minimum output limits
• Unit minimum up and down times
• Power rate limits
• Unit initial conditions
• Unit status restrictions
• Plant crew constraints
• ...


The Unit Commitment Problem

Cost objective:
$$z_\$ = \sum_{i=1}^{n} \left[ C_i(P_i) + SU_i + SD_i + HS_i \right], \qquad C_i(P_i) = a_i + b_i P_i + c_i P_i^2$$

Emission objective:
$$z_E = \sum_{i=1}^{n} E_i(P_i), \qquad E_i(P_i) = \sum_{j=1}^{m} E_{ij}(P_i), \qquad E_{ij}(P_i) = \alpha_{ij} + \beta_{ij} P_i + \gamma_{ij} P_i^2$$

Unit Commitment: Encoding

A genotype is a table of capacity factors, one per unit per hour, interpreted through a fuzzy knowledge base; e.g.:

Time    Unit 1   Unit 2   Unit 3   Unit 4
00:00   1.0      0.8      0.2      0.15
01:00   0.9      1.0      0.2      1.0
02:00   0.0      1.0      0.8      0.2
03:00   0.0      0.5      1.0      0.8
04:00   1.0      0.65     0.8      1.0
05:00   0.8      0.8      0.25     1.0
06:00   1.0      0.4      0.2      1.0
07:00   0.0      0.0      1.0      0.75
08:00   0.5      1.0      1.0      0.8
09:00   1.0      0.5      0.0      0.0


Predicted Load Curve

(figure: predicted load and spinning reserve over a 24-hour horizon)

Unit Commitment: Solution

The decoded solution assigns each unit, hour by hour, one of the states: down, hot-stand-by, starting, shutting down, up.

(figure: state table for Units 1–4 over hours 00:00–09:00)

Unit Commitment: Selection

Competitive selection on the cost/emission plane; e.g. candidate schedules costing $507,762 and $516,511, with emissions of 213,489 lb and 60,080 lb respectively.

Terminology
• Panmictic: any individual may mate with any other
• Apomictic: reproduction without mating


Unit Commitment References
• D. Srinivasan, A. Tettamanzi. "An Integrated Framework for Devising Optimum Generation Schedules". Proceedings of the 1995 IEEE International Conference on Evolutionary Computing (ICEC '95), vol. 1, pp. 1–4.
• D. Srinivasan, A. Tettamanzi. "A Heuristic-Guided Evolutionary Approach to Multiobjective Generation Scheduling". IEE Proceedings Part C - Generation, Transmission, and Distribution, 143(6):553–559, November 1996.
• D. Srinivasan, A. Tettamanzi. "An Evolutionary Algorithm for Evaluation of Emission Compliance Options in View of the Clean Air Act Amendments". IEEE Transactions on Power Systems, 12(1):336–341, February 1997.

Island Model

(figure: several subpopulations evolve independently and periodically exchange migrants)


Parallel Evolutionary Algorithms
• The standard evolutionary algorithm is stated as a sequential algorithm...
• ... but evolutionary algorithms are intrinsically parallel
• Several models (a master-slave sketch follows below):
  – cellular evolutionary algorithm
  – fine-grained parallel evolutionary algorithm (grid)
  – coarse-grained parallel evolutionary algorithm (islands)
  – sequential evolutionary algorithm with parallel fitness evaluation (master-slave)

Selected Applications in Biology and Medical Science
• the protein folding problem, i.e. determining the tertiary structure of proteins using evolutionary algorithms;
• quantitative structure-activity relationship modeling for drug design;
• applications to medical diagnosis, like electroencephalogram (EEG) classification and automatic feature detection in medical imagery (PET, CAT, NMR, X-RAY, etc.);
• applications to radiotherapy treatment planning;
• applications to myoelectric prosthesis control.
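A minimal sketch of the master-slave model, evaluating fitness in parallel with Python's standard library; the fitness function is a placeholder:

from concurrent.futures import ProcessPoolExecutor

def fitness(genotype):
    """Placeholder objective: MAXONE on bit strings."""
    return genotype.count("1")

def evaluate_population(population):
    """Master distributes genotypes to slave processes, collects fitnesses."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(fitness, population))

if __name__ == "__main__":
    pop = ["0111011011", "1011011101", "1101100010"]
    print(evaluate_population(pop))    # [7, 7, 5]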

Sample Application: Protein Folding
• Finding the 3-D geometry of a protein to understand its functionality
• Very difficult: one of the "grand challenge problems"
• Standard GA approach
• Simplified protein model

Protein Folding: Representation

Relative move encoding: a sequence of moves, one per peptide, e.g.
UP DOWN FORWARD LEFT UP RIGHT ...

Preference order encoding: each gene is a preference ranking over the possible moves, e.g.
(LEFT, UP, DOWN, FORWARD, RIGHT, ...), (RIGHT, DOWN, FORWARD, LEFT, UP, ...), ...
(the highest-ranked feasible move is taken at each step)


Protein Folding: The Problem
• Much of a protein's function may be derived from its conformation (3-D geometry or "tertiary" structure).
• Magnetic resonance & X-ray crystallography are currently used to view the conformation of a protein:
  – expensive in terms of equipment, computation and time;
  – require isolation, purification and crystallization of the protein.
• Prediction of the final folded conformation of a protein chain has been shown to be NP-hard.
• Current approaches:
  – molecular dynamics modelling (brute force simulation);
  – statistical prediction;
  – hill-climbing search techniques (simulated annealing).

Protein Folding: Fitness

Decode: plot the course encoded by the genotype. Test each occupied cell:
• any collisions: -2;
• no collisions AND a hydrophobe in an adjacent cell: +1.

Notes:
• for each contact: +2;
• adjacent hydrophobes on the chain are not discounted in the scoring;
• multiple collisions (>1 peptide in one cell) still score -2;
• hydrophobe collisions imply an additional penalty (no contacts are scored).
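A sketch of decoding a relative-move genotype on a 2-D version of the lattice and scoring collisions and hydrophobic contacts; the constants follow the scheme above, everything else is an illustrative simplification:

MOVES = {"UP": (0, 1), "DOWN": (0, -1), "LEFT": (-1, 0), "RIGHT": (1, 0)}

def fold_fitness(moves, hydrophobic):
    """Plot the course, then score collisions and hydrophobic contacts."""
    pos, course = (0, 0), [(0, 0)]
    for m in moves:                         # decode the genotype
        dx, dy = MOVES[m]
        pos = (pos[0] + dx, pos[1] + dy)
        course.append(pos)
    score = 0
    for p in set(course):                   # test each occupied cell
        residents = [i for i, q in enumerate(course) if q == p]
        if len(residents) > 1:              # collision in this cell: -2
            score -= 2
            continue                        # no contacts are scored
        i = residents[0]
        if hydrophobic[i]:
            for j, q in enumerate(course):
                if hydrophobic[j] and abs(i - j) > 1 \
                   and abs(p[0] - q[0]) + abs(p[1] - q[1]) == 1:
                    score += 1              # counted from both ends: +2 per contact
    return score

print(fold_fitness(["UP", "RIGHT", "DOWN"], [True, False, True, True]))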


Protein Folding: Simplified Model
• 90° lattice (6 degrees of freedom at each point);
• Peptides occupy intersections;
• No side chains;
• Hydrophobic or hydrophilic (no relative strengths) amino acids;
• Only hydrophobic/hydrophilic forces considered;
• Adjacency considered only in cardinal directions;
• Cross-chain hydrophobic contacts are the basis for evaluation.

Protein Folding: Experiments
• Preference ordering encoding;
• Two-point crossover with a rate of 95%;
• Bit mutation with a rate of 0.1%;
• Population size: 1000 individuals;
• Crowding and incest reduction;
• Test sequences with known minimum configuration.

Protein Folding References
• S. Schulze-Kremer. "Genetic Algorithms for Protein Tertiary Structure Prediction". PPSN 2, North-Holland, 1992.
• R. Unger, J. Moult. "A Genetic Algorithm for 3D Protein Folding Simulations". ICGA-5, 1993, pp. 581–588.
• A. L. Patton, W. F. Punch III, E. D. Goodman. "A Standard GA Approach to Native Protein Conformation Prediction". ICGA 6, 1995, pp. 574–581.

Drug Design: Fitness

The hydropathy profiles of the target a and of the complement b are smoothed with a moving average of half-width s over the hydropathies h of the residues:
$$a_k = \frac{1}{2s+1}\sum_{i=k-s}^{k+s} h_i^{(a)}, \qquad b_k = \frac{1}{2s+1}\sum_{i=k-s}^{k+s} h_i^{(b)}, \qquad k = s, \dots, n-s$$
where n is the number of residues in the target. Complementarity is scored by
$$Q = \frac{\sum_i (a_i + b_i)^2}{n - 2s}$$
(lower Q = better complementarity).
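A sketch of the complementarity score under the definitions above; the hydropathy values are invented for illustration:

def smooth(h, s):
    """Moving average of half-width s over a hydropathy profile."""
    return [sum(h[i - s:i + s + 1]) / (2 * s + 1)
            for i in range(s, len(h) - s)]

def complementarity(h_target, h_complement, s):
    """Q = sum((a_i + b_i)^2) / (n - 2s); lower is better."""
    a, b = smooth(h_target, s), smooth(h_complement, s)
    n = len(h_target)
    return sum((ai + bi) ** 2 for ai, bi in zip(a, b)) / (n - 2 * s)

target = [1.9, -3.5, -3.5, -0.8, 0.0, -3.5, 4.2, -1.3, 2.8]
complement = [-h for h in target]       # a perfectly complementary profile
print(complementarity(target, complement, s=1))   # 0.0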


Sample Application: Drug Design

Purpose: given a chemical specification (activity), design a tertiary structure complying with it.
Requirement: a quantitative structure-activity relationship model.
Example: design ligands that can bind targets specifically and selectively; complementary peptides.

Drug Design: Results

(figure: hydropathic value vs. amino acid position for the target sequence FANSGNVYFGIIAL, the GA-designed complement and the Fassina complement)


Drug Design: Implementation

An individual is a peptide, i.e. a string of amino acids (residues), e.g. N L H A F G L F K A; each residue carries:
• name
• hydropathic value

Operators (implicit selection):
• Hill-climbing Crossover
• Hill-climbing Mutation
• Reordering (no selection)

Drug Design References
• T. S. Lim. A Genetic Algorithms Approach for Drug Design. MS Dissertation, Oxford University, Computing Laboratory, 1995.
• A. L. Parrill. "Evolutionary and Genetic Methods in Drug Design". Drug Discovery Today, 1(12), Dec 1996, pp. 514–521.

Sample Application: Medical Diagnosis
• Classifier Systems application
• Learning from examples
• Lymphography
  – 148 examples, 18 attributes, 4 diagnoses
  – estimated performance of a human expert: 85% correct
• Prognosis of breast cancer recurrence
  – 288 examples, 10 attributes, 2 diagnoses
  – performance of human expert unknown
• Location of primary tumor
  – 339 examples, 17 attributes, 22 diagnoses
  – estimated performance of a human expert: 42% correct

Sample Application: Radiotherapy Treatment Planning
• X-rays or electron beams for cancer treatment
• Conformal therapy: uniform dose over cancerous regions, spare healthy tissues
• Constrained optimization, inverse problem
• From dose specification to beam intensities
• Constraints:
  – beam intensities are positive
  – rate of intensity change is limited
• Conflicting objectives: Pareto-optimal set of solutions


Medical Diagnosis Results
• Performance indistinguishable from humans
• Performance for breast cancer: about 75%
• In primary tumor, patients with identical symptoms have different diagnoses
• Symbolic (= comprehensible) diagnosis rules

RTP: The Problem

(figure: a beam crossing the plane of interest through the patient's head, showing the treatment area, an organ at risk and other healthy tissues)

• TA: dose delivered to the treatment area (requirement: TA = 100%)
• OAR: dose delivered to organs at risk (requirement: OAR < 20%)
• OHT: dose delivered to other healthy tissues (requirement: OHT < 30%)

Medical Diagnosis References
• P. Bonelli, A. Parodi. "An Efficient Classifier System and its Experimental Comparison with two Representative Learning Methods on three Medical Domains". ICGA 4, pp. 288–295.
• T. A. Sedbrook, H. Wright, R. Wright. "Application of a Genetic Classifier for Patient Triage". ICGA 4, pp. 334–338.
• H. F. Gray, R. J. Maxwell, I. Martínez-Pérez, C. Arús, S. Cerdán. "Genetic Programming Classification of Magnetic Resonance Data". GP '96, p. 424.
• A. Pazos, J. Dorado, A. Santos. "Detection of Patterns in Radiographs using ANN Designed and Trained with GA". GP '96, p. 432.

RTP: Fitness and Solutions

(figure: solutions plotted by |TA - TA*| and |OAR - OAR*|; the non-dominated solutions A, B, C form the Pareto optimal set)

Radiotherapy Treatment Planning References
• O. C. L. Haas, K. J. Burnham, M. H. Fisher, J. A. Mills. "Genetic Algorithm Applied to Radiotherapy Treatment Planning". ICANNGA '95, pp. 432–435.

Artificial Neural Networks

(figure: a biological neuron with dendrites, axon and synapses)

The artificial neuron computes its output y from inputs x1, ..., xn and weights w1, ..., wn:
$$y = \varphi\left(\sum_{i=1}^{n} w_i x_i\right)$$

Evolutionary Algorithms and Soft Computing

(figure: the soft computing triangle of EAs, FL and NNs; EAs provide optimization to, and receive monitoring and fitness from, the other two)

Fuzzy Logic

(figure: a membership function taking values between 0 and 1)


Soft Computing
• Tolerant of imprecision, uncertainty, and partial truth
• Adaptive
• Methodologies:
  – Evolutionary Algorithms
  – Neural Networks
  – Bayesian and Probabilistic Networks
  – Fuzzy Logic
  – Rough Sets
• Bio-inspired: Natural Computing
• A Scientific Discipline?
• Methodologies co-operate, do not compete (synergy)

(figure: EAs provide optimization to FL and NNs and receive fitness from them)

Neural Network Design and Optimization
• Evolving weights for a network of predefined structure
• Evolving network structure
  – direct encoding
  – indirect encoding
• Evolving learning rules
• Input data selection

Evolving weights and structure of a feed-forward network: direct encoding

A (3, 2, 3) architecture is encoded directly by its weight matrices:
W0 (3×3), W1 (3×2), W2 (2×3), W3 (3×1)


Evolving the weights (predefined structure)

(figure: a small network whose weights are listed directly in the genotype: (0.2, -0.3, 0.6, -0.5, 0.4, 0.7))

Evolving weights and structure, direct encoding (continued)
• Mutation operator:
  – neuron removal: delete a column in W(i-1) and a row in W(i);
  – neuron duplication: copy a column in W(i-1) and a row in W(i);
  – removal of a layer with a single neuron: replace the pair by W(i-1)^T W(i);
  – layer duplication: insert an identity matrix;
• Simplification operator:
  – remove neurons whose row in W(i) has norm < ε;
• Crossover operator:
  – choose two crossover points in the parents;
  – swap the tails;
  – join the pieces with a new random weight matrix.


Evolving the Structure: Direct Encoding

The connection graph is encoded by its adjacency matrix, e.g. for a network with inputs 1, 2, 3, hidden units 4, 5 and output 6:

     1 2 3 4 5 6
1    0 0 0 1 1 0
2    0 0 0 1 0 1
3    0 0 0 0 1 0
4    0 0 0 0 0 1
5    0 0 0 0 0 1
6    0 0 0 0 0 0

Structure Evolution: Indirect Encoding

Graph-generating grammar: the adjacency matrix is grown by rewriting, starting from S, through 2×2 matrices of non-terminals (A, B, C, D) that expand into 2×2 blocks of terminals (a, b, c, d, e), each of which stands for a fixed 2×2 binary block; e.g.

(S: A, B, C, D || A: c, d, a, c || B: a, a, a, e || C: a, a, a, a || ...)
Fuzzy Rule-Based Systems

(figure: the soft computing triangle again; EAs provide optimization to and monitor the fuzzy rule-based system)


Evolutionary Algorithms and Fuzzy Logic
• Fuzzy Government: a fuzzy rulebase dynamically controls the Evolutionary Algorithm
• The Evolutionary Algorithm optimizes a Fuzzy System, with fuzzy fitness, fuzzy input and fuzzy operators

Representation of a Fuzzy Rulebase

Totally overlapping membership functions, described by membership function genes (e.g. 10011000 11011010, encoding the positions c1, c2, c3, c4) and rule genes with values in (0 ... Ndom), up to Nmax = Ndom × Noutput rules.

genotype = input MFs (FA1 FA2 FA3) + output MFs (FA1 FA2 FA3) + rules (R1 R2 ... Rmax)


Fuzzy System Design and Optimization
• Representation
• Genetic operators
• Selection mechanism
• Example: learning fuzzy classifiers

A richer representation
• Input membership functions
• Output membership functions
• Rules, e.g.:
  IF x is A AND v is B THEN F is C
  IF a is D THEN F is E
  IF w is G AND x is H THEN F is C
  IF true THEN F is K

Initialization
• Input variables: no. of domains = 1 + exponential(3)
• Output variables: no. of domains = 2 + exponential(3)
• Rules: no. of rules = 2 + exponential(6)
• For each input variable, flip a coin to decide whether to include it

Example: "Learning fuzzy classifiers"

(figure: a trapezoidal membership function C over [min, max], with support from a to d and core from b to c)

IF _ is _ AND _ is _ AND _ is _ AND _ is _ THEN _ is _


Recombination

Crossover exchanges rules between rulebases; a rule takes with it all the referred domains with their membership functions:

parent 1:                             parent 2:
IF x is A AND v is B THEN F is C      something else
IF a is D THEN F is E                 something else
something else                        IF w is G AND x is H THEN F is C
something else                        IF true THEN F is K

offspring:
IF x is A AND v is B THEN F is C
IF a is D THEN F is E
IF w is G AND x is H THEN F is C
IF true THEN F is K

Controlling the Evolutionary Process
• Motivation:
  – EAs easy to implement
  – little specific knowledge required
  – long computing time
• Features:
  – complex dynamics
  – non-binary conditions
  – "intuitive" knowledge available


Mutation
• {add, remove, change} a domain of an {input, output} variable;
• {duplicate, remove} a rule;
• change a rule: {add, remove, change} a clause in the {antecedent, consequent};
• input MF perturbation: shift the points a, b, c, d of a membership function.

Knowledge Acquisition

(figure: a loop in which the algorithm produces statistics and visualization, from which knowledge is extracted and fed back into the algorithm)

Fuzzifying Evolutionary Algorithms
• Fuzzy fitness (objective function)
• Fuzzy encoding
• Fuzzy operators
  – recombination
  – mutation
• Population Statistics

(figure: integration between EAs, FL and NNs)


Fuzzy Fitness
• Faster calculation
• Less precision
• Specific selection

Neuro-Fuzzy Systems
• Fuzzy Neural Networks
  – fuzzy neurons (OR, AND, OR/AND)
  – learning algorithms (backpropagation-style)
  – NEFPROX
  – ANFIS
• Co-operative Neuro-Fuzzy Systems
  – Adaptive FAMs: differential competitive learning
  – Self-Organizing Feature Maps
  – Fuzzy ART and Fuzzy ARTMAP


Fuzzy Government

A fuzzy rulebase for the dynamic control of an evolutionary algorithm, e.g.:

If D(Xt) is LOW then pmut is HIGH
If f(Xt) is LOW and D(Xt) is HIGH then Emerg is NO
...

(figure: loop Population -> Statistics -> fuzzy rulebase -> Parameters -> Population)

Fuzzy Neural Networks

(figure: a two-layer fuzzy network; inputs x1, ..., xn feed AND neurons through weights w11, ..., wmn; the AND outputs feed a single OR neuron through weights v1, ..., vm, producing output y)
FAM Systems

(figure: the input x is fuzzified, fed in parallel to the fuzzy associative memory rules (A1 -> B1), (A2 -> B2), ..., (Ak -> Bk), whose outputs are summed and defuzzified into y)


(figure: the complete soft computing picture; EAs, FL and NNs exchange optimization, monitoring, fitness and integration)

Reference: A. Tettamanzi, M. Tomassini. Soft Computing. Springer-Verlag, 2001.
