Tabu Search
Evolutionary Algorithms
History (2)

Object problem and Fitness

The evolutionary metaphor:

  Environment  <->  Object problem
  Individual   <->  Candidate solution
  Fitness      <->  Quality

[Diagram: the evolutionary cycle - a Population of individuals is evaluated for Fitness; selected parents undergo Recombination and Mutation to produce Offspring, which re-enter the population through Replacement]
Pseudocode

generation = 0;
SeedPopulation(popSize);        // at random or from a file
while (!TerminationCondition())
{
    generation = generation + 1;
    CalculateFitness();         // ... of new genotypes
    Selection();                // select genotypes that will reproduce
    Crossover(pcross);          // mate pcross of them on average
    Mutation(pmut);             // mutate all the offspring with Bernoulli
                                // probability pmut over genes
}

Fitness Proportionate Selection

Probability of individual i being selected:  P(i) = f(i) / sum_j f(j)

Implementation: "Roulette Wheel" - each individual gets a sector of the wheel proportional to its fitness f(i).
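Fitness-proportionate ("roulette wheel") selection can be sketched in Python as follows; function and variable names are my own, not from the slides:

```python
import random

def roulette_wheel_select(population, fitness):
    """Fitness-proportionate selection: individual i is chosen
    with probability f(i) / sum_j f(j)."""
    total = sum(fitness)
    # Spin the wheel: a uniform draw on [0, total)
    spin = random.uniform(0.0, total)
    cumulative = 0.0
    for individual, f in zip(population, fitness):
        cumulative += f
        if spin < cumulative:
            return individual
    return population[-1]  # guard against floating-point round-off
```

Over many spins, selection counts are roughly proportional to fitness, which is the behavior the wheel metaphor describes.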
[Figure: one-point crossover - both parents of length l are cut at a random crossover point and their tails are swapped]

[Figure: mutation - each gene of the offspring is flipped independently with probability pmut, e.g. 1 0 1 1 0 0 1 1 0 1 becoming 1 0 1 1 1 0 1 1 0 0; the mutants are then evaluated for Fitness f(i)]
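The two operators above, one-point crossover and bitwise mutation, can be sketched as follows (a minimal version for list-of-bits genotypes; names are mine):

```python
import random

def one_point_crossover(p1, p2):
    """Cut both parents at the same random crossover point and swap tails."""
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(genotype, pmut):
    """Flip each gene independently with Bernoulli probability pmut."""
    return [1 - g if random.random() < pmut else g for g in genotype]
```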
Example: Selection

Implicit Parallelism
A schema S fixes some of the positions in the genotype, e.g. S = 1 0 1 * ... *

order of a schema: o(S) = number of fixed positions
defining length: delta(S) = distance between first and last fixed position

Let f_{X_t}(S) be the average fitness of the instances of S in the population X_t and f(X_t) the average fitness of the whole population; suppose that f_{X_t}(S) / f(X_t) = 1 + c, with c constant.
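The two schema measures have direct implementations; a small sketch, representing a schema as a string over {'0', '1', '*'}:

```python
def schema_order(schema):
    """o(S): the number of fixed (non-'*') positions."""
    return sum(1 for s in schema if s != '*')

def defining_length(schema):
    """delta(S): distance between the first and last fixed position."""
    fixed = [i for i, s in enumerate(schema) if s != '*']
    return fixed[-1] - fixed[0] if fixed else 0
```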
The Schema Theorem (proof)

E[ q_{X_t}(S) | X_{t-1} ] = q_{X_{t-1}}(S) * ( f_{X_{t-1}}(S) / f(X_{t-1}) ) * P_surv[S]
                         >= q_{X_{t-1}}(S) * (1 + c) * P_surv[S]

Remedies to deception

• Non-deceptive encoding
• Inversion
Deception

i.e. when the building block hypothesis does not hold.

Events

Sample space
Random Variables

A random variable is a function X : Omega -> R.

Markov Chains

P[ X_t = x | X_0, X_1, ..., X_{t-1} ] = P[ X_t = x | X_{t-1} ]

[Diagram: a three-state chain on states A, B, C, with transition probabilities 0.4, 0.6, 0.7, 0.3, 0.75, 0.25 on the arcs; X(omega) denotes a realization]
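A Markov chain like the one in the figure can be simulated by sampling each step from the current state's outgoing distribution; the transition probabilities below are illustrative assumptions, not necessarily the exact arcs of the slide's figure:

```python
import random

def step(state, P):
    """Sample the next state. P[s] maps each state to its outgoing
    transition probabilities; the Markov property means the distribution
    of X_t depends only on X_{t-1}."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

# Illustrative three-state chain over A, B, C (assumed numbers):
P = {"A": {"A": 0.4, "B": 0.6},
     "B": {"B": 0.3, "C": 0.7},
     "C": {"A": 0.25, "C": 0.75}}
```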
A stochastic process is a sequence of r.v.'s X_t(omega), t = 0, 1, 2, ..., on a probability space (Omega, F, P); a trajectory is determined by the "random numbers" drawn.

Stochastic functions: select, mutate.

Theorem: if select and mutate are generous, the neighborhood structure is connective, and the transition functions T_t(omega), t = 0, 1, ... of the evolutionary process are i.i.d. and elitist, then

    lim_{t -> infinity} P[ X_t in O(n) ] = 1.
Outline of various techniques

• Plain Genetic Algorithms
• Evolutionary Programming
• Evolution Strategies
• Genetic Programming

Evolutionary Programming: Individuals

Finite-state automaton: (Q, q0, A, delta, w)
• set of states Q;
• initial state q0;
• set of accepting states A;
• alphabet of symbols Sigma;
• transition function delta: Q x Sigma -> Q;
• output mapping function w: Q x Sigma -> Sigma.

Transition/output table (entries are next state / output symbol):

input    q0       q1       q2
  a     q0 / a   q2 / b   q1 / b
  b     q1 / c   q1 / a   q0 / c
  c     q2 / b   q0 / c   q2 / a

[Diagram: the same automaton drawn as a state graph over q0, q1, q2, with edges labeled input/output, e.g. a/a, b/c, c/b]
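Assuming the transition table reads as next-state / output-symbol entries, running such an automaton on an input string is a few lines of Python (a sketch, not part of the slides):

```python
# delta[(state, symbol)] = (next_state, output), transcribed from the table
delta = {
    ("q0", "a"): ("q0", "a"), ("q1", "a"): ("q2", "b"), ("q2", "a"): ("q1", "b"),
    ("q0", "b"): ("q1", "c"), ("q1", "b"): ("q1", "a"), ("q2", "b"): ("q0", "c"),
    ("q0", "c"): ("q2", "b"), ("q1", "c"): ("q0", "c"), ("q2", "c"): ("q2", "a"),
}

def run_fsa(delta, q0, inputs):
    """Run the automaton from q0, returning the emitted output string."""
    state, out = q0, []
    for symbol in inputs:
        state, o = delta[(state, symbol)]
        out.append(o)
    return "".join(out)
```

In Evolutionary Programming the outputs are compared against the next input symbol to score the automaton's predictions.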
[Flowchart: fitness evaluation in Evolutionary Programming - the automaton predicts the next symbol of the input sequence; if the prediction matches (prediction = b?), the fitness counter is incremented: f() = f() + 1]
Evolution Strategies

Genetic Programming
Evolution Strategies: self-adaptive mutation parameters

• standard deviations sigma_i
• rotation angles, derived from the covariances:

    alpha_ij = (1/2) * arctan( 2 cov(i, j) / (sigma_i^2 - sigma_j^2) )

Hans-Paul Schwefel suggests:

    tau = 1 / sqrt(2 sqrt(n)),   tau' = 1 / sqrt(2n),   beta = 0.0873 (about 5 degrees)

Genetic Programming

[Figure: program trees over the function set {AND, OR, NOT} and terminals d0, d1, e.g. (OR (AND (NOT d0) (NOT d1)) (AND d0 d1))]
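The uncorrelated part of this self-adaptation scheme can be sketched as follows, using Schwefel's suggested learning rates; rotation angles are omitted for brevity, and all names are mine:

```python
import math
import random

def es_mutate(x, sigma):
    """Self-adaptive ES mutation: perturb the strategy parameters
    (standard deviations) log-normally, then perturb the object
    variables with the new step sizes."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2.0 * n)        # global learning rate
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))   # coordinate-wise rate
    global_step = tau_prime * random.gauss(0, 1)
    new_sigma = [s * math.exp(global_step + tau * random.gauss(0, 1))
                 for s in sigma]
    new_x = [xi + si * random.gauss(0, 1) for xi, si in zip(x, new_sigma)]
    return new_x, new_sigma
```

The log-normal update keeps every standard deviation strictly positive, which is why it is preferred over additive noise on sigma.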
Sample Application: Myoelectric Prosthesis Control

Genetic Programming: Crossover

[Figure: subtree crossover - two parent trees built from OR nodes and terminals d0, d1 exchange randomly chosen subtrees to produce two offspring]
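Subtree crossover can be sketched on trees represented as nested tuples (operator, child, ...); this is a minimal illustration, not the slides' implementation, and real GP systems usually also bound tree depth:

```python
import random

def nodes(tree, path=()):
    """Enumerate (path, subtree) pairs; a tree is a terminal string
    or a tuple (op, child, ...)."""
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from nodes(child, path + (i,))

def replace(tree, path, sub):
    """Return a copy of tree with the subtree at path replaced by sub."""
    if not path:
        return sub
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], sub),) + tree[i + 1:]

def subtree_crossover(t1, t2):
    """Swap a randomly chosen subtree of each parent."""
    p1, s1 = random.choice(list(nodes(t1)))
    p2, s2 = random.choice(list(nodes(t2)))
    return replace(t1, p1, s2), replace(t2, p2, s1)
```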
[Figure: the controller processes myoelectric signals to deduce the user's intentions]
Prosthesis Control: Function Set

Addition           x + y
Subtraction        x - y
Multiplication     x * y
Division           x / y       (protected for y = 0)
Square root        sqrt(|x|)
Sine               sin x
Cosine             cos x
Tangent            tan x       (protected for x = pi/2)
Natural logarithm  ln |x|      (protected for x = 0)
Common logarithm   log |x|     (protected for x = 0)

Classifier Systems (Michigan approach)

individual: IF X = A AND Y = B THEN Z = D

The population is a set of rules:

IF ... THEN ...
IF ... THEN ...
IF ... THEN ...

Fitness is updated incrementally:

    f_{n+1}( ) = (1 - e) f_n( ) + e r(n) class(n)
    f_{n+1}( ) = (1 - p) f_n( ) + p (n) class(n)

where r = (1 - g N) R.
Fitness rewards separation between the classes:

    r( ) = 100 * min( |abduction - extension|, |abduction - flexion|, |extension - flexion| )
Encodings: "Pie" Problems

Four genes W, X, Y, Z, each an integer in 0-255, are interpreted as shares of a pie: with W = 128, X = 32, Y = 90, Z = 20 (sum 270), gene X encodes the share X = 32/270 = 11.85%.

Penalty Functions

Constraints are handled by subtracting a penalty from the decoded objective:

    f( ) = Eval(c(z)) - P(z),    P(z) = w(t) * sum_i w_i * Phi_i(z)
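The penalty scheme above can be sketched generically; `violations` plays the role of the constraint-violation measures Phi_i, and all names here are my own:

```python
def penalized_fitness(eval_fn, violations, weights, w_t, z):
    """f(z) = Eval(c(z)) - P(z), with P(z) = w(t) * sum_i w_i * Phi_i(z).
    Each Phi_i measures how badly constraint i is violated; the
    time-dependent factor w(t) can tighten the penalty as the run proceeds."""
    P = w_t * sum(wi * phi(z) for wi, phi in zip(weights, violations))
    return eval_fn(z) - P
```

A feasible solution (all Phi_i = 0) keeps its raw objective value; infeasible ones are discounted in proportion to how much they violate the constraints.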
Example of an order-based genotype (a permutation): 1-2-4-3-8-5-9-6-7
• Penalty functions
  Risk of spending most of the time evaluating infeasible solutions, sticking with the first feasible solution found, or finding an infeasible solution that scores better than feasible solutions.

• Decoders or repair algorithms
  Computationally intensive, tailored to the particular application.

• Appropriate data structures and specialized genetic operators
  All possible genotypes encode feasible solutions.

Hybridization with heuristics:

1) Seed the population with solutions provided by some heuristics
   (heuristics -> initial population)

2) Use local optimization algorithms as genetic operators
   (Lamarckian mutation)

3) Encode parameters of a heuristics
   (heuristics + genotype -> candidate solution)
Sample Application: Unit Commitment

Unit Commitment: Constraints
[Figure: daily demand profile - power demand over 24 hours, time axis labeled with AM/PM clock times (e.g. 07:00, 08:00, 09:00)]
Unit Commitment: Selection

[Figure: competitive selection between candidate schedules, compared on cost ($) and emission, e.g. $507,762 vs $516,511, and 213,489 GBP vs 60,080 GBP]

Terminology

• Panmictic
• Apomictic
Sample Application: Protein Folding

• Finding the 3-D geometry of a protein to understand its functionality
• Very difficult: one of the "grand challenge problems"
• Standard GA approach
• Simplified protein model

Protein Folding: Representation

relative move encoding: UP DOWN FORWARD LEFT UP RIGHT ...
• Much of a protein's function may be derived from its conformation (3-D geometry or "tertiary" structure).
• Magnetic resonance and X-ray crystallography are currently used to view the conformation of a protein:
  – expensive in terms of equipment, computation and time;
  – require isolation, purification and crystallization of the protein.
• Prediction of the final folded conformation of a protein chain has been shown to be NP-hard.
• Current approaches:
  – molecular dynamics modelling (brute-force simulation);
  – statistical prediction;
  – hill-climbing search techniques (simulated annealing).

Fitness: decode by plotting the course encoded by the genotype, then test each occupied cell:
• any collisions: -2;
• no collisions AND a hydrophobe in an adjacent cell: +1.

Notes:
• for each contact: +2;
• adjacent hydrophobes are not discounted in the scoring;
• multiple collisions (>1 peptides in one cell): -2;
• hydrophobe collisions imply an additional penalty (no contacts are scored).
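The decode-and-score idea can be illustrated with a deliberately simplified 2-D sketch: absolute moves instead of the slides' relative 3-D encoding, and a reduced scoring rule (-2 per collision cell, +1 per hydrophobe with a hydrophobe in an adjacent cell). All names and simplifications are mine:

```python
# 2-D lattice sketch; residues are 'H' (hydrophobe) or 'P' (polar).
MOVES = {"UP": (0, 1), "DOWN": (0, -1), "LEFT": (-1, 0), "RIGHT": (1, 0)}

def fold_fitness(sequence, moves):
    """Plot the course encoded by the move genotype, then score each cell:
    -2 if it holds more than one residue (a collision), +1 if it holds a
    collision-free hydrophobe with a hydrophobe in an adjacent cell."""
    positions = [(0, 0)]
    for m in moves:
        dx, dy = MOVES[m]
        x, y = positions[-1]
        positions.append((x + dx, y + dy))
    cells = {}
    for p, res in zip(positions, sequence):
        cells.setdefault(p, []).append(res)
    score = 0
    for (x, y), residues in cells.items():
        if len(residues) > 1:          # collision: >1 residues in one cell
            score -= 2
        elif "H" in residues:
            # reward a hydrophobe-hydrophobe contact with a neighbor cell
            for dx, dy in MOVES.values():
                if "H" in cells.get((x + dx, y + dy), []):
                    score += 1
                    break
    return score
```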
Protein Folding References

• S. Schulze-Kremer. "Genetic Algorithms for Protein Tertiary Structure Prediction". PPSN 2, North-Holland, 1992.
• R. Unger and J. Moult. "A Genetic Algorithm for 3D Protein Folding Simulations". ICGA-5, 1993, pp. 581–588.
• Arnold L. Patton, W. F. Punch III and E. D. Goodman. "A Standard GA Approach to Native Protein Conformation Prediction". ICGA 6, 1995, pp. 574–581.

Drug Design: Fitness

a_k and b_k are moving averages of the hydropathy of the residues of the target (h) and of the complement (g):

    a_k = sum_{i=k-s}^{k+s} h_i,    b_k = sum_{i=k-s}^{k+s} g_i,    k = s, ..., n - s

where n is the number of residues in the target. Complementarity is scored by

    Q = sum_i (a_i - b_i)^2 / (n - 2s)        (lower Q = better complementarity)
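The Q score is a few lines of Python; note that the sign inside the square (difference of the windowed hydropathies) is my reading of the garbled slide, and the windowed sums follow the formula rather than dividing by the window width:

```python
def window_sum(v, k, s):
    """Sum of v over the window [k - s, k + s] (the 'moving average' term)."""
    return sum(v[k - s:k + s + 1])

def complementarity_q(h, g, s):
    """Q = sum_k (a_k - b_k)^2 / (n - 2s) over k = s, ..., n - s,
    where a_k, b_k are windowed hydropathies of target h and complement g.
    Lower Q = better complementarity."""
    n = len(h)
    return sum((window_sum(h, k, s) - window_sum(g, k, s)) ** 2
               for k in range(s, n - s)) / (n - 2 * s)
```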
[Plot: hydropathy of the peptide's residues against amino-acid index]
Operators:
• Hill-climbing Crossover (implicit selection)
• Hill-climbing Mutation
• Reordering (no selection)
Sample Application: Medical Diagnosis

• Classifier Systems application
• Learning by examples
• Lymphography
  – 148 examples, 18 attributes, 4 diagnoses
  – estimated performance of a human expert: 85% correct
• Prognosis of breast cancer recurrence
  – 288 examples, 10 attributes, 2 diagnoses
  – performance of human expert unknown
• Location of primary tumor
  – 339 examples, 17 attributes, 22 diagnoses
  – estimated performance of a human expert: 42% correct

Sample Application: Radiotherapy Treatment Planning

• X-rays or electron beams for cancer treatment
• Conformal therapy: uniform dose over cancerous regions, spare healthy tissues
• Constrained optimization, inverse problem
• From dose specification to beam intensities
• Constraints:
  – beam intensities are positive
  – rate of intensity change is limited
• Conflicting objectives: Pareto-optimal set of solutions
• Pierre Bonelli and Alexandre Parodi. "An Efficient Classifier System and its Experimental Comparison with two Representative Learning Methods".

Objective term: |OAR - OAR*|
Radiotherapy Treatment Planning References

• O. C. L. Haas, K. J. Burnham, M. H. Fisher, J. A. Mills. "Genetic Algorithm Applied to Radiotherapy Treatment Planning". ICANNGA '95, pp. 432–435.

Artificial Neural Networks

[Diagram: a biological neuron (dendrites, axon, synapse) beside the artificial model - inputs x1, ..., xn weighted by w1, ..., wn and combined into the output y]
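The artificial neuron in the diagram computes a weighted sum of its inputs passed through an activation function; a minimal sketch with a sigmoid activation (the choice of sigmoid and the bias term are my assumptions):

```python
import math

def neuron(x, w, bias=0.0):
    """y = sigma(sum_i w_i * x_i + bias): the weighted inputs x1..xn
    through weights w1..wn, squashed by a sigmoid activation."""
    activation = sum(wi * xi for wi, xi in zip(w, x)) + bias
    return 1.0 / (1.0 + math.exp(-activation))
```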
Soft Computing

– Evolutionary Algorithms
– Neural Networks
– Bayesian and Probabilistic Networks
– Fuzzy Logic
– Rough Sets

• Bio-inspired: Natural Computing
• A Scientific Discipline?
• Methodologies co-operate, do not compete (synergy)

[Diagram: the Soft Computing (SC) triangle of EAs, FL and NNs, with interactions labeled optimization, monitoring and fitness]
Neural Network Design and Optimization

• Evolving weights for a network of predefined structure
• Evolving network structure
  – direct encoding
  – indirect encoding
• Evolving learning rules
• Input data selection

Evolution of weights and feed-forward structure

Direct encoding: the architecture (3, 2, 3) and its weight matrices W0, W1, W2, W3 are written directly into the genotype.

Indirect encoding: a grammar generates the structure, e.g.

    (S: A, B, C, D || A: c, d, a, c || B: a, a, a, e || C: a, a, a, a || ... )
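Direct encoding of a feed-forward network can be sketched as a flat genotype of weights, decoded into one matrix per layer; bias terms are omitted and all names are mine:

```python
def genotype_length(layers=(3, 2, 3)):
    """Number of weights needed for the given layer sizes (no biases)."""
    return sum(a * b for a, b in zip(layers, layers[1:]))

def decode(genotype, layers=(3, 2, 3)):
    """Direct encoding: the genotype is the flat list of weights
    W0, W1, ... of a feed-forward net with the given layer sizes.
    Each matrix has one row per output unit of its layer."""
    weights, k = [], 0
    for n_in, n_out in zip(layers, layers[1:]):
        W = [[genotype[k + i * n_in + j] for j in range(n_in)]
             for i in range(n_out)]
        weights.append(W)
        k += n_in * n_out
    return weights
```

An EA can then mutate the flat genotype with, e.g., Gaussian noise, and the decoder rebuilds the network for fitness evaluation.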
Fuzzy Rule-Based Systems

[Diagram: the SC triangle again - EAs contribute optimization and monitoring to FL and NNs within Soft Computing]
Fuzzy Government

[Figure: encoding a fuzzy system as a genotype - (1) totally overlapping membership functions, (2) fuzzy operators, (3) rules; the Fuzzy System genotype concatenates input MFs (FA1 FA2 FA3), output MFs (FA1 FA2 FA3) and rules (R1 R2 ... Rmax)]
Fuzzy System Design and Optimization

• Representation
• Genetic operators
• Selection mechanism
• Example: Learning fuzzy classifiers

A richer representation

Input membership functions, output MFs, and a variable-length rule base:

IF x is A AND v is B THEN F is C
IF a is D THEN F is E
IF w is G AND x is H THEN F is C
IF true THEN F is K
Example: "Learning fuzzy classifiers"

• Motivation:
  – EAs easy to implement
  – little specific knowledge required
  – long computing time
• Features:
  – complex dynamics
  – non-binary conditions
  – "intuitive" knowledge available

Initialization

Input variables: no. of domains = 1 + exponential(3)
Output variables: no. of domains = 2 + exponential(3)

[Figure: a variable's domains drawn as membership functions between min and max, with breakpoints a, b, c, d]

Crossover: a rule takes with it all the referred domains with their MFs.

IF x is A AND v is B THEN F is C
IF a is D THEN F is E
IF w is G AND x is H THEN F is C
IF true THEN F is K
[Figure: KNOWLEDGE - membership functions with breakpoints a, b, c, d]
Fuzzifying Evolutionary Algorithms

[Diagram: Population Statistics feeding a fuzzy/neural module - inputs x_n through weights w_mn into AND units v_m]
FAM Systems

A Fuzzy Associative Memory stores a bank of rules (A1, B1), (A2, B2), ..., (Ak, Bk); the crisp input x is fuzzified, the rules fire in parallel, and the aggregated output is defuzzified into y.
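One common way to realize the fuzz -> rules -> defuzz pipeline is min-max inference with centroid defuzzification; the sketch below assumes that scheme and an output universe of [0, 10] sampled at 0.1, neither of which is specified by the slide:

```python
def fam_infer(x, rules):
    """Min-max FAM sketch: each rule is a pair (A, B) of membership
    functions. Fire each antecedent A on the crisp input x, clip the
    consequent B at that firing strength, aggregate by max, and
    defuzzify by centroid over a sampled universe."""
    universe = [i / 10.0 for i in range(0, 101)]  # assumed output range [0, 10]
    aggregated = [max(min(A(x), B(y)) for A, B in rules) for y in universe]
    numerator = sum(y * m for y, m in zip(universe, aggregated))
    denominator = sum(aggregated)
    return numerator / denominator if denominator else 0.0
```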
[Diagram: closing slide - the SC triangle of EAs, FL and NNs with their interactions (optimization, monitoring, fitness) and their integration]