
Introductory Workshop on

Evolutionary Computing
Part I: Introduction to
Evolutionary Algorithms
Dr. Daniel Tauritz
Director, Natural Computation Laboratory
Associate Professor, Department of Computer Science
Research Investigator, Intelligent Systems Center
Collaborator, Energy Research & Development Center

Real-world optimization problems are
typically characterized by huge, ill-behaved
solution spaces
Infeasible to exhaustively search
Defy traditional (gradient-based) optimization
algorithms because they are non-linear, non-differentiable, non-continuous, or non-convex

Real-World Example
Electric Power Transmission Systems
Supply is not keeping up with demand
Expansion hampered by:
Social, environmental, and economic concerns
Transmission system is stressed
Already carrying more than intended
Dramatic increase in incidence reports

The Grid

The Grid: Failure

The Grid: Redistribution

The Grid: A Cascade

The Grid: Unsatisfiable

Failure Analysis
Failure spreads relatively quickly
Too quickly for conventional control
Cascade may be avoidable
Utilize unused capacities (flow redistribution)
Unsatisfiable condition may be avoidable
Better power flow control to reduce severity

Possible Solution
Strategically place a number of power
flow control devices
Flexible AC Transmission System
(FACTS) devices are a promising type
of high-speed power-electronics power
flow control devices
Unified Power Flow Controller (UPFC)

FACTS Interaction Laboratory



[Figure: HIL (hardware-in-the-loop) line]

The placement optimization problem

UPFCs are extremely expensive, so only a limited
number can be placed
Placement is a combinatorial problem
Given 1000 high-voltage lines and 10 UPFCs, there are C(1000, 10) possible placements (about 2.6 × 10^23)
If each placement is evaluated in 1 minute, then it will take about 5 × 10^15 centuries to solve using exhaustive search
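These figures are easy to verify; a quick sketch using Python's `math.comb` and a 365.25-day year:

```python
import math

# Number of ways to place 10 UPFCs on 1000 candidate lines: C(1000, 10)
placements = math.comb(1000, 10)

# At one evaluation per minute, convert total minutes to centuries
minutes_per_century = 60 * 24 * 365.25 * 100
centuries = placements / minutes_per_century

print(f"{placements:.2e} placements")   # about 2.6e23
print(f"{centuries:.2e} centuries")     # about 5e15
```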

The placement solution space

Placements of individual UPFC devices are not independent tasks
There are complex non-linear
interactions between UPFC devices
The placement solution space is ill-behaved, so traditional optimization algorithms are not usable

Evolutionary Computing
The field of Evolutionary Computing (EC)
studies the theory and application of
Evolutionary Algorithms (EAs)
EAs can be described as a class of
stochastic, population-based optimization
algorithms inspired by natural evolution,
genetics, and population dynamics

Very high-level EA schematic




Intuitive view of why EAs work

Trial-and-error (aka generate-and-test)
Graduated solution quality creates
virtual gradient
Stochastic local search of the solution space


[Schematic: problem solving via an EA: a problem-specific, black-box fitness evaluation drives the loop until the termination criteria are met]

(Darwinian) Evolution
The environment contains populations of
individuals of the same species which are
reproductively compatible
Natural selection
Random variation
Survival of the fittest
Inheritance of traits

(Mendelian) Genetics
Genotypes vs. phenotypes
Pleiotropy: one gene affects multiple
phenotypic traits
Polygeny: one phenotypic trait is
affected by multiple genes
Chromosomes (haploid vs. diploid)
Loci and alleles

Nature versus the digital realm


Problem (solution space)

Fitness function

Genotype: functional unit of inheritance
Individual: functional unit of selection
Population: functional unit of evolution

Solution Representation
Structural types: linear, tree, FSM, etc.
Data types: bit strings, integers,
permutations, reals, etc.
EA genotype encodes solution
representation and attributes
EA phenotype expresses the EA
genotype in the current environment
Encoding & Decoding

Fitness Function
Determines an individual's fitness and hence its selection chances
Transforms objective function to linearly
ordered set with higher fitness values
corresponding to higher quality solutions
(i.e., solutions which better satisfy the
objective function)
Knapsack Problem Example
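As a concrete illustration, a minimal 0/1 knapsack fitness sketch (the item values, weights, and capacity below are hypothetical, not from the slides). Infeasible genotypes are mapped below all feasible ones so that higher fitness always means a better solution:

```python
# Genotype: bit string, one bit per item. The fitness function transforms
# the objective (total value) into a linearly ordered score.
values   = [6, 5, 8, 9, 6, 7, 3]
weights  = [2, 3, 6, 7, 5, 9, 4]
capacity = 20

def fitness(genotype):
    v = sum(v for v, g in zip(values, genotype) if g)
    w = sum(w for w, g in zip(weights, genotype) if g)
    if w <= capacity:
        return v                 # feasible: fitness = objective value
    return -(w - capacity)       # infeasible: penalized below all feasible

print(fitness([1, 1, 0, 1, 0, 0, 0]))   # 20 (feasible, value 6+5+9)
```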


(Initial) population size

Uniform random
Heuristic based
Knowledge based
Genotypes from previous runs

Parent selection
Fitness Proportional Selection (FPS)
Roulette wheel sampling
High risk of premature convergence
Uneven selective pressure
Fitness function not transposition invariant

Fitness Rank Selection

Mapping function (like a cooling schedule)
Tournament selection
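The two parent-selection schemes above can be sketched as follows (a minimal illustration, assuming positive fitness values for the roulette wheel):

```python
import random

def roulette_select(pop, fitnesses):
    """Fitness proportional selection via roulette-wheel sampling.
    Assumes all fitness values are positive."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(pop, fitnesses):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def tournament_select(pop, fitnesses, k=2):
    """k-tournament: sample k individuals, keep the fittest.
    Selection pressure grows with k; unlike FPS, the outcome is
    invariant under transposition of the fitness function."""
    contestants = random.sample(range(len(pop)), k)
    return pop[max(contestants, key=lambda i: fitnesses[i])]
```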

Variation operators
Mutation = Stochastic unary variation
Recombination = Stochastic multi-ary
variation operator

Bit-String Representation:
E[#flips] = L * pm

Integer Representation:
Random Reset (cardinal attributes)
Creep Mutation (ordinal attributes)
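A minimal sketch of these mutation operators (parameter names and bounds are illustrative):

```python
import random

def bit_flip(bits, pm):
    """Bit-flip mutation: each bit flips independently with probability pm,
    so the expected number of flips is L * pm."""
    return [b ^ 1 if random.random() < pm else b for b in bits]

def random_reset(genes, domain, pm):
    """Random reset, for cardinal (unordered) integer attributes."""
    return [random.choice(domain) if random.random() < pm else g for g in genes]

def creep(genes, pm, step=1, lo=0, hi=9):
    """Creep mutation, for ordinal attributes: small +/- steps, clamped."""
    return [min(hi, max(lo, g + random.choice((-step, step))))
            if random.random() < pm else g for g in genes]
```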

Mutation cont.
Non-uniform from fixed distribution
Gaussian, Cauchy, Lévy, etc.



Recombination rate: asexual vs. sexual

N-Point Crossover (positional bias)
Uniform Crossover (distributional bias)
Discrete recombination (no new alleles)
(Uniform) arithmetic recombination
Simple recombination
Single arithmetic recombination
Whole arithmetic recombination
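A few of these operators in sketch form (minimal illustrations, not a library implementation):

```python
import random

def n_point_crossover(p1, p2, n=1):
    """N-point crossover: swap the tails at each crossover point, so
    alternating segments are exchanged (exhibits positional bias)."""
    points = sorted(random.sample(range(1, len(p1)), n))
    c1, c2 = list(p1), list(p2)
    for pt in points:
        c1[pt:], c2[pt:] = c2[pt:], c1[pt:]
    return c1, c2

def uniform_crossover(p1, p2):
    """Uniform crossover: each gene swapped with probability 0.5
    (exhibits distributional bias, no positional bias)."""
    c1, c2 = [], []
    for a, b in zip(p1, p2):
        if random.random() < 0.5:
            a, b = b, a
        c1.append(a)
        c2.append(b)
    return c1, c2

def whole_arithmetic(p1, p2, alpha=0.5):
    """Whole arithmetic recombination for real-valued genes: children lie
    between the parents, so no alleles outside the parents' range."""
    c1 = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
    c2 = [alpha * b + (1 - alpha) * a for a, b in zip(p1, p2)]
    return c1, c2
```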

Survivor selection
(μ+λ): plus strategy
(μ,λ): comma strategy (aka generational)
Typically fitness-based
Deterministic vs. stochastic

Alternatives include completely stochastic and age-based
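The plus and comma strategies can be sketched as follows (assuming fitness maximization):

```python
def plus_selection(parents, offspring, fitness, mu):
    """(mu + lambda): parents compete with the offspring; elitist,
    since a good parent can survive indefinitely."""
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]

def comma_selection(offspring, fitness, mu):
    """(mu, lambda): only offspring survive (generational); requires
    lambda >= mu, and good old solutions can be lost."""
    return sorted(offspring, key=fitness, reverse=True)[:mu]
```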


Termination criteria

CPU time / wall time

Number of fitness evaluations
Lack of fitness improvement
Lack of genetic diversity
Solution quality / solution found
Combination of the above

Simple Genetic Algorithm (SGA)

Representation: Bit-strings
Recombination: 1-Point Crossover
Mutation: Bit Flip
Parent Selection: Fitness Proportional
Survival Selection: Generational
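A minimal SGA sketch with exactly these components, run on the OneMax toy problem (maximize the number of 1-bits); the parameter values are illustrative defaults, not prescribed by the slides:

```python
import random

def sga(fitness, length=20, pop_size=30, pc=0.7, pm=0.05, generations=60):
    """Minimal Simple GA: bit strings, fitness-proportional parent selection,
    1-point crossover, bit-flip mutation, generational survivor selection."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        total = sum(fits)

        def select():  # roulette wheel (assumes non-negative fitness)
            r = random.uniform(0, total)
            acc = 0.0
            for ind, f in zip(pop, fits):
                acc += f
                if acc >= r:
                    return ind
            return pop[-1]

        nxt = []
        while len(nxt) < pop_size:
            a, b = select()[:], select()[:]
            if random.random() < pc:           # 1-point crossover
                pt = random.randint(1, length - 1)
                a, b = a[:pt] + b[pt:], b[:pt] + a[pt:]
            for child in (a, b):               # bit-flip mutation
                nxt.append([g ^ 1 if random.random() < pm else g for g in child])
        pop = nxt[:pop_size]                   # generational replacement
    return max(pop, key=fitness)

# OneMax: maximize the number of 1-bits (a toy objective for illustration)
best = sga(fitness=sum)
print(sum(best), "ones out of 20")
```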

Problem solving steps

Collect problem knowledge (at minimum solution
representation and objective function)
Define gene representation and fitness function
Creation of initial population
Parent selection, mate pairing
Define variation operators
Survival selection
Define termination condition
Parameter tuning

Typical EA Strategy Parameters

Population size
Initialization related parameters
Selection related parameters
Number of offspring
Recombination chance
Mutation chance
Mutation rate
Termination related parameters

EA Pros
More general purpose than traditional
optimization algorithms; i.e., less problem
specific knowledge required
Ability to solve difficult problems
Solution availability
Inherent parallelism

EA Cons
Fitness function and genetic operators
often not obvious
Premature convergence
Computationally intensive
Difficult parameter optimization

Behavioral aspects
Exploration versus exploitation
Selective pressure
Population diversity
Fitness values

Premature convergence

Genetic Programming (GP)

Characteristic property: variable-size
hierarchical representation vs. fixed-size
linear in traditional EAs
Application domain: model optimization vs.
input values in traditional EAs
Unifying Paradigm: Program Induction

Program induction examples

Optimal control
Symbolic regression
Automatic programming
Discovering game playing strategies
Inverse problem solving
Decision Tree induction
Evolution of emergent behavior
Evolution of cellular automata

GP specification

Function set
Terminal set
Correct expressions
Closure property
Strongly typed GP
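A toy sketch of a GP representation: expressions as nested tuples over a small function set, with protected division so that any composition of functions and terminals is a correct expression (the closure property). All names and values here are illustrative:

```python
import operator

# Function set with fixed arities; terminal set = {'x', numeric constants}.
FUNCS = {'+': operator.add, '-': operator.sub, '*': operator.mul,
         '/': lambda a, b: a / b if b != 0 else 1.0}  # protected division

def evaluate(tree, x):
    """Recursively evaluate an expression tree at input value x."""
    if tree == 'x':
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return FUNCS[op](evaluate(left, x), evaluate(right, x))

# x^2 + x encoded as (+ (* x x) x)
expr = ('+', ('*', 'x', 'x'), 'x')
print(evaluate(expr, 3))   # 12
```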

GP notes
Mutation or recombination (not both)
Bloat (survival of the fattest)
Parsimony pressure

Case Study employing GP

Deriving Gas-Phase Exposure History through Computationally Evolved Inverse Diffusion Analysis

Find Contaminants and Fix Issues
Examine Indoor Exposure History



Indoor air pollution is among the top five environmental health risks
$160 billion could be saved every year by improving indoor air quality
Current exposure history is inadequate
A reliable method is needed to determine past contamination levels and timing

Problem Statement

A forward diffusion differential equation predicts concentration in materials after exposure
An inverse diffusion equation finds the timing and intensity of previous gas-phase exposure
Knowledge of early exposures would greatly strengthen epidemiological studies
[Figure: diffusion model variables: concentration in solid; concentration in gas; gas-phase concentration history; material-phase concentration profile; elapsed time; x = distance into solid (m)]

Proposed Solution

Use Genetic Programming (GP) as a directed search for the inverse equation
Fitness based on the forward equation
Example candidate expressions: x^4, -tan(y) / pi, sin(x+y) + e^(x^2), 5x^2 + 12x - 4, x^2 - sin(x)

Related Research

It has been proven that the inverse equation exists
Symbolic regression with GP has
successfully found both differential
equations and inverse functions
Similar inverse problems in
thermodynamics and geothermal
research have been solved

Interdisciplinary Work

Collaboration between Environmental Engineering, Computer Science, and Math




Genetic Programming Algorithm



Genetic Programming Background


Example expression tree: Y = X^2 + Sin( X * Pi )






Ability to characterize exposure history will enhance ability to assess health risks of chemical exposure

Parameter Tuning
A priori optimization of EA strategy parameters
Start with stock parameter values
Manually adjust based on user intuition
Monte Carlo sampling of parameter
values on a few (short) runs
Meta-tuning algorithm (e.g., meta-EA)

Parameter Tuning drawbacks

Exhaustive search for optimal parameter values, even assuming independence, is infeasible
Parameter dependencies
Extremely time consuming
Optimal values are very problem specific
Different values may be optimal at different
evolutionary stages

Parameter Control
Example: replace parameter p with a time-varying p(t)
akin to cooling schedule in Simulated Annealing

Example: Rechenberg's 1/5 success rule

Example: mutation-step size control
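Rechenberg's rule is simple enough to sketch directly (the constant c = 0.85 is a commonly cited choice, used here as an assumption):

```python
def one_fifth_rule(sigma, success_ratio, c=0.85):
    """Rechenberg's 1/5 success rule for mutation step-size control:
    if more than 1/5 of recent mutations improved fitness, the search is
    too local -> increase sigma; fewer than 1/5 -> decrease sigma."""
    if success_ratio > 0.2:
        return sigma / c      # c < 1, so dividing increases sigma
    if success_ratio < 0.2:
        return sigma * c
    return sigma
```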

Evaluation Function Control

Example 1: Parsimony Pressure in GP
Example 2: Penalty Functions in
Constraint Satisfaction Problems (aka
Constrained Optimization Problems)

Penalty Function Control

eval(x) = f(x) + W · penalty(x)
Deterministic example:
W = W(t) = (C · t)^α with C, α ≥ 1
Adaptive example
Self-adaptive example
Note: this allows evolution to cheat!
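The deterministic schedule can be sketched as follows (the C and alpha values in the test are illustrative; t is the generation number):

```python
def make_eval(f, penalty, C=0.5, alpha=1):
    """eval(x) = f(x) + W(t) * penalty(x) with the deterministic schedule
    W(t) = (C * t) ** alpha, so constraint violations cost more as the
    run progresses (minimization convention, as on the slide)."""
    def evaluate(x, t):
        W = (C * t) ** alpha
        return f(x) + W * penalty(x)
    return evaluate
```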

Parameter Control aspects

What is changed?
Parameters vs. operators

What evidence informs the change?

Absolute vs. relative

What is the scope of the change?

Gene vs. individual vs. population
Ex: one-bit allele for recombination operator
selection (pairwise vs. vote)

Parameter control examples

Representation (GP: ADFs, delta coding)

Evaluation function (e.g., penalty weights)
Mutation (ES)
Recombination (Davis' adaptive operator fitness: implicit bucket brigade)
Selection (Boltzmann)

Self-Adaptive Mutation Control

Pioneered in Evolution Strategies
Now in widespread use in many types
of EAs

Uncorrelated mutation with one σ

Chromosomes: ⟨x1, …, xn, σ⟩
σ′ = σ · exp(τ · N(0,1))
x′i = xi + σ′ · Ni(0,1)
Typically the learning rate τ ∝ 1/√n
And we have a boundary rule: σ′ < ε0 ⇒ σ′ = ε0

Mutants with equal likelihood

Circle: mutants having same chance to be created

Uncorrelated mutation with n σ's

Chromosomes: ⟨x1, …, xn, σ1, …, σn⟩
σ′i = σi · exp(τ′ · N(0,1) + τ · Ni(0,1))
x′i = xi + σ′i · Ni(0,1)
Two learning rate parameters:
τ′: overall learning rate
τ: coordinate-wise learning rate

τ′ ∝ 1/√(2n) and τ ∝ 1/√(2√n)

τ′ and τ have individual proportionality constants which both have default values of 1
Boundary rule: σ′i < ε0 ⇒ σ′i = ε0
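The n-step-size scheme translates to a short mutation routine (the boundary value eps is an illustrative choice):

```python
import math
import random

def self_adaptive_mutate(x, sigma, eps=1e-8):
    """Uncorrelated self-adaptive mutation with n step sizes:
    sigma'_i = sigma_i * exp(tau' * N(0,1) + tau * N_i(0,1))
    x'_i     = x_i + sigma'_i * N_i(0,1)
    with the standard defaults tau' = 1/sqrt(2n), tau = 1/sqrt(2*sqrt(n));
    eps is the lower boundary for sigma."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2 * n)
    tau = 1.0 / math.sqrt(2 * math.sqrt(n))
    common = tau_prime * random.gauss(0, 1)     # one draw shared by all i
    new_sigma = [max(eps, s * math.exp(common + tau * random.gauss(0, 1)))
                 for s in sigma]
    new_x = [xi + si * random.gauss(0, 1) for xi, si in zip(x, new_sigma)]
    return new_x, new_sigma
```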

Mutants with equal likelihood

Ellipse: mutants having the same chance to be created

Correlated mutations
Chromosomes: ⟨x1, …, xn, σ1, …, σn, α1, …, αk⟩
where k = n·(n−1)/2
and the covariance matrix C is defined as:
cii = σi²
cij = 0 if i and j are not correlated
cij = ½·(σi² − σj²)·tan(2αij) if i and j are correlated

Note the numbering / indices of the α's

Correlated mutations contd

The mutation mechanism is then:
σ′i = σi · exp(τ′ · N(0,1) + τ · Ni(0,1))
α′j = αj + β · N(0,1)
x′ = x + N(0, C′)
x stands for the vector ⟨x1, …, xn⟩
C′ is the covariance matrix C after mutation of the α values

τ′ ∝ 1/√(2n), τ ∝ 1/√(2√n), and β ≈ 5°

Boundary rules: σ′i < ε0 ⇒ σ′i = ε0 and
|α′j| > π ⇒ α′j = α′j − 2π·sign(α′j)

Mutants with equal likelihood

Ellipse: mutants having the same chance to be created

Learning Classifier Systems (LCS)

Note: LCS is technically not a type of EA,
but can utilize an EA
Condition-Action Rule Based Systems
rule format: <condition:action>

Reinforcement Learning
LCS rule format:
<condition:action> → predicted payoff
don't care symbols

LCS specifics
Multi-step credit allocation: Bucket Brigade algorithm
Rule discovery cycle: EA
Pitt approach: each individual represents
a complete rule set
Michigan approach: each individual
represents a single rule, a population
represents the complete rule set

Multimodal Problems
Multimodal def.: multiple local optima and at least one local optimum is not globally optimal
Basins of attraction & Niches
Motivation for identifying a diverse set of
high quality solutions:
Allow for human judgement
Sharp peak niches may be overfitted

Restricted Mating
Panmictic vs. restricted mating
Finite pop size + panmictic mating -> genetic drift
Local Adaptation (environmental niche)
Punctuated Equilibria
Evolutionary Stasis

Speciation (end result of increasingly specialized adaptation to particular environmental niches)

Implicit Diversity Maintenance (1)

Multiple runs of standard EA
Non-uniform basins of attraction problematic

Island Model (coarse-grain parallel)

Punctuated Equilibria
Epoch, migration
Communication characteristics
Initialization: number of islands and respective
population sizes

Implicit Diversity Maintenance (2)

Diffusion Model EAs
Single Population, Single Species
Overlapping demes distributed within
Algorithmic Space (e.g., grid)
Equivalent to cellular automata

Automatic Speciation
Genotype/phenotype mating restrictions

Explicit Diversity Maintenance

Fitness Sharing: individuals share
fitness within their niche
Crowding: replace similar parents
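Fitness sharing can be sketched as follows (triangular sharing function, with sigma_share and alpha as illustrative defaults):

```python
def shared_fitness(pop, fitnesses, distance, sigma_share=1.0, alpha=1.0):
    """Fitness sharing: each individual's fitness is divided by its niche
    count, so individuals in crowded niches share their fitness and
    sparsely populated niches become relatively more attractive."""
    shared = []
    for i, fi in enumerate(fitnesses):
        niche_count = 0.0
        for j in range(len(pop)):
            d = distance(pop[i], pop[j])
            if d < sigma_share:
                niche_count += 1.0 - (d / sigma_share) ** alpha
        shared.append(fi / niche_count)   # niche_count >= 1 (self, d = 0)
    return shared
```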

Multi-Objective EAs (MOEAs)

Extension of regular EA which maps multiple
objective values to single fitness value
Objectives typically conflict
In a standard EA, an individual A is said to be
better than an individual B if A has a higher
fitness value than B
In a MOEA, an individual A is said to be better
than an individual B if A dominates B

Domination in MOEAs
An individual A is said to dominate
individual B iff:
A is no worse than B in all objectives
A is strictly better than B in at least one objective
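The dominance relation translates directly into code (maximization assumed for all objectives):

```python
def dominates(a, b):
    """a dominates b iff a is no worse in every objective and strictly
    better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def non_dominated(solutions):
    """The non-dominated subset: members not dominated by any member."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions)]

front = non_dominated([(1, 5), (3, 3), (5, 1), (2, 2)])
print(front)   # (2, 2) is dominated by (3, 3); the rest are incomparable
```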

Pareto Optimality
Given a set of alternative allocations of, say,
goods or income for a set of individuals, a
movement from one allocation to another that
can make at least one individual better off
without making any other individual worse off is
called a Pareto Improvement. An allocation is
Pareto Optimal when no further Pareto
Improvements can be made. This is often
called a Strong Pareto Optimum (SPO).

Pareto Optimality in MOEAs

Among a set of solutions P, the nondominated subset of solutions P are
those that are not dominated by any
member of the set P
The non-dominated subset of the entire
feasible search space S is the globally
Pareto-optimal set

Goals of MOEAs
Identify the Global Pareto-Optimal set of
solutions (aka the Pareto Optimal Front)
Find a sufficient coverage of that set
Find an even distribution of solutions

MOEA metrics
Convergence: how close is a generated solution set to the true Pareto-optimal front?
Diversity: are the generated solutions evenly distributed, or are they in clusters?

Deterioration in MOEAs
Competition can result in the loss of a non-dominated solution which dominated a previously generated solution
This loss in its turn can result in the previously generated solution being regenerated and surviving

Game-Theoretic Problems
Adversarial search: multi-agent problem with
conflicting utility functions
Ultimatum Game
Select two subjects, A and B
Subject A gets 10 units of currency
A has to make an offer (ultimatum) to B, anywhere from
0 to 10 of his units
B has the option to accept or reject (no negotiation)
If B accepts, A keeps the remaining units and B the offered units; otherwise they both lose all units

Real-World Game-Theoretic Problems
Real-world examples:
economic & military strategy
arms control
cyber security

Common problem: real-world games are typically incomputable

Military arms races
Prisoner's Dilemma
Biological arms races

Approximating incomputable games

Consider the space of each user's actions
Perform local search in these spaces
Solution quality in one space is dependent
on the search in the other spaces
The simultaneous search of co-dependent spaces is naturally modeled as an evolutionary arms race

Evolutionary arms races
Iterated evolutionary arms races
Biological arms races revisited
Iterated arms race optimization

Coevolutionary Algorithm (CoEA)

A special type of EA where the fitness of an individual is dependent on other individuals (i.e., individuals are explicitly part of the environment)
Single species vs. multiple species
Cooperative vs. competitive coevolution

CoEA difficulties (1)

Loss of gradient
Occurs when one population evolves so much faster than the other that all individuals of the other are utterly defeated, making it impossible to differentiate between better and worse individuals, without which there can be no fitness gradient

CoEA difficulties (2)

Cycling (aka forgetting)
Occurs when populations have lost the genetic knowledge of how to defeat an earlier-generation adversary and that adversary re-evolves
Potentially this can cause an infinite loop
in which the populations continue to
evolve but do not improve

CoEA difficulties (3)

Suboptimal Equilibrium
(aka Mediocre Stability)
Occurs when the system stabilizes in a
suboptimal equilibrium

Case Study from Critical Infrastructure Protection

Infrastructure Hardening
Hardenings (defenders) versus
contingencies (attackers)
Hardenings need to balance spare flow
capacity with flow control

Case Study from Automated Software Engineering

Coevolutionary Automated
Software Correction (CASC)

Objective: Find a way to automate the process of software testing and correction.

Approach: Create the Coevolutionary Automated Software Correction (CASC) system, which will take a software artifact as input and produce a corrected version of the software artifact as output.

Coevolutionary Cycle

Population Initialization

Initial Evaluation

Reproduction Phase

Evaluation Phase

Competition Phase