Evolutionary Computing
Part I: Introduction to
Evolutionary Algorithms
Dr. Daniel Tauritz
Director, Natural Computation Laboratory
Associate Professor, Department of Computer Science
Research Investigator, Intelligent Systems Center
Collaborator, Energy Research & Development Center
Motivation
Real-world optimization problems are
typically characterized by huge, ill-behaved
solution spaces
Infeasible to exhaustively search
Defy traditional (gradient-based) optimization
algorithms because they are non-linear, nondifferentiable, non-continuous, or non-convex
Real-World Example
Electric Power Transmission Systems
Supply is not keeping up with demand
Expansion hampered by:
Social, environmental, and economic
constraints
Transmission system is stressed
Already carrying more than intended
Dramatic increase in incident reports
The Grid
Failure Analysis
Failure spreads relatively quickly
Too quickly for conventional control
Cascade may be avoidable
Utilize unused capacities (flow
compensation)
Unsatisfiable condition may be avoidable
Better power flow control to reduce severity
Possible Solution
Strategically place a number of power
flow control devices
Flexible AC Transmission System
(FACTS) devices are a promising type
of high-speed power-electronics power
flow control devices
Unified Power Flow Controller (UPFC)
[Figure: UPFC simulation engine and HIL line]
Evolutionary Computing
The field of Evolutionary Computing (EC)
studies the theory and application of
Evolutionary Algorithms (EAs)
EAs can be described as a class of
stochastic, population-based optimization
algorithms inspired by natural evolution,
genetics, and population dynamics
Evolutionary Problem Solving
[Diagram: a Problem Description and EA parameters are input to the EA, which outputs a solution]
Evolutionary Cycle
[Diagram: Population Initialization (using strategy parameters) → Fitness Evaluation (problem-specific black box) → Reproduction → Competition → Fitness Evaluation → Termination Criteria Met? (no: repeat cycle; yes: output Solution)]
(Darwinian) Evolution
The environment contains populations of
individuals of the same species which are
reproductively compatible
Natural selection
Random variation
Survival of the fittest
Inheritance of traits
(Mendelian) Genetics
Genotypes vs. phenotypes
Pleiotropy: one gene affects multiple
phenotypic traits
Polygeny: one phenotypic trait is
affected by multiple genes
Chromosomes (haploid vs. diploid)
Loci and alleles
Scope
Genotype: functional unit of inheritance
Individual: functional unit of selection
Population: functional unit of evolution
Solution Representation
Structural types: linear, tree, FSM, etc.
Data types: bit strings, integers,
permutations, reals, etc.
EA genotype encodes solution
representation and attributes
EA phenotype expresses the EA
genotype in the current environment
Encoding & Decoding
Fitness Function
Determines an individual's fitness-based selection chances
Transforms objective function to linearly
ordered set with higher fitness values
corresponding to higher quality solutions
(i.e., solutions which better satisfy the
objective function)
Knapsack Problem Example
Initialization
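The knapsack example with random initialization can be sketched as a bit-string setup; the item values, weights, and capacity below are invented for illustration, as is the zero-fitness penalty for overweight solutions:

```python
import random

# Hypothetical knapsack instance: (value, weight) pairs and a capacity.
ITEMS = [(10, 5), (40, 4), (30, 6), (50, 3), (35, 7)]
CAPACITY = 10

def random_individual(length):
    """Random bit string: bit i == 1 means item i is packed."""
    return [random.randint(0, 1) for _ in range(length)]

def fitness(bits):
    """Total value of packed items; infeasible (overweight) solutions score 0."""
    value = sum(v for b, (v, w) in zip(bits, ITEMS) if b)
    weight = sum(w for b, (v, w) in zip(bits, ITEMS) if b)
    return value if weight <= CAPACITY else 0

population = [random_individual(len(ITEMS)) for _ in range(20)]
```

Here packing items 1 and 3 (weight 4 + 3 = 7) is feasible and scores 90, while packing everything exceeds the capacity and scores 0.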
Parent selection
Fitness Proportional Selection (FPS)
Roulette wheel sampling
High risk of premature convergence
Uneven selective pressure
Fitness function not transposition invariant
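Roulette-wheel sampling, and the transposition-invariance problem it suffers from, can be sketched as follows (a minimal illustration, not a production implementation):

```python
import random

def roulette_select(population, fitnesses):
    """Fitness-proportional (roulette-wheel) parent selection."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(population, fitnesses):
        acc += f
        if r <= acc:
            return ind
    return population[-1]  # guard against floating-point round-off

def selection_probs(fitnesses):
    """Selection probability of each individual under FPS."""
    total = sum(fitnesses)
    return [f / total for f in fitnesses]

# Not transposition invariant: adding a constant to every fitness
# changes the selection probabilities (selective pressure is diluted).
print(selection_probs([1, 2, 3]))        # proportions 1/6, 2/6, 3/6
print(selection_probs([101, 102, 103]))  # nearly uniform
```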
Variation operators
Mutation = Stochastic unary variation
operator
Recombination = Stochastic multi-ary
variation operator
Mutation
Bit-String Representation:
Bit-Flip
E[#flips] = L * pm
Integer Representation:
Random Reset (cardinal attributes)
Creep Mutation (ordinal attributes)
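A minimal sketch of these mutation operators; the parameter name `pm` stands for the per-gene mutation probability, and the value ranges are this sketch's own:

```python
import random

def bit_flip(bits, pm):
    """Flip each bit independently with probability pm.
    Expected number of flips is L * pm for a string of length L."""
    return [1 - b if random.random() < pm else b for b in bits]

def random_reset(genes, pm, values):
    """Cardinal integer attributes: resample a fresh value with probability pm."""
    return [random.choice(values) if random.random() < pm else g for g in genes]

def creep(genes, pm, step=1, lo=0, hi=9):
    """Ordinal integer attributes: nudge up or down by a small step,
    clamped to [lo, hi], with probability pm."""
    return [min(hi, max(lo, g + random.choice((-step, step))))
            if random.random() < pm else g for g in genes]
```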
Mutation cont.
Floating-Point
Uniform
Non-uniform from fixed distribution
Gaussian, Cauchy, Lévy, etc.
Permutation
Swap
Insert
Scramble
Inversion
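The four permutation mutations above might look like this; each returns a new list and preserves the permutation property:

```python
import random

def swap(perm):
    """Exchange two randomly chosen positions."""
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def insert_mut(perm):
    """Remove one element and reinsert it at another position."""
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p.insert(j, p.pop(i))
    return p

def scramble(perm):
    """Shuffle the elements inside a random sub-segment."""
    p = perm[:]
    i, j = sorted(random.sample(range(len(p) + 1), 2))
    seg = p[i:j]
    random.shuffle(seg)
    p[i:j] = seg
    return p

def inversion(perm):
    """Reverse a random sub-segment."""
    p = perm[:]
    i, j = sorted(random.sample(range(len(p) + 1), 2))
    p[i:j] = reversed(p[i:j])
    return p
```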
Recombination
Survivor selection
(μ+λ) plus strategy
(μ,λ) comma strategy (aka generational)
Typically fitness-based
Deterministic vs. stochastic
Truncation
Elitism
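The plus and comma strategies can be sketched with deterministic truncation survivor selection (a simplified illustration; `mu` survivors are kept, higher fitness is better):

```python
def survivors_plus(parents, offspring, fitness, mu):
    """(mu + lambda): parents compete with offspring; truncation by fitness."""
    pool = parents + offspring
    return sorted(pool, key=fitness, reverse=True)[:mu]

def survivors_comma(parents, offspring, fitness, mu):
    """(mu, lambda): only offspring survive (generational); needs lambda >= mu."""
    return sorted(offspring, key=fitness, reverse=True)[:mu]
```

Note how the comma strategy can discard the best individual found so far, which elitism is designed to prevent.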
Termination
Representation: Bit-strings
Recombination: 1-Point Crossover
Mutation: Bit Flip
Parent Selection: Fitness Proportional
Survival Selection: Generational
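One-point crossover on bit strings, as used in this simple GA recipe, can be sketched as:

```python
import random

def one_point_crossover(p1, p2):
    """Cut both parents at the same random point and swap the tails."""
    point = random.randint(1, len(p1) - 1)  # interior point: avoid cloning
    return p1[:point] + p2[point:], p2[:point] + p1[point:]
```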
Population size
Initialization-related parameters
Selection-related parameters
Number of offspring
Recombination chance
Mutation chance
Mutation rate
Termination-related parameters
EA Pros
More general purpose than traditional
optimization algorithms; i.e., less problem
specific knowledge required
Ability to solve difficult problems
Solution availability
Robustness
Inherent parallelism
EA Cons
Fitness function and genetic operators
often not obvious
Premature convergence
Computationally intensive
Difficult parameter optimization
Behavioral aspects
Exploration versus exploitation
Selective pressure
Population diversity
Fitness values
Phenotypes
Genotypes
Alleles
Premature convergence
Optimal control
Planning
Symbolic regression
Automatic programming
Discovering game playing strategies
Forecasting
Inverse problem solving
Decision Tree induction
Evolution of emergent behavior
Evolution of cellular automata
GP specification
S-expressions
Function set
Terminal set
Arity
Correct expressions
Closure property
Strongly typed GP
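A tiny interpreter for such S-expressions might look like this; the nested-tuple encoding and the protected-division trick (a common way to preserve the closure property) are this sketch's own choices:

```python
import math

# Hypothetical function set with arities; protected division returns 1.0
# on divide-by-zero so every expression evaluates to a number (closure).
FUNCTIONS = {
    '+': (2, lambda a, b: a + b),
    '*': (2, lambda a, b: a * b),
    '/': (2, lambda a, b: a / b if b != 0 else 1.0),
    'sin': (1, math.sin),
}

def evaluate(expr, env):
    """Evaluate an S-expression given as nested tuples,
    e.g. ('+', 'x', ('*', 'x', 'x')) with env = {'x': 2}."""
    if isinstance(expr, tuple):          # function node
        arity, fn = FUNCTIONS[expr[0]]
        args = [evaluate(a, env) for a in expr[1:]]
        assert len(args) == arity        # correct expressions respect arity
        return fn(*args)
    if isinstance(expr, str):            # terminal: variable
        return env[expr]
    return expr                          # terminal: constant
```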
GP notes
Mutation or recombination (not both)
Bloat (survival of the fattest)
Parsimony pressure
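Parsimony pressure is often implemented as a size penalty on raw fitness; a minimal sketch, assuming trees encoded as nested tuples (e.g. `('+', 'x', ('sin', 'x'))`) and a hand-picked linear penalty coefficient `c`:

```python
def tree_size(expr):
    """Number of nodes in a nested-tuple GP tree."""
    if isinstance(expr, tuple):
        return 1 + sum(tree_size(a) for a in expr[1:])
    return 1

def parsimony_fitness(raw_fitness, expr, c=0.01):
    """Penalize large trees to counteract bloat (assumed linear penalty)."""
    return raw_fitness - c * tree_size(expr)
```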
Introduction
Find Contaminants
and Fix Issues
Examine Indoor
Exposure History
Unexplained
Sickness
Background
Problem Statement
Concentration in solid
Concentration in gas
0
Elapsed time
0
x or distance into solid (m)
Proposed Solution
Use Genetic Programming (GP) as a directed search for the inverse equation
Fitness based on the forward equation
[Slide art: floating candidate expressions, e.g. x^5x^2 + x^4 - tan(y)/pi, sin(x), sin(cos(x+y)^2), sin(x+y) + e^(x^2), 5x^2 + 12x - 4, x^2 - sin(x), and an expression-tree fragment]
Related Research
Symbolic regression with GP has
successfully found both differential
equations and inverse functions
Similar inverse problems in
thermodynamics and geothermal
research have been solved
Interdisciplinary Work
[Diagram: Candidate Solutions enter a Population that cycles through Reproduction and Competition; Fitness is computed from the Forward Diffusion Equation]
Y = X^2 + Sin(X * Pi)
[Expression tree: +( *(X, X), Sin( *(X, Pi) ) )]
Summary
Ability to characterize
Parameter Tuning
A priori optimization of EA strategy
parameters
Start with stock parameter values
Manually adjust based on user intuition
Monte Carlo sampling of parameter
values on a few (short) runs
Meta-tuning algorithm (e.g., meta-EA)
Parameter Control
Blind
Example: replace a parameter p with a time-varying p(t)
akin to cooling schedule in Simulated Annealing
Adaptive
Example: Rechenberg's 1/5 success rule
Self-adaptive
Example: mutation-step size control
Correlated mutations
Chromosomes: ⟨x_1, …, x_n, σ_1, …, σ_n, α_1, …, α_k⟩
where k = n(n-1)/2
and the covariance matrix C is defined as:
c_ii = σ_i^2
c_ij = 0 if i and j are not correlated
c_ij = ½(σ_i^2 - σ_j^2) tan(2α_ij) if i and j are correlated
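Building C from the strategy parameters can be sketched directly from these formulas (a minimal illustration; the flat ordering of the k rotation angles over pairs i < j is this sketch's own convention):

```python
import math

def covariance_matrix(sigmas, alphas):
    """Covariance matrix C for correlated Gaussian mutation:
    c_ii = sigma_i^2
    c_ij = 0.5 * (sigma_i^2 - sigma_j^2) * tan(2 * alpha_ij)  for i != j."""
    n = len(sigmas)
    assert len(alphas) == n * (n - 1) // 2  # k = n(n-1)/2 rotation angles
    C = [[0.0] * n for _ in range(n)]
    idx = 0
    for i in range(n):
        C[i][i] = sigmas[i] ** 2
        for j in range(i + 1, n):
            c = 0.5 * (sigmas[i] ** 2 - sigmas[j] ** 2) * math.tan(2 * alphas[idx])
            C[i][j] = C[j][i] = c
            idx += 1
    return C
```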
Reinforcement Learning
LCS rule format:
<condition:action> predicted payoff
don't care symbols
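Matching ternary conditions against binary states, with '#' as the don't-care symbol, can be sketched as follows (the rules below are made-up examples):

```python
def matches(condition, state):
    """Does a ternary condition string match a binary state string?
    '#' is the don't-care symbol and matches either bit."""
    return len(condition) == len(state) and all(
        c == '#' or c == s for c, s in zip(condition, state))

# Each rule is a <condition:action> pair with a predicted payoff.
rules = [('1#0', '01', 42.0), ('###', '00', 1.0)]

def best_action(state):
    """Among matching rules, pick the action with the highest predicted payoff."""
    matching = [r for r in rules if matches(r[0], state)]
    return max(matching, key=lambda r: r[2])[1]
```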
LCS specifics
Multi-step credit allocation Bucket
Brigade algorithm
Rule Discovery Cycle EA
Pitt approach: each individual represents
a complete rule set
Michigan approach: each individual
represents a single rule, a population
represents the complete rule set
Multimodal Problems
Multimodal def.: multiple local optima and
at least one local optimum is not globally
optimal
Basins of attraction & Niches
Motivation for identifying a diverse set of
high quality solutions:
Allow for human judgement
Sharp peak niches may be overfitted
Restricted Mating
Panmictic vs. restricted mating
Finite pop size + panmictic mating -> genetic
drift
Local Adaptation (environmental niche)
Punctuated Equilibria
Evolutionary Stasis
Demes
Automatic Speciation
Genotype/phenotype mating restrictions
Domination in MOEAs
An individual A is said to dominate
individual B iff:
A is no worse than B in all objectives
A is strictly better than B in at least one
objective
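This dominance test, assuming every objective is maximized, translates directly to code:

```python
def dominates(a, b):
    """True iff objective vector a dominates b (maximization assumed):
    a is no worse in every objective and strictly better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def non_dominated(points):
    """Keep only the points no other point dominates (the current front)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```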
Pareto Optimality
Given a set of alternative allocations of, say,
goods or income for a set of individuals, a
movement from one allocation to another that
can make at least one individual better off
without making any other individual worse off is
called a Pareto Improvement. An allocation is
Pareto Optimal when no further Pareto
Improvements can be made. This is often
called a Strong Pareto Optimum (SPO).
Goals of MOEAs
Identify the Global Pareto-Optimal set of
solutions (aka the Pareto Optimal Front)
Find a sufficient coverage of that set
Find an even distribution of solutions
MOEA metrics
Convergence: How close is a generated solution set to the true Pareto-optimal front?
Diversity: Are the generated solutions evenly distributed, or are they in clusters?
Deterioration in MOEAs
Competition can result in the loss of a
non-dominated solution which
dominated a previously generated
solution
This loss in its turn can result in the
previously generated solution being
regenerated and surviving
Game-Theoretic Problems
Adversarial search: multi-agent problem with
conflicting utility functions
Ultimatum Game
Select two subjects, A and B
Subject A gets 10 units of currency
A has to make an offer (ultimatum) to B, anywhere from
0 to 10 of his units
B has the option to accept or reject (no negotiation)
If B accepts, A keeps the remaining units and B the offered units; otherwise they both lose all units
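The payoff structure can be written down in a few lines (a sketch that could serve, e.g., as the fitness core when evolving offer/accept strategies):

```python
def ultimatum(offer, accept):
    """Payoffs (A, B) for one round: A holds 10 units and offers `offer` to B;
    if B rejects, both players lose everything."""
    assert 0 <= offer <= 10
    return (10 - offer, offer) if accept else (0, 0)
```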
Real-World Game-Theoretic
Problems
Real-world examples:
economic & military strategy
arms control
cyber security
bargaining
Arms Races
Military arms races
Prisoner's Dilemma
Biological arms races
Evolutionary arms races
Iterated evolutionary arms races
Biological arms races revisited
Iterated arms-race optimization is doomed!
Coevolutionary Automated
Software Correction (CASC)
Coevolutionary Cycle
Population Initialization
Initial Evaluation
Reproduction Phase
Evaluation Phase
Competition Phase
Termination