Grenoble INP ENSE3, 11 September 2014

Optimization problems and algorithms (6 hours)


1. Aim of this practical work
The aim of this practical work is to study the behavior and characteristics of some
optimization algorithms on academic optimization problems. The use of these
algorithms on a realistic problem will be another practical work. For this study we
use the software GOT-It, co-developed by CEDRAT and G2Elab (Electrical
Engineering Laboratory of Grenoble).
The text of this subject begins with two introductory sections:
- the definition of the three optimization problems that will support the studies,
- a quick overview of the optimization software to be used.
It is followed by four sections explaining the studies to be undertaken:
- an experimentation on the unimodal problem Rosenbrock,
- an experimentation on the multimodal problem Belledonne,
- an experimentation on the multiobjective problem BiSphere,
- a study of the robustness of solutions on the problem Belledonne.

2. Optimization problems studied


An n-dimensional optimization problem can be written in general form as:

minimize    f(x) = f(x1, x2, ..., xn)    with x = {x1, x2, ..., xn} ∈ R^n
respecting  Gi(x) = 0,                   i = 1, ..., me
            Gi(x) ≤ 0,                   i = me+1, ..., m
            xj_min ≤ xj ≤ xj_max,        j = 1, ..., n

Where:
- The quantity f(x) is a criterion to be minimized, also called the objective function. There may be several criteria.
- The vector x consists of n variables xj representing the parameters of the problem.
- The functions Gi(x) represent the equality and inequality constraints.
- The values xj_min and xj_max are the domain constraints.
Example 1: The unimodal problem Rosenbrock
Minimize f1(x1, x2) = 100·(x2 − x1²)² + (x1 − 1)²
with −2 ≤ x1 ≤ +2
     −2 ≤ x2 ≤ +2
This problem has a single optimum f1(1, 1) = 0.
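As a quick sanity check outside GOT-It, the function and its optimum can be evaluated in a few lines of Python (a sketch for this practical work, not part of GOT-It):

```python
# Rosenbrock function from Example 1 (sketch for checking values by hand).
def f1(x1, x2):
    return 100.0 * (x2 - x1**2)**2 + (x1 - 1)**2

# The unique optimum is at (1, 1), where f1 vanishes.
print(f1(1.0, 1.0))   # -> 0.0
print(f1(-2.0, 2.0))  # -> 409.0, value at one of the suggested starting points
```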


Example 2: The multimodal problem Belledonne


Minimize f2(y1, y2) = (1 − y1²)·cos(5·y1) + (12 − y2²)·cos(4 / (√y2 + 0.3))
with 0 ≤ y1 ≤ 2
     0 ≤ y2 ≤ 2
This problem has one global minimum f2(1.36864, 0.01548) = −12.73956 and three local minima: f2(0.56434, 0.01548) = −12.64671, f2(1.36864, 0.86648) = −11.92323 and f2(0.56434, 0.86648) = −11.83038.
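The listed minima can be checked numerically. The sketch below assumes the expression with denominator Sqrt(y2) + 0.3 (the square root applying to y2 alone); this form reproduces all four printed values:

```python
import math

# Belledonne function: the Cos/Sqrt encodings used in GOT-It correspond to
# the standard cosine and square root.
def f2(y1, y2):
    return ((1 - y1**2) * math.cos(5 * y1)
            + (12 - y2**2) * math.cos(4 / (math.sqrt(y2) + 0.3)))

for point in ((1.36864, 0.01548), (0.56434, 0.01548),
              (1.36864, 0.86648), (0.56434, 0.86648)):
    print(point, round(f2(*point), 5))   # matches the four listed minima
```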
Example 3: The multiobjective problem BiSphere
Minimize simultaneously f31(z1, z2) = (z1 − 1)² + z2²
and f32(z1, z2) = z1² + (z2 − 1)²
with −5 ≤ z1 ≤ +5
     −5 ≤ z2 ≤ +5
The two partial solutions f31(1, 0) = 0 and f32(0, 1) = 0 cannot be reached simultaneously. It is necessary to consider a compromise between the two objectives; there are infinitely many possible compromises.
3. Quick overview of the optimization software GOT-It.
GOT-It gives the possibility of describing an optimization problem, selecting an
optimization algorithm, running an optimization and analyzing solutions.
On the ENSE3 network, version 2 of the software is available; however, we will use a more recent version-2 beta, available as follows:
Connect an "O:" drive to \\FSRV1\Pedagogie\Dist-JLC
Open a window on "O:"
Start the more recent got_dist\gotit_exec_GOT_Vyyyy-mm-dd.bat
(double click)
If the software communicates in French, switch to English (Tools ->
Preferences ... -> ENGLISH, then quit and re-launch the software)
Create a new project (Project > Create new project ...).
Description of an optimization problem:
The command Parameter is used to describe each parameter (x1, x2, ...): its
name, its value (used as the initial value by some optimization algorithms),
its minimum and maximum values, its rounding grid and its unit. It is also
possible to fix a parameter or to indicate that it is uncertain.
The command Function is used to describe auxiliary functions.
The command Optimization function is used to describe the names and
expressions of goals to be achieved (f1, f2, ...) and constraints (g1, g2, ...).
The command Optimization problem is used to describe the name of an
optimization problem, its goals and constraints.


Selecting an optimizer:
The command Optimization algorithm is used to select an optimizer able
to solve an optimization problem.
Note: An optimizer is most efficient in terms of precision and cost (number of evaluations) when used to solve the
type of optimization problem for which it was designed (mono-objective / multiobjective, unconstrained / constrained).

Optimization:
The command Optimization starts the optimization of an optimization
problem using an optimizer.
Analysis:
The command Analysis tool is used to analyze the solutions and also to
plot the functions.
4. Experimentation on the problem of Rosenbrock
In this first experimentation, we use the Rosenbrock problem (see the internet) to
highlight the operation of the optimization algorithms available to us.
- Create a new project (associate to it the new file Rosenbrock.gotit, for
instance).
- Describe the Continuous parameters x1 and x2 (take 1/10000 as the rounding
grid for x1 and x2, do not define a measure unit) and the Rosenbrock function f1
(name it F1) (the power operator is **; for instance, (x1 − 1)² is coded (x1-1)**2).
- Analyze F1 (try Curve plotting, Evaluator, Isoval plotting, Surface
plotting available in Analysis tool).
- Switch from the context Parametric to the context Optimization.
- In the menu Tuning, check Clear cache.
- In Optimization function create OF1, the Minimization goal, with F1 for
function and 5E-5 for target (F1 should be less than or equal to 5E-5).
- Describe the Optimization problem (name it Rosenbrock) whose single goal is
OF1.
- Select an optimization algorithm SQP (name it SQP).
- Create and execute the Optimization (name it O1) of problem Rosenbrock with
SQP (experiment with the initial values [2, 2], [0, 0], [2, -2], [-2, -2], [-2, 2] for
x1 and x2). If necessary, increase the maximum number of iterations. Note the
number of iterations of the algorithm and the number of evaluations of the
function. What is the difference between iterations and evaluations?
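The distinction matters because one iteration of a gradient-based method usually costs several function evaluations. The sketch below is a plain gradient descent with finite differences (deliberately much simpler than GOT-It's SQP), just to make the two counters visible:

```python
# Gradient-descent sketch on Rosenbrock: one iteration = one parameter update,
# but each iteration spends several evaluations on a finite-difference gradient.
def f1(x1, x2):
    return 100.0 * (x2 - x1**2)**2 + (x1 - 1)**2

evaluations = 0
def f(x):
    global evaluations
    evaluations += 1
    return f1(x[0], x[1])

x = [0.0, 0.0]                      # one of the suggested starting points
h, step = 1e-6, 1e-3
iterations = 0
for _ in range(200):
    iterations += 1
    fx = f(x)                                 # 1 evaluation
    grad = [(f([x[0] + h, x[1]]) - fx) / h,   # + 1 evaluation per parameter
            (f([x[0], x[1] + h]) - fx) / h]
    x = [x[0] - step * grad[0], x[1] - step * grad[1]]

print(iterations, evaluations)      # -> 200 600: 3 evaluations per iteration
```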
Trick 1: In the data tree, right-click O1 then Analyze shows information on the results achieved.
Trick 2: An optimization already done can be re-executed after the changes (among others) of parameters. To do this in
the data tree, right-click O1 then Execute.

Grenoble INP ENSE3, 11 September 2014

Trick 3: The progress of the optimization can be re-viewed. To do this in the graphics window of O1 Execute, select
the panel [x2] (x1), right click on the picture and then Replay.
Trick 4: In the graphics window of O1 Execute, you can add or remove some progress panels (right click on the
panel background or on any tab, for example [x2] (x1)).
Trick 5: The selection in the data tree of several objects (e.g. the parameters, the optimization algorithm, the
optimization ...) followed by Edit begins simultaneous editing in a table. The Execute button allows a re-run (after
Validate, if any object has been modified) without leaving simultaneous editing.
Note on the target of an optimization function: The target of an optimization function is used to indicate, to the
optimization algorithm, a value to reach. If no target is set, the algorithm looks for the smallest possible value in the
case of a minimization, or the largest possible value in the case of a maximization. In situations where the optimization
function is expensive, it is usual to set a relatively modest target, in order to limit the cost of optimization. In the case of
a test function whose optimum is known, it is convenient to set a target because the number of evaluations needed to
reach the target is then a measure of the performance of the algorithm.
Note on the performance of an algorithm: The performance of an optimization algorithm is measured by the number
of evaluations of the objective functions and constraints necessary to obtain an acceptable solution. In a realistic use,
each evaluation may take several minutes, several hours or even several days of computation! To reduce the
computation time, the GOT-It software is equipped with a cache of limited capacity, keeping the most recent
evaluations obtained during a session. However, when it comes to comparing the performance of two optimization
algorithms, the cache is a disturbing factor. For comparative studies on the number of evaluations required by
optimization algorithms regardless of their order of execution in a session, it is advisable to request an automatic cache
reset (menu Tuning -> check Clear cache). To estimate the benefit of the cache on an algorithm (this is not required
here), it should be turned off completely (menu Tuning -> uncheck Enable cache). When the box Clear cache is checked, an
additional panel showing the number of function evaluations appears in the graphics window at the end of the
optimization.

Make a second series of optimizations, but this time choosing a genetic algorithm
(name it GA). Here it is not necessary to change the initial values of x1 and x2:
their role is limited to the creation of a single individual of the initial population,
on which the course of this stochastic algorithm depends little. The operation of the
GA optimizer depends on the probabilities associated with the crossover and
mutation stochastic operators (they can be edited in expert context) and on the size
of the population. The number of generations must be large enough not to lead to a
premature stop.
The sequence also depends on the pseudo-random drawings made. With the
population size and the probabilities of the stochastic operators fixed, it is
necessary, to estimate the performance of a stochastic algorithm, to perform several
optimizations while changing the seed of the pseudo-random number generator. To
do this, in editing mode, delete the generator seed (in the editor, see the Options
tab), execute, and note each time the number of evaluations of OF1 necessary to
achieve the target. Recall that, in the Tuning menu, the Clear cache box must be
checked to ensure that the numbers of evaluations are shown.
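The seed dependence can be sketched with a toy genetic algorithm (a deliberately simplified stand-in, not GOT-It's GA; population size, operators and target are arbitrary assumptions): with everything else fixed, the evaluation count needed to reach the target varies from seed to seed.

```python
import random

def f1(x1, x2):
    return 100.0 * (x2 - x1**2)**2 + (x1 - 1)**2

def ga_evaluations(seed, target=1e-2, pop_size=30, max_gen=300):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2), rng.uniform(-2, 2)] for _ in range(pop_size)]
    evaluations = 0
    for _ in range(max_gen):
        scored = sorted((f1(*ind), ind) for ind in pop)
        evaluations += pop_size
        if scored[0][0] <= target:          # target reached: report the cost
            return evaluations
        parents = [ind for _, ind in scored[:pop_size // 2]]  # selection
        pop = list(parents)
        while len(pop) < pop_size:          # crossover (averaging) + mutation
            a, b = rng.sample(parents, 2)
            pop.append([(a[0] + b[0]) / 2 + rng.gauss(0, 0.05),
                        (a[1] + b[1]) / 2 + rng.gauss(0, 0.05)])
    return evaluations                      # budget exhausted (premature stop)

for seed in (1, 2, 3):
    print(seed, ga_evaluations(seed))       # counts typically differ per seed
```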
Conclude on the advantages and disadvantages of the two algorithms, particularly in
terms of the number of calls (evaluations) to the objective function, the accuracy of
the final solution, and ease of use.
Trick: The Logbook keeps, in the form of sheets, a record of the operations carried out. A context menu associated with
each record allows you to edit or delete a comment. It is recommended, as and when they are created, to close
unneeded sheets and to comment those retained.


Save the project and its logbook (the logbook file is an HTML document that can be
consulted with a browser) and exit the software.
5. Experimentation on the problem Belledonne
In this section, we will explore the specificities of a multimodal problem named
Belledonne (the name of a mountain range near Grenoble, France).
Open a new project (associate it with a new file named Belledonne.gotit, for
instance).
Plot the Belledonne function (the cosine and square root functions are encoded Cos
and Sqrt) (note: for correctly visualizing F2 with "Surface plotting", take at least 101
points for Y1 and Y2).
Perform some optimizations with SQP and GA (take 1/100000 as the grid size for y1
and y2) (take -12.73953 as the target of the optimization function OF2).
Note the average cost of GA in its search for the global optimum, over a significant
number of repetitions (at least 20).
Trick: In the context Optimization enhanced and in order to automate the tests, create and use a Repeating optimizer.
This optimizer repeats the executions of its internal optimizer (e.g. GA) as many times as needed (e.g. 20) and returns
all results.

Here, what are the strengths and weaknesses of the deterministic algorithm? Same
question for the stochastic algorithm?
Select a niching algorithm (name it Niching) and run an optimization with this
algorithm. Study the solutions returned by the algorithm. What is the main difference
between this algorithm and a classical genetic algorithm and what is the interest?
In conclusion, what algorithm is best suited to solve the problem Belledonne?
Save the project and the logbook and exit the software.
6. Experimentation on the problem BiSphere
In this section, we will explore the specificities of a multiobjective problem.
Open a new project (associate to it the new file BiSphere.gotit, for instance).
In the software, describe the parameters and objectives F31 and F32.


Usual resolution with the help of a mono-objective optimizer:


The usual method is to solve a single objective problem (a compromise between the
two objectives):
Minimize f3(z1, z2) = ((1 − u)/2)·f31 + ((1 + u)/2)·f32
with −1 ≤ u ≤ +1

The quantity u is used to weight the two objectives: when u=-1, the function f3 is
reduced to f31, when u=+1, it is reduced to f32, when u=0, f3 is the average, etc.
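These limiting cases can be checked directly (a minimal sketch; the test point is arbitrary):

```python
# Weighted-sum scalarization from the text: f3 blends the two objectives
# through the weight u.
def f31(z1, z2):
    return (z1 - 1)**2 + z2**2

def f32(z1, z2):
    return z1**2 + (z2 - 1)**2

def f3(z1, z2, u):
    return (1 - u) / 2 * f31(z1, z2) + (1 + u) / 2 * f32(z1, z2)

z = (0.3, 0.8)                                # arbitrary test point
print(f3(*z, -1) == f31(*z))                  # u = -1: f3 reduces to f31 -> True
print(f3(*z, +1) == f32(*z))                  # u = +1: f3 reduces to f32 -> True
print(f3(*z, 0) == (f31(*z) + f32(*z)) / 2)   # u = 0: the average -> True
```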
In the software, introduce a constant function u=0 (a function, not a parameter, to
avoid confusion while optimizing), then build the function F3 above.
With an optimizer SQP, determine the minimum of F3 and note the values z1, z2, f3,
f31 and f32. Do this successively for u= -1, -0.5, 0, +0.5, 1.
Place (freehand figure) all solutions z1, z2 in the space of variables. Observation?
Was it predictable (analytically determine z1(u), z2(u) with 4 lines of calculation)?
Place (freehand figure) all solutions f31 and f32 in the space of objectives.
Solving with the help of a multiobjective optimizer:
Now build a multiobjective problem named BiSphere. Take a multi-goal problem
with two objectives (without setting a target) and unconstrained.
Select a multiobjective optimizer GMGA (name it GMGA).
Start the optimization of problem BiSphere with GMGA.
Explain the concepts of dominated solution, non-dominated solution and Pareto
frontier. Between a dominated solution and a non-dominated solution, which is the
more interesting? Deduce the operating principle of GMGA.
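The dominance test itself is short; a sketch for two objectives to be minimized (the candidate values are illustrative, not BiSphere results):

```python
# Pareto-dominance sketch: a dominates b if a is no worse on both objectives
# and strictly better on at least one.
def dominates(a, b):
    return (a[0] <= b[0] and a[1] <= b[1]) and (a[0] < b[0] or a[1] < b[1])

def pareto_front(points):
    # Non-dominated points: those no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Illustrative (f31, f32) values for five candidate designs.
points = [(0.0, 1.0), (1.0, 0.0), (0.25, 0.25), (0.6, 0.6), (0.5, 0.1)]
print(pareto_front(points))   # (0.6, 0.6) is dominated by (0.25, 0.25)
```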
After the optimization, plot the Pareto front (F32, F31) using the Analysis tool
"Pareto frontier".
Compare the two methods of resolution, especially when the number of objectives is
greater than 2.
Save the project and its logbook and exit the software.


7. Robustness of solutions to the problem of Belledonne


A product is said to be robust if its response is not altered by disturbances. An
optimized product that only works under particularly tight requirements will not be
robust.
It is possible to define four design approaches that give solutions with different
characteristics (dispersion and performance). These four approaches are:
- Nominal solution: We seek a solution that meets the specifications (this
process can be implemented by trial-and-error modification of a small number
of design parameters). We do not seek to maximize the performance. This step
can optionally be followed by a sensitivity analysis.
- Sensitivity analysis: We study the impact of disturbances on performance for a
given solution.
- Optimized solution (optimizing performance): We look, among several
solutions, for one that maximizes the performance criteria: we alternate a design
phase (varying design parameters) and an evaluation phase (assessing
performance criteria).
- Robust solution (maximizing robustness and performance): We look, among
several solutions, for one that is optimized for both performance and
robustness. For this, it is necessary to alternate phases of design and evaluation
of performance and robustness. We then have to make a compromise between
robustness and performance.
In this section, we first discover the tools at our disposal to investigate the sensitivity
and robustness of a solution (Variation plotting, Stochastic evaluator, Interval
evaluator, Worst case deviation, Probabilistic deviation) then we will use them to
determine a robust solution for the problem Belledonne.
a) Discovering the tools
It is important to remember that there is a rounding grid associated with each
parameter. The only values allowed for the parameters during optimization are taken
on the grid. In an industrial context, this grid could be an image of the values
achievable in production for this parameter. In this context, the finer the grid, the
more difficult and expensive the realization will be.
Reload the project Belledonne.gotit and impose 1/1000 as the grid size for all
parameters. This grid spacing, coarser than in the initial study, will better highlight
the concepts of robustness of an operating point.
Specify also that all the parameters are uncertain (toggle the corresponding
options).


Optimize once and for all with the niching algorithm, which provides four solutions
(indices 0, 1, 2 and 3).
Variation plotting:
- Launch the tool Variation plotting on the function F2, focusing the analysis on
the solution with index 0 (see trick below) and taking a large number of points
(for instance 101). What kind of information does this graph provide? What
information does it not provide?
Stochastic evaluator:
- Launch a Stochastic evaluator on the function F2, focusing the analysis on the
solution with index 0 (see trick below) and taking a large number of trials.
Note the results. Provide a brief explanation of the stochastic evaluation
mechanism (the keyword for an internet search is "Monte Carlo method").
What is the purpose of this tool in a study of robustness?
Note: in this analysis, the steps of the rounding grid are used to set the standard deviations of the parameters around
the analyzed solution (standard deviation = step).
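The mechanism can be sketched in a few lines (a minimal illustration, not GOT-It's evaluator; the center point is the solution of index 0 reported in the Findings below, and the Belledonne expression with denominator Sqrt(y2) + 0.3 is the one whose minima are listed in section 2):

```python
import math, random

# Monte Carlo sketch: draw Gaussian perturbations of the parameters
# (standard deviation = grid step, as in the note above) and observe the
# spread of F2 around the analyzed solution.
def f2(y1, y2):
    return ((1 - y1**2) * math.cos(5 * y1)
            + (12 - y2**2) * math.cos(4 / (math.sqrt(y2) + 0.3)))

rng = random.Random(0)
y1c, y2c, sigma = 1.368, 0.016, 1e-3     # solution of index 0, grid 1/1000
samples = [f2(y1c + rng.gauss(0, sigma), y2c + rng.gauss(0, sigma))
           for _ in range(5000)]
mean = sum(samples) / len(samples)
std = math.sqrt(sum((s - mean)**2 for s in samples) / len(samples))
print(round(mean, 3), round(std, 4))     # central value and its dispersion
```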

Interval evaluator:
- Launch an Evaluator on function F2, focusing the analysis on the solution with
index 0 (see trick below) and requesting interval analysis (flag to configure).
Note the results. Explain (in ten lines) the principle of interval arithmetic and
its limits. What is the interest of this tool in a study of robustness?
Findings: a) The optimization gives F2(1.368, 0.016) = -12.727; b) The
minimum (see page 2) is -12.739; c) The interval analysis in the rectangle
[1.362 : 1.374] x [0.010 : 0.022] gives [-12.767 : -10.780] for F2.
Questions: a) Why does the optimization not give -12.739? b) What
interpretation should be given to an interval as wide as [-12.767 : -10.780]? c) In a real
problem, the minimum is not known in advance. What might the
"-12.767" of the interval analysis suggest?
Note: in this analysis, the steps of the rounding grid are used to set the deviation sizes of the parameters around the
analyzed solution (interval size = 3 * step).
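The principle and its main limit can be sketched with three interval operations (a simplified illustration, not GOT-It's implementation):

```python
# Interval-arithmetic sketch: each operation returns an interval guaranteed
# to contain every possible result. The classic limit is the "dependency
# problem": x - x, computed as two independent intervals, no longer gives {0}.
def add(a, b): return (a[0] + b[0], a[1] + b[1])
def sub(a, b): return (a[0] - b[1], a[1] - b[0])
def mul(a, b):
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

x = (1.362, 1.374)        # the y1 side of the rectangle in the Findings above
print(sub(x, x))          # overestimated: the true range of x - x is {0}
print(mul(x, x))          # encloses x**2 for every x in the interval
```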

Function "Worst case deviation":
- Create (in Function) a Worst case deviation -> Max deviation named D2
dedicated to the perturbation of F2. Evaluate D2 (with an Evaluator) for the
solution of index 0. Note the results. Explain the principle of this rather
pragmatic estimation; in particular, clarify the hypothesis made on F2. What are
the advantages and disadvantages of this tool (especially in relation to the
interval analysis)?


Note: in this function, the steps of the rounding grid are used to set the maximal deviations of the parameters
around the analyzed solution (variation magnitude = 3 * step) (internet search: "68-95-99.7 rule" or "three-sigma rule").
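One plausible reading of this pragmatic estimation can be sketched as follows (an assumption about the mechanism, not GOT-It's exact procedure): probe F2 on a small grid center ± 3·step and keep the largest deviation from the central value, which implicitly assumes the worst case is reached on those probe points.

```python
import itertools, math

def f2(y1, y2):
    return ((1 - y1**2) * math.cos(5 * y1)
            + (12 - y2**2) * math.cos(4 / (math.sqrt(y2) + 0.3)))

center, step = (1.368, 0.016), 1e-3      # solution of index 0, grid 1/1000
f0 = f2(*center)
# Probe all combinations of -3*step, 0, +3*step on the two parameters.
deviation = max(abs(f2(center[0] + d1, center[1] + d2) - f0)
                for d1, d2 in itertools.product((-3 * step, 0.0, 3 * step),
                                                repeat=2))
print(round(f0, 3), round(deviation, 3))
```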

Function "Probabilistic deviation":
- Create (in Function) a Probabilistic deviation -> Standard deviation named
S2 dedicated to the standard deviation of F2. Evaluate S2 (with an Evaluator)
for the solution of index 0. Note the results. The estimate of the standard
deviation thus obtained is relatively inexpensive (2*N+1 = 5 deterministic
evaluations of function F2 around the solution). Compare with the Monte
Carlo method.
Note: in this function, the steps of the rounding grid are used to set the standard deviations of the parameters
around the analyzed solution (standard deviation = step).
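A 2*N+1-evaluation estimate of this kind can be sketched as follows (a first-order propagation under a linearization assumption, not necessarily GOT-It's exact formula): central differences give the local slope for each of the N = 2 parameters, and the slopes are combined with the parameter standard deviations.

```python
import math

def f2(y1, y2):
    return ((1 - y1**2) * math.cos(5 * y1)
            + (12 - y2**2) * math.cos(4 / (math.sqrt(y2) + 0.3)))

center = [1.368, 0.016]                  # solution of index 0
sigma = [1e-3, 1e-3]                     # standard deviation = grid step
f0 = f2(*center)                         # evaluation 1
slopes = []
for i in range(2):                       # evaluations 2..5 (2 per parameter)
    hi, lo = list(center), list(center)
    hi[i] += sigma[i]
    lo[i] -= sigma[i]
    slopes.append((f2(*hi) - f2(*lo)) / 2)   # change of F2 per 1-sigma move
s2 = math.sqrt(sum(s * s for s in slopes))   # linearized standard deviation
print(round(s2, 4))
```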
Trick: To center an analysis on a solution with the tools of this section, it is not necessary to redefine the parameter
values each time. Just optimize one time with the niching algorithm that provides four solutions and, in any of the
analysis tools, complete the box "Optimization giving the reference point" and the box "Index of the solution in the
optimization" (here the index can be 0, 1, 2 or 3 as there are 4 solutions) (the solutions are ordered according to the
values of the objective, the index of global minimum is 0), then validate.

b) Determining a robust solution to the problem of Belledonne:


Experiment with three approaches to find a robust optimum.
i) Approach "robustness analysis a posteriori" manual:
In this approach, the optima are determined as usual and then compared manually.
Therefore compare the four optima of the problem obtained through the multimodal
optimization (niching) using at least two of the previous analysis tools (Variation
plotting, Stochastic evaluator, Interval evaluator, Worst case deviation,
Probabilistic deviation). What is the most robust solution?
ii) Approach "robustness analysis a posteriori" aided:
In this approach, the optima are determined as usual and then compared by means of
Post functions. To do that, create a new optimization dedicated to the problem
Belledonne, with a niching algorithm, equipped with F2 and D2 or S2 as post
functions, limiting the number of solutions to the useful quantity (Max solutions =
4), without limitation on the post processing (Max post solutions = -1). Run this new
optimization and analyze the results. What is the most robust solution?
iii) Approach "direct robust optimization":
In this approach, a robust optimum is found directly. Imagine and implement a
direct formulation of the problem that incorporates both the minimization of F2 and
its robustness.
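One possible formulation (an assumption, among several acceptable ones) minimizes the composite objective F2 + k·S2, with S2 the linearized standard deviation sketched above; a crude random search stands in for the real optimizer:

```python
import math, random

def f2(y1, y2):
    return ((1 - y1**2) * math.cos(5 * y1)
            + (12 - y2**2) * math.cos(4 / (math.sqrt(y2) + 0.3)))

def s2(y1, y2, sigma=1e-3):              # linearized std, as in section 7a
    return math.sqrt(((f2(y1 + sigma, y2) - f2(y1 - sigma, y2)) / 2) ** 2
                     + ((f2(y1, y2 + sigma) - f2(y1, y2 - sigma)) / 2) ** 2)

def robust_objective(y1, y2, k=3.0):     # k weights robustness vs performance
    return f2(y1, y2) + k * s2(y1, y2)

# Crude random search over the domain (kept away from y2 = 0 so that the
# finite differences stay in the square root's domain).
rng = random.Random(0)
best = min((robust_objective(y1, y2), y1, y2)
           for y1, y2 in ((rng.uniform(0.01, 2.0), rng.uniform(0.01, 2.0))
                          for _ in range(2000)))
print(tuple(round(v, 3) for v in best))  # (composite value, y1, y2)
```

The weight k is arbitrary; varying it shifts the compromise between raw performance and robustness, exactly the trade-off discussed at the start of section 7.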



Save the project and its logbook and exit the software.
8. Conclusion
Make a synthesis of the strengths, weaknesses and areas of use of the algorithms you
have experimented with.
