Methods
Alexandr Štefek
University of Defence, Faculty of Military Technology, Brno, Czech Republic, e-mail: alexandr.stefek@unob.cz
Abstract Many heuristic optimization methods have appeared only recently, for example the Particle Swarm Optimization method (PSO), the Repulsive Particle Swarm Optimization method (RPSO), the Gravitational Search Algorithm (GSA), Central Force Optimization (CFO), the Harmony Search algorithm (HS), etc. These methods work differently, but all of them can optimize the same problems. This raises a general question: does any standard benchmark exist that can be used to compare individual methods? The question is hard to answer: some optimization problems are widely used across papers, but in fact no summary exists that could serve as a standard evaluation of the optimization process.
This paper describes a set of benchmarks that can be used to evaluate the optimization process. Some results of optimization and of the utilization of optimization methods are included.
Keywords Heuristic, Optimization, Benchmarks.
I. INTRODUCTION
The optimization problem can be defined as:
$x \in \mathbb{R}^n,\quad y = f(x),\quad y \in \mathbb{R}.$ (1)
Basically, the problems can be divided into unimodal and multimodal. Another aspect is constraints: some problems have no constraints at all. The ideal optimization method would find the optimal value in all cases; unfortunately, in the current state of the art, this is impossible. Still, it is useful to be able to compare different optimization methods.
$g(x) \le 0,$ (2)
$b(x) = 0.$

If the problem has a defined constraint $b(x) = 0$, it can be extended to

$b(x) + z \ge 0,$
$z \ge 0,$ (3)
$y = f(x) + cz,$

where $c$ is a coefficient that forces the solver to find the optimum at $z = 0$.
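The transformation above can be sketched as a small penalty wrapper. This is a Python illustration under stated assumptions: the name `penalize`, the default value of `c`, and the rejection of infeasible points with infinity are all illustrative choices, not part of the paper.

```python
# Sketch of the constraint extension: the constraint b(x) = 0 is relaxed
# through a slack variable z >= 0, and the penalty term c*z drives the
# solver toward z = 0.  Names and the infeasibility handling are assumed.

def penalize(f, b, c=1000.0):
    """Return an unconstrained objective over (x, z) for f subject to b(x) = 0."""
    def y(x, z):
        if z < 0 or b(x) + z < 0:   # outside the relaxed feasible region
            return float("inf")
        return f(x) + c * z         # penalty forces the optimum toward z = 0
    return y

# Toy example: minimize f(x) = x^2 subject to b(x) = x - 1 = 0.
y = penalize(lambda x: x * x, lambda x: x - 1.0)
```

At the constrained optimum x = 1, z = 0 the penalty vanishes and y reduces to f.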
There exist many algorithms for problem solving. Some of them are specialized, some are more general. Many problems cannot be solved by a deterministic algorithm, so heuristic algorithms are used. Among the heuristic algorithms are PSO [1], GSA [2], CFO [3], and HS [4]. New methods are compared to previously known solutions, but no standard is defined for testing solvers (algorithms).
At present it is often impossible to reconstruct a presented solution. Very often already computed results cannot be reused, because no standard for measuring exists.
Such standard measuring benchmarks can be defined via functions. To measure different algorithms/solvers on different problems, standard problems and interfaces to those problems must be defined.
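As an illustration of what such a common interface could look like, here is a minimal Python sketch. All class, attribute, and function names are assumptions chosen for this example; the paper does not define them.

```python
# Sketch of a uniform benchmark interface: every problem exposes the same
# evaluate/bounds/optimum triple, so different solvers can be measured the
# same way on every problem.  All names are illustrative assumptions.

class Benchmark:
    def __init__(self, name, evaluate, lower, upper, f_opt):
        self.name = name                      # human-readable problem name
        self.evaluate = evaluate              # fitness function R^n -> R
        self.lower, self.upper = lower, upper # search interval per coordinate
        self.f_opt = f_opt                    # commonly declared optimal fitness

SPHERE = Benchmark("Sphere", lambda x: sum(v * v for v in x), -100.0, 100.0, 0.0)

def error(problem, x):
    """Score a candidate solution the same way on any Benchmark instance."""
    return problem.evaluate(x) - problem.f_opt
```

With such an interface, a new solver only has to be run once per `Benchmark` instance to produce comparable error values.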
II. UNIMODAL PROBLEMS
Unimodal functions have only one local optimum. These functions are relatively easy to analyze for optima. They are used to check the speed of optimization and convergence. Two functions are commonly used: the first is the Sphere function and the second is the Sum and Product function.
A. F1 Sphere
The Sphere function is in fact not a sphere, but its projection to the $\mathbb{R}^2$ subspace creates circles around the optimal solution. The function is generalized to $\mathbb{R}^n$:

$f(x) = \sum_{i=1}^{n} x_i^2.$ (4)

The optimum is $x^* = 0$, $f(x^*) = 0$.
[X,Y] = meshgrid(-100:5:100,-100:5:100);
Z = X.^2 + Y.^2;
Code 1. Sphere function in 2D
Code 1 presents the function for use in Matlab. The $\mathbb{R}^2$ version of the function is shown in the figures below.
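The Matlab snippet in Code 1 evaluates only the 2-D case on a grid. The general $\mathbb{R}^n$ form of (4) can be sketched in Python as follows (the function name is an illustrative choice):

```python
def sphere(x):
    """Sphere function f(x) = sum(x_i^2) per Eq. (4); global optimum f(0) = 0."""
    return sum(v * v for v in x)
```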
Fig. 1. Sphere function in 2D
Fig. 2. Sphere function in 3D
B. F2 Sum and Product
The Sum and Product function is simple to evaluate. It plays the same role as the Sphere function but has a different shape.

$f(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|.$ (5)

The optimum is $x^* = 0$, $f(x^*) = 0$.
[X,Y] = meshgrid(-10:0.5:10,-10:0.5:10);
Z = abs(X).*abs(Y) + abs(X) + abs(Y);
Code 2. Sum and Product function in 2D
Fig. 3. Sum and Product function in 2D
Fig. 4. Sum and Product function in 3D
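The 2-D Matlab code in Code 2 generalizes directly to $\mathbb{R}^n$; a Python sketch of (5), with an illustrative function name:

```python
import math

def sum_and_product(x):
    """f(x) = sum(|x_i|) + prod(|x_i|) per Eq. (5); same optimum as Sphere, f(0) = 0."""
    return sum(abs(v) for v in x) + math.prod(abs(v) for v in x)
```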
III. MULTIMODAL PROBLEMS
Multimodal functions have multiple local optima, so some methods will get stuck in a local optimum. The main goal of these problems is to test how well solvers are able to avoid local optima. Some problems have no single global optimum; some have one global optimum and many local optima that are very close in terms of the fitness function.
A. Schwefel's function
Schwefel's function is a very frequently used problem for benchmarking solvers, and one of the most difficult. It differs somewhat from the others because, from a certain point of view, it is constrained. The optimum presented below is not a global optimum; the function has no global optimum.

$f(x) = -\sum_{i=1}^{n} x_i \sin\sqrt{|x_i|}.$ (6)

The commonly declared optimum is $x_i = 420.9687$, $f(x^*) = -418.9829\,n$, on the interval $x_i \in (-500; 500)$.
[X,Y] = meshgrid(-500:10:500,-500:10:500);
Z = -X.*sin(sqrt(abs(X))) - Y.*sin(sqrt(abs(Y)));
Code 3. Schwefel's function
Fig. 5. Schwefel's function in 2D
Fig. 6. Schwefel's function in 3D
Schwefel's function is very similar to constrained problems. This is demonstrated in the next figure, where only one dimension is used. The function has only local optima; a global optimum does not exist. Usually, however, this is not taken into account, and it is common to test for local optima on the set $(-500; 500)^n$.
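The commonly declared local optimum can be checked numerically. A Python sketch of (6), using the value 420.9687 quoted above (function name is illustrative):

```python
import math

def schwefel(x):
    """f(x) = -sum(x_i * sin(sqrt(|x_i|))) per Eq. (6)."""
    return -sum(v * math.sin(math.sqrt(abs(v))) for v in x)

# At x_i = 420.9687 each term contributes about -418.9829, so for n
# dimensions the commonly declared optimum is about -418.9829 * n.
n = 10
f_at_declared_optimum = schwefel([420.9687] * n)
```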
B. Rastrigin's function
Rastrigin's function is a hills-and-valleys function. To leave a local optimum, the mass points have to put in a big effort in terms of mechanical energy.

$f(x) = \sum_{i=1}^{n} \left( x_i^2 - 10\cos(2\pi x_i) + 10 \right).$ (7)

The optimum is $x^* = 0$, $f(x^*) = 0$, on the interval $x \in (-5.12; 5.12)^n$.
[X,Y] = meshgrid(-5.12:0.1:5.12,-5.12:0.1:5.12);
Z = 20 + X.*X + Y.*Y - 10*cos(2*pi*X) - 10*cos(2*pi*Y);
Code 4. Rastrigin's function
Fig. 7. Rastrigin's function in 2D
Fig. 8. Rastrigin's function in 3D
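A Python sketch of the general $\mathbb{R}^n$ form of (7); the comment about integer coordinates follows from the cosine term (function name is illustrative):

```python
import math

def rastrigin(x):
    """f(x) = sum(x_i^2 - 10*cos(2*pi*x_i) + 10) per Eq. (7); f(0) = 0."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

# The cosine term creates a local optimum near every integer coordinate,
# which is what makes escaping a valley costly for a solver: f(1) is a
# local minimum with value ~1, while the ridge at x = 0.5 reaches 20.25.
```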
C. Griewank's function
Griewank's function has many similar local optima, but only one global optimum.

$f(x) = \sum_{i=1}^{n} \frac{x_i^2}{4000} - \prod_{i=1}^{n} \cos\frac{x_i}{\sqrt{i}} + 1.$ (8)

The optimum is $x^* = 0$, $f(x^*) = 0$. The expected interval for solving is $x \in (-600; 600)^n$.
[X,Y] = meshgrid(-600:10:600,-600:10:600);
Z = (X.*X + Y.*Y)/4000 - cos(X).*cos(Y/sqrt(2)) + 1;
Code 5. Griewank's function
Fig. 9. Griewank's function in 2D
Fig. 10. Griewank's function in 3D
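A Python sketch of the general $\mathbb{R}^n$ form of (8), including the $1/\sqrt{i}$ coupling that appears as `cos(Y/sqrt(2))` in Code 5 (function name is illustrative):

```python
import math

def griewank(x):
    """f(x) = sum(x_i^2/4000) - prod(cos(x_i/sqrt(i))) + 1 per Eq. (8); f(0) = 0."""
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i)) for i, v in enumerate(x, start=1))
    return s - p + 1.0
```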
IV. PROBLEM TRANSFORMATIONS
All functions except Schwefel's are centered: the global optimum lies at $x^* = 0$. If the particles simply move toward the centre, they will reach the global optimum. This problem can be avoided by shifting:

$g(x) = f(x + x_s).$ (9)
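The shift in (9) can be sketched as a wrapper; the shift vector below is an arbitrary value chosen for illustration:

```python
def shifted(f, xs):
    """Return g(x) = f(x + xs) per Eq. (9); the optimum moves to x = -xs."""
    return lambda x: f([xi + si for xi, si in zip(x, xs)])

sphere = lambda x: sum(v * v for v in x)
g = shifted(sphere, [3.0, -4.0])   # optimum of g is now at x = (-3, 4)
```

A solver biased toward the centre of the search interval will no longer land on the optimum of `g` for free.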
For common solvers (PSO, GSA, CFO) the shifting yields a harder problem. This is the basic reason why Schwefel's function is too hard for some solvers. Experiments where Schwefel's function was shifted so that the expected optimum was at x