Technical Report No. TR-2013-004, Dept. of Electrical & Electronic Engineering, The University of Hong Kong, Oct 2013.

A Social Spider Algorithm for Global Optimization


James J.Q. Yu, Student Member, IEEE and Victor O.K. Li, Fellow, IEEE
Department of Electrical and Electronic Engineering
The University of Hong Kong
Email: {jqyu, vli}@eee.hku.hk

Abstract—The growing complexity of real-world problems has motivated computer scientists to search for efficient problem-solving methods. Metaheuristics based on evolutionary computation and swarm intelligence are outstanding examples of nature-inspired solution techniques. Inspired by the social spiders, we propose a novel Social Spider Algorithm (SSA) to solve global optimization problems. The framework is mainly based on the foraging strategy of social spiders, which utilize the vibrations spread over the spider web to determine the positions of prey. When tested against benchmark functions, SSA has superior performance compared with other metaheuristics, including evolutionary algorithms and swarm intelligence algorithms.

Index Terms—Social spider algorithm, global optimization, swarm intelligence, evolutionary computation, meta-heuristic.

I. INTRODUCTION

WITH the fast growing size and complexity of modern optimization problems, evolutionary computing is attractive as an efficient tool for optimization. Among all the methods devised by the evolutionary computation community, the evolutionary algorithms (EAs), which mainly draw inspiration from nature, have been shown to be very successful methods for optimization. Currently several types of EAs are widely employed to solve real-world combinatorial or global optimization problems, including the Genetic Algorithm (GA), Genetic Programming (GP), Evolution Strategy (ES), and Differential Evolution (DE). These algorithms demonstrate satisfactory performance compared with conventional optimization techniques, especially when applied to solve non-convex optimization problems [1].

In the past decade, swarm intelligence, a new kind of evolutionary computing technique, has attracted much research interest [2]. Swarm intelligence is mainly concerned with the methodology of modelling the behaviour of social animals and insects for problem solving. Researchers have devised optimization algorithms that mimic the behaviour of ants, bees, bacteria, fireflies, and other organisms. The impetus for creating such algorithms was provided by the growing need to solve optimization problems that were very difficult or even considered intractable.

Among the commonly seen animals, spiders have been a major research subject in bionic engineering for many years. However, most research related to spiders has focused on imitating their walking pattern to design robots, e.g. [3]. To the best of our knowledge, no spider-inspired algorithms have been proposed for solving optimization problems. A possible reason for this is that a majority of the spiders observed are solitary [4], which means that they spend most of their lives without interacting with others of their species. However, among the 35 000 spider species observed and described by scientists, some species are social. These spiders live in groups, e.g. Mallos gregalis and Oecobius civitas. Based on these social spiders, this paper formulates a new global optimization method to solve optimization problems.

Spiders are air-breathing arthropods. They have eight legs and chelicerae with fangs. Spiders are found worldwide and are one of the most diverse groups among all organisms. They use a wide range of different strategies for foraging, and most of them detect prey by sensing vibrations. Spiders have long been known to be very sensitive to vibratory stimulation, as vibrations on their webs notify them of the capture of prey. If the vibrations are in a defined range of frequency, spiders attack the vibration source. The social spiders can also distinguish vibrations generated by prey from those generated by other spiders [5]. The social spiders passively receive the vibrations generated by other spiders on the same web to maintain a clear view of the web. This is one of the unique characteristics that distinguishes the social spiders from other organisms, as the latter usually exchange information actively, which reduces the information loss to some degree but increases the energy used for communication [6].

In this paper, inspired by the social behaviour of the social spiders, especially their foraging behaviour, we propose a new metaheuristic for global optimization: the Social Spider Algorithm (SSA). The foraging behaviour of the social spider can be described as the cooperative movement of the spiders towards the food source position. The spiders receive and analyse the vibrations propagated on the web to determine the potential direction of a food source [7]. In this process, the spiders cooperate with each other to move towards the prey. We utilize this natural behaviour to perform optimization over the search space in SSA.

The group living phenomenon has been studied intensively in animal behaviour ecology. One of the reasons that animals gather and live together is to increase the probability of successful foraging and to reduce the energy cost of this process [8]. To facilitate the analysis of social foraging behaviour, researchers have proposed two foraging models: the information sharing (IS) model [9] and the producer-scrounger (PS) model [10]. The individuals under the IS model perform individual searching while simultaneously seeking opportunities to join other individuals. In the PS model, the individuals are divided into leaders and followers. Since there is no leader in social spiders [11], the IS model seems more suitable,
and we use this model to control the searching pattern of SSA.

The contribution of this paper is threefold:
1) We propose a brand-new nature-inspired swarm intelligence algorithm based on social spiders. This population-based general-purpose metaheuristic demonstrates outstanding performance in the global optimization benchmark tests.
2) Our proposed algorithm introduces a new social animal foraging model to solve optimization problems. We also incorporate information loss schemes in the algorithm.
3) We perform a series of experiments to investigate the impact of different parameters and searching schemes on the performance of the algorithm. The results of these experiments may serve as important inputs for further research.

The rest of this paper is organized as follows. We first present some related work on swarm intelligence and bio-inspired metaheuristics in Section II. Then we formulate and elaborate on SSA by idealizing and imitating the foraging behaviour of social spiders in Section III. Section IV introduces the benchmark functions we use for testing the performance of SSA, together with the experimental settings. Section V presents the simulation results of SSA on the benchmark functions and the comparison with other popular metaheuristics. We then discuss the performance of SSA and the differences between SSA and other metaheuristics in Section VI. Finally, we conclude this paper in Section VII and propose some future work.

II. BACKGROUND

Swarm intelligence algorithms mimic methods in nature to drive a search for the optimal solution. At the very beginning there were two major methods of this kind: ant colony optimization (ACO) [12] and particle swarm optimization (PSO) [13].

ACO is inspired by the group foraging behaviour of ants, whose goal is to find a shortest path from their colony to food sources. In this metaheuristic, feasible solutions of the optimization problem to be solved are represented by the paths between the colony and the food sources. The ants communicate with and influence others using pheromone, a volatile chemical substance. When an ant finds a food source, it deposits a certain amount of pheromone along the path, and the amount is positively correlated with the quality of the food source. The pheromone laid down biases the path selection of other ants, providing positive feedback. Using this scheme of positive feedback, the algorithm leads the ants to find the shortest path to the best food source [12].

PSO is motivated by the movement of organisms as a group, as in a flock of birds or a school of fish. The group is represented by a swarm of particles, and PSO uses their positions in the search space to represent the feasible solutions of the optimization problem. PSO manipulates the movement of these particles to perform optimization, utilizing the information of individual experience and socio-cognitive tendency. These two kinds of information correspond to cognitive learning and social learning, respectively, and lead the population to find a best way to perform optimization [13].

The above two metaheuristics have been applied to solve a vast range of different problems, e.g. [14][15]. Besides the research on these algorithms, swarm intelligence algorithm design has attracted many researchers and several new algorithms have been devised. The most widely studied organism in swarm intelligence is the bee [2]. Abbass proposed Marriage in honey Bees Optimization (MBO) in [16], and this algorithm was applied to solve propositional satisfiability problems (3-SAT problems). In MBO, the mating flight of the queen bee is represented as the transitions in a state space (the search space), with the queen probabilistically mating with the drone encountered at each state. The probability of mating is determined by the speed and energy of the queen, and the fitness of the drone. Karaboga and Basturk proposed Artificial Bee Colony optimization (ABC) in [17]. ABC classifies the bees in a hive into three types: scout bees that randomly fly without guidance, employed bees that search the neighbourhood of their positions, and onlooker bees that use the population fitness to select a guiding solution for exploitation. The algorithm balances exploration and exploitation by using the employed and onlooker bees for local search, and the scout bees for global search.

Besides the bees, other organisms have also been widely studied [2]. Krishnanand and Ghose proposed Glow-worm Swarm Optimization (GSO) [18] based on the behaviour of fireflies. In GSO, each firefly randomly selects a neighbour according to its luminescence and moves toward it. In general the fireflies are more likely to be attracted by others that glow brighter. As the movement is only conducted locally using selective neighbour information, the firefly swarm is able to divide into disjoint subgroups to explore multiple optima. Another firefly-based technique was proposed by Yang et al. [19], who reformulated the co-movement pattern of fireflies and employed it in optimization. Passino devised Bacterial Foraging Optimization (BFO) [20] based on bacterial chemotaxis. In BFO, possible solutions to the optimization problem are represented by a colony of bacteria. The algorithm consists of three schemes, i.e., chemotaxis, reproduction, and elimination-dispersal. The exploitation task is performed using the first two schemes, and the last one contributes to exploration.

III. SOCIAL SPIDER ALGORITHM

In SSA, we formulate the search space of the optimization problem as a hyper-dimensional spider web. Each position on the web represents a feasible solution to the optimization problem, and all feasible solutions to the problem have corresponding positions on this web. The web also serves as the transmission medium of the vibrations generated by the spiders. Each spider on the web holds a position, and the quality (or fitness) of the corresponding solution is based on the objective function and represented by the potential of finding a food source at the position. The spiders can move freely on the web. However, they cannot leave the web, as the positions off the web represent infeasible solutions to the optimization problem.
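To make the web-and-spider formulation concrete, the following Python sketch shows one possible in-memory representation of a population on the web. It is an illustration only, not the authors' implementation: the names `Spider` and `create_population` are ours, and the bounds in the example are arbitrary. The stored fields follow the spider memory described in Section III-A, with the target vibration initially set to the spider's own position with zero intensity, as in the initialization phase of Section III-D.

```python
import random

class Spider:
    """One artificial spider on the web (hypothetical sketch).

    Each spider remembers its position, the fitness of that position,
    and the target vibration it followed in the previous iteration.
    """

    def __init__(self, lower, upper):
        # A random feasible position on the web, i.e. inside the search space.
        self.position = [random.uniform(l, u) for l, u in zip(lower, upper)]
        self.fitness = None                 # evaluated from the objective function
        # The target vibration initially points at the spider's own position
        # with zero intensity (initialization phase, Section III-D).
        self.target_position = list(self.position)
        self.target_intensity = 0.0

def create_population(pop_size, lower, upper):
    # The web itself needs no explicit data structure: it is the shared
    # search space on which all spiders sit and all vibrations travel.
    return [Spider(lower, upper) for _ in range(pop_size)]

# Example: 50 spiders on a 30-dimensional web bounded by [-100, 100].
pop = create_population(50, [-100.0] * 30, [100.0] * 30)
```

Because a position off the web would be an infeasible solution, every position is drawn inside the box bounds and, as later sections require, is kept inside them throughout the search.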
When a spider moves to a new position, it generates a vibration which is propagated over the web. Each vibration holds the information of one spider, and other spiders can obtain this information upon receiving the vibration.

A. Spider

The spiders are the agents of SSA that perform optimization. At the beginning of the algorithm, a pre-defined number of spiders are put on the web. Each spider s holds a memory storing the following individual information:
1) The position of s on the web.
2) The fitness of the current position of s.
3) The target vibration of s in the previous iteration.
The first two types of information describe the individual situation of s, while the third type is involved in directing s to new positions. The detailed scheme of movement is elaborated in Section III-D.

Based on observations, spiders are found to have a very accurate sense of vibration. Furthermore, they can separate different vibrations propagated on the same web and sense their respective intensities [11]. In SSA, a spider generates a vibration when it reaches a new position different from the previous one. The intensity of the vibration is correlated with the fitness of the position. The vibration propagates over the web and other spiders can sense it. In such a way, the spiders on the same web share their personal information with others to form a collective social knowledge.

B. Vibration

Vibration is a very important concept in SSA. It is one of the main characteristics that distinguish SSA from other metaheuristics. In SSA, we use two properties to define a vibration, namely the source position and the source intensity of the vibration. The source position is defined by the search space of the optimization problem, and we define the intensity of a vibration in the range [0, +∞). Whenever a spider moves to a new position, it generates a vibration at its current position. We define the position of spider a at time t as Pa(t), or simply as Pa if the argument is t. We further use I(Pa, Pb, t) to represent the vibration intensity sensed by a spider at position Pb at time t, where the source of the vibration is at position Pa. Thus I(Ps, Ps, t) defines the intensity of the vibration generated by spider s at the source position. This vibration intensity at the source position is correlated with the fitness of the position f(Ps), and we define the intensity value as follows:

    I(Ps, Ps, t) = 1 / (Cmax − f(Ps))  for maximization, or
    I(Ps, Ps, t) = 1 / (f(Ps) − Cmin)  for minimization,    (1)

where Cmax is a confidently large constant selected such that all possible fitness values of the maximization problem are smaller than Cmax, and Cmin is a confidently small constant such that all possible fitness values of the minimization problem are larger than Cmin. Equation (1) ensures that the possible vibration intensities of any optimization problem are all positive values. It further guarantees that a better fitness value, i.e. larger for a maximization problem or smaller for a minimization problem, corresponds to a larger vibration intensity.

C. Intensity Attenuation

As a form of energy, vibration attenuates over time and distance. This physical phenomenon is accounted for in the design of SSA by two equations.

1) Attenuation over Distance: We define the vibration attenuation over distance as follows. We define the distance between spiders a and b as D(Pa, Pb), and the maximum distance between two points in the search space as Dmax. The definition of Dmax can be problem-dependent, and we use the following equation for simplicity:

    Dmax = ||xU − xL||p,    (2)

where xU is the upper bound of the search space and xL is the lower bound of the search space. p indicates that we use the p-norm as the method to calculate the distance between spiders, i.e.,

    D(Pa, Pb) = ||Pa − Pb||p.    (3)

In this paper we use the 1-norm, or Manhattan norm, in distance calculation. If the search space is not constrained, xU and xL in Eqn. (2) stand for the upper and lower bounds of the initial solution generation space, respectively. With the above definitions, we define the vibration attenuation over distance as follows:

    I(Pa, Pb, t) = I(Pa, Pa, t) × exp(−D(Pa, Pb) / (Dmax × ra)).    (4)

In the above formula we introduce a user-controlled parameter ra ∈ (0, 1). This parameter controls the attenuation rate of the vibration intensity over distance: the larger ra is, the weaker the attenuation imposed on the vibration.

2) Attenuation over Time: We also introduce an equation to model vibration attenuation over time. As a vibration biases other spiders to move, a non-decaying vibration may attract other spiders continuously, causing the algorithm to converge prematurely. So the influence of previous vibrations shall be properly attenuated to prevent premature convergence. The vibration attenuation over time is defined as follows:

    I(Pa(t), Pa(t), t + 1) = I(Pa, Pa, t) × ra.    (5)

In each iteration, all vibrations generated in the previous iteration are attenuated by the factor ra. We use the same parameter ra introduced in the vibration attenuation over distance formula for ease of parameter tuning. At time t + 1, the position of spider a may change to Pa(t + 1), but the source position of the vibration remains at Pa(t).

D. Search Pattern

Here we demonstrate the above ideas in terms of an algorithm. There are three phases in SSA: initialization, iteration, and final. These three phases are executed sequentially, and Fig. 1 is a complete flow chart of the algorithm. In each run of SSA, we start with the initialization phase, then perform searching in an iterative manner, and finally terminate the algorithm and output the solutions found.
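Before walking through the three phases, the vibration model of Sections III-B and III-C can be sketched in a few lines of Python. This is an illustrative reading of Eqns. (1)–(5), not the authors' code; the function names and the toy two-dimensional bounds are ours, and Cmin is assumed to be 0 for the toy fitness value used.

```python
import math

def manhattan(pa, pb):
    # Eqn. (3) with p = 1: the 1-norm (Manhattan) distance used in the paper.
    return sum(abs(a - b) for a, b in zip(pa, pb))

def source_intensity(fitness, c_min):
    # Eqn. (1), minimization form: Cmin is a confidently small constant,
    # so a smaller (better) fitness yields a larger positive intensity.
    return 1.0 / (fitness - c_min)

def attenuate_over_distance(intensity, pa, pb, d_max, r_a):
    # Eqn. (4): the intensity sensed at pb of a vibration generated at pa.
    return intensity * math.exp(-manhattan(pa, pb) / (d_max * r_a))

def attenuate_over_time(intensity, r_a):
    # Eqn. (5): a stored vibration decays by the factor r_a each iteration.
    return intensity * r_a

# Toy 2-D search space bounded by [-5, 5] in each dimension.
upper, lower = [5.0, 5.0], [-5.0, -5.0]
d_max = manhattan(upper, lower)                    # Eqn. (2) with the 1-norm
src = source_intensity(fitness=1.0, c_min=0.0)     # intensity at the source
sensed = attenuate_over_distance(src, [0.0, 0.0], [1.0, 1.0], d_max, r_a=0.9)
```

With ra = 0.9, the vibration above loses only about 10% of its intensity over a distance of one tenth of Dmax, illustrating that a larger ra imposes weaker attenuation.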
In the initialization phase, the algorithm defines the objective function and its solution space. The values for the parameters used in SSA are also assigned. The symbols used in SSA are listed in Table I. After setting the values, the algorithm proceeds to create an initial population of spiders for optimization. As the total number of spiders remains unchanged during the simulation of SSA, a fixed-size memory is allocated to store their information. The positions of the spiders are randomly generated in the search space, with their fitness values calculated and stored. The target vibration of each spider in the population is set at its current position, and the vibration intensity is zero. This finishes the initialization phase, and the algorithm starts the iteration phase, which performs the search with the artificial spiders created.

In the iteration phase, a number of iterations are performed by the algorithm. In each iteration, all spiders on the web move to a new position and evaluate their fitness values. The algorithm first calculates the fitness values of all the artificial spiders on their different positions on the web. Then these spiders generate vibrations at their positions using Equation (1). After all the vibrations are generated, the algorithm simulates the propagation process of these vibrations using Equation (4). In this process, each spider s will receive popSize − 1 different vibrations generated by the other spiders. The received information of these vibrations includes the source position of the vibration and its attenuated intensity. We use V to represent these popSize − 1 vibrations. Upon the receipt of V, s will select the strongest vibration vbest from V and compare its intensity with the intensity of the target vibration vtar stored in its memory. s will store vbest as vtar if the intensity of vbest is larger; otherwise the original vtar is retained.

The algorithm then manipulates s to perform a random walk towards vtar. This random walk is conducted using the following equation:

    Ps(t + 1) = Ps + (Ptar − Ps) ⊙ (1 − R ⊙ R),    (6)

where ⊙ denotes element-wise multiplication. Ptar is the vibration source position of the target vibration vtar. R is a vector of random numbers generated uniformly from zero to one, whose length is dim, and 1 is a vector of ones of length dim. The algorithm repeats this process for all the spiders in pop.

To avoid SSA getting stuck in a local optimum, we introduce an artificial spider jump-away process. Each spider in pop, right after the random walk step, has a small probability of deciding not to follow its present target and to jump away from its current position. The probability is defined using the following equation:

    pj = rj / exp(D(Ps, Ptar) / Dmax),    (7)

where rj is a user-defined jump-away rate parameter. If spider s is chosen to jump away, a new random position in the search space is generated and assigned as the new position of s. The last step of the algorithm is to attenuate the intensity of the stored target vibration using Equation (5), and this concludes the iteration phase.

The iteration phase loops until the stopping criteria are matched. The stopping criteria can be defined as the maximum iteration number reached, the maximum CPU time used, the error rate reached, the maximum number of iterations with no improvement on the best fitness value, or any other appropriate criteria. After the iteration phase, the algorithm outputs the best solution with the best fitness found. The above three phases constitute the complete algorithm of SSA, and its pseudo-code can be found in Algorithm 1.

Algorithm 1 SOCIAL SPIDER ALGORITHM
1: Assign values to the parameters of SSA.
2: Create the population of spiders pop and assign memory for them.
3: Initialize vtar for each spider.
4: while stopping criteria not met do
5:   for each spider s in pop do
6:     Evaluate the fitness value of s.
7:     Generate a vibration at the position of s.
8:   end for
9:   for each spider s in pop do
10:    Calculate the intensity of the vibrations V generated by the other spiders.
11:    Select the strongest vibration vbest from V.
12:    if the intensity of vbest is larger than that of vtar then
13:      Store vbest as vtar.
14:    end if
15:    Perform a random walk towards vtar.
16:    Generate a random number r from [0, 1).
17:    if r < pj then
18:      Assign a random position to s.
19:    end if
20:    Attenuate the intensity of vtar.
21:  end for
22: end while
23: Output the best solution found.

IV. BENCHMARK PROBLEMS AND EXPERIMENT SETTING

In order to benchmark the performance of SSA, we conduct simulations on 20 different benchmark functions. These benchmark functions are selected from the benchmark set proposed by Yao et al. [21] and the Competition on Real-Parameter Single Objective Optimization Problems at CEC 2013 [22]. The former benchmark set has been adopted for testing performance by a wide range of metaheuristics in recent years [23][24][25], and the latter is the latest benchmark set for the optimization competition organized at CEC 2013. The benchmark functions are listed in Table II and can be classified into three groups:
1) Group I: f1–f6 are unimodal minimization functions.
2) Group II: f7–f14 are multimodal minimization functions.
3) Group III: f15–f20 are shifted and rotated minimization functions.
Group I functions are used to test the fast-converging performance of SSA. Group II functions all have a large number of local minima, and can be used to test the ability of SSA to jump out of local optima and avoid premature convergence. Group III functions are more complex than the other functions and can push the searching capability of SSA to its limit. They can also test the performance of SSA in solving shifted and rotated optimization problems.
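As a companion to Algorithm 1, the body of the iteration phase can be sketched in Python as below. This is a simplified, hypothetical rendering (the data layout, the names, and the assumed constant Cmin are ours, and no stopping criterion or bookkeeping of the best solution is shown), not the authors' implementation.

```python
import math
import random

def ssa_iteration(spiders, objective, lower, upper, r_a=0.9, r_j=0.05):
    """One pass of the SSA iteration phase for a minimization problem.

    Each spider is a dict with keys 'pos', 'tar_pos' and 'tar_int',
    initialized as in Section III-D (target at own position, intensity 0).
    """
    d_max = sum(u - l for u, l in zip(upper, lower))   # Eqn. (2), 1-norm
    c_min = -1.0   # assumed confidently small constant for Eqn. (1)

    # Lines 5-8 of Algorithm 1: every spider generates a vibration, Eqn. (1).
    vibrations = [(s['pos'], 1.0 / (objective(s['pos']) - c_min))
                  for s in spiders]

    # Lines 9-21: sense the other vibrations, follow the strongest, walk.
    for i, s in enumerate(spiders):
        best_pos, best_int = None, -1.0
        for j, (pos, intensity) in enumerate(vibrations):
            if j == i:
                continue
            d = sum(abs(a - b) for a, b in zip(s['pos'], pos))
            sensed = intensity * math.exp(-d / (d_max * r_a))   # Eqn. (4)
            if sensed > best_int:
                best_pos, best_int = pos, sensed
        if best_int > s['tar_int']:          # otherwise keep the old target
            s['tar_pos'], s['tar_int'] = list(best_pos), best_int
        # Random walk towards the target vibration, Eqn. (6).
        s['pos'] = [p + (t - p) * (1.0 - random.random() ** 2)
                    for p, t in zip(s['pos'], s['tar_pos'])]
        # Jump-away scheme, Eqn. (7): escape with probability p_j.
        d_tar = sum(abs(a - b) for a, b in zip(s['pos'], s['tar_pos']))
        p_j = r_j / math.exp(d_tar / d_max)
        if random.random() < p_j:
            s['pos'] = [random.uniform(l, u) for l, u in zip(lower, upper)]
        s['tar_int'] *= r_a                  # attenuate over time, Eqn. (5)

# Toy run: four spiders minimizing a 2-D sphere function.
random.seed(1)
lo, hi = [-5.0, -5.0], [5.0, 5.0]
pop = []
for _ in range(4):
    p = [random.uniform(-5.0, 5.0) for _ in range(2)]
    pop.append({'pos': p, 'tar_pos': list(p), 'tar_int': 0.0})
for _ in range(5):
    ssa_iteration(pop, lambda x: sum(v * v for v in x), lo, hi)
```

Note that the random walk of Eqn. (6) moves each coordinate to a point between the current position and the target, so spiders that start on the web stay on the web; only the jump-away step resamples a fresh position, and it also stays inside the bounds.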
All benchmark functions are 30-dimensional minimization functions and the global minimum values are zero. SSA is implemented in Python 2.7 under the Microsoft Windows 7 operating system. All simulations are carried out on a computer with an Intel Core i7-3770 @ 3.4 GHz CPU. In each run, we use a maximum number of 200 000 function evaluations as the termination criterion of SSA. In order to reduce statistical errors and generate statistically significant results, each function is repeated for 100 independent runs. The mean, standard deviation, best, median, and worst results of SSA are recorded.

In order to meet the requirement set by [22], we use one fixed combination of parameters for SSA in the simulation of all groups of functions. The population size popSize is 50, the vibration attenuation rate ra is 0.9, and the jump-away rate rj is 0.05. We discuss the sensitivity of these parameters in Section V-B.

To evaluate the performance of our proposed algorithm, we compare the simulation results of SSA with seven widely-used evolutionary algorithms and swarm intelligence algorithms, including the Genetic Algorithm (GA) [26], Particle Swarm Optimizer (PSO) [13], Differential Evolution (DE) [27], Artificial Bee Colony Optimization (ABC) [17], Firefly Algorithm (FA) [19], Cuckoo Search (CS) [28], and Group Search Optimizer (GSO) [23]. All the compared algorithms are best-performing algorithms that give satisfactory performance, and they have been employed to compare the optimization performance of metaheuristics [23][24][29]. The benchmark functions are tested for 100 independent runs for each algorithm, and the termination criterion is also set to a maximum number of 200 000 function evaluations. We employ publicly available source code to implement and test the performance of these algorithms. We use the Genetic Algorithm Optimization Toolbox (GAOT) [30] to test the performance of GA. The GA executed is real-coded with a heuristic crossover algorithm and a uniform mutation strategy. The normalized geometric ranking scheme is employed as the selection scheme. The population size is 50, and all other parameters are set to the defaults recommended in [30]. For PSO, we employ the Particle Swarm Optimizer toolbox (PSOt) [31] for simulation. The population size is 25, the acceleration factors c1 and c2 are both 2.0, and the decaying inertia weight starts at 0.9 and ends at 0.4 after 1500 iterations. These parameters are recommended in [31]. We employ the DE source code provided by the author at [32]. The DE model is DE/rand/1/bin for best adaptability across all functions [33], NP is 50, F is 0.4717, and CR is 0.8803. The source code for ABC was obtained from the authors' webpage [34]. The population size is 50 and all other parameters are set as recommended in the source file. The source codes for FA and CS were obtained from the MATLAB Central File Exchange system [35][36], submitted by the authors of the algorithms. The population sizes are both 50 and all other parameters are set as recommended by the authors. We employ the GSO implementation provided by the author [37], and the parameters remain unchanged.

The performance comparisons among SSA and the above seven algorithms are made according to a rigorous non-parametric statistical framework. All initial solutions in the populations are generated randomly for each run of each algorithm on each function. For each function, we first test the hypothesis that all algorithms perform equally well by adopting the Kruskal-Wallis one-way analysis of variance test [38]. If this hypothesis is rejected at the 95% confidence interval, the Kruskal-Wallis test suggests that at least one algorithm involved in the comparison is different from the others. However, the Kruskal-Wallis one-way analysis of variance test alone cannot discover which algorithm(s) perform differently from the others. Thus we further pair-wise compare the simulation results using the Wilcoxon ranksum test with Sidak correction at the 95% confidence interval. Based on the pair-wise findings, we can construct partial orderings of the algorithms.

V. SIMULATION RESULTS

In this section we present the simulation results of SSA on the benchmark functions identified in Section IV. We perform comparisons among SSA and the other algorithms and give statistical analysis on the simulation results. We also conduct sensitivity analysis on the parameters introduced by SSA. Finally, the influence of the different search schemes incorporated in SSA on the performance is demonstrated.

A. Comparison of SSA with other Algorithms

The detailed simulation results of SSA on the benchmarks introduced in Section IV are presented in Table III, and the box plot of all raw data is shown in Fig. 2. It is worth mentioning that although the Group I functions, i.e. unimodal optimization problems, are relatively easy and can be solved efficiently by deterministic algorithms that utilize the gradient information of the search space, they can effectively assess the convergence performance of EAs [23].

The mean and standard deviation values obtained by SSA and the other algorithms on the different benchmark functions are listed in Table IV, together with the Kruskal-Wallis one-way analysis of variance test p-value. The mean values in bold font indicate superiority, while the p-values in italic suggest that the Kruskal-Wallis test rejects the hypothesis that all eight algorithms perform similarly on the corresponding benchmark function. The simulation results indicate that SSA generally gives very outstanding performance compared with the other algorithms, especially on the Group I functions, which require a fast convergence speed. The Kruskal-Wallis test results of all algorithms on all the benchmark functions show that there is at least one algorithm for each function that performs differently from the other algorithms compared. These test results allow us to proceed to the pair-wise Wilcoxon ranksum test with Sidak correction in order to figure out partial orderings of the algorithms for these benchmark functions.

The outcomes of the pair-wise statistical comparisons for all benchmark functions are shown in Table V with the position
of SSA in bold. In this test, the pair-wise ranking values as well as the p-values of all 28 possible algorithm-pairs on each function are considered. If one of the p-values is larger than the Sidak-corrected significance value of 1 - (95%)^{1/8} ≈ 0.0064, we consider that the two algorithms involved in the calculation of this p-value perform similarly on this function. Otherwise, the ranking values of the two algorithms are compared. The algorithm with a lower ranking is regarded as the better algorithm in solving the corresponding function. If algorithms A, B, and C perform better than algorithm D, we conclude that all of A, B, and C outperform D on the particular benchmark function. However, if A outperforms B but both A and B perform similarly with C, all of A, B, and C are placed in the same rank. The order of algorithms within one rank is alphabetical and has no implication on performance. In Table V, A < B stands for "A performs better than B", and C ≈ D stands for "C and D perform similarly" on the specified benchmark function.
The main reason to adopt this statistical assessment method instead of the canonical mean-comparison method is to reduce the impact of outliers on the performance assessment. This impact can be observed from the simulation results of SSA and PSO on f1, where the mean of the simulation results of SSA was significantly better than that of PSO. However, after a careful observation of the raw simulation data, we found that while all of SSA's simulation results fell in the range [5.1e-35, 3.7e-32], a majority (94%) of the results of PSO were within a better range [7.96e-132, 7.61e-115]. An outlier at 8.22e-22, however, greatly influenced the overall average of the PSO results, which is unfair because we usually conduct a large number of runs in real-world applications and adopt the best result. This negative impact is greatly reduced in our adopted statistical assessment method, where the ranking information is used in the calculation instead of the raw values of the results [39]. Another reason for using the Wilcoxon rank-sum test instead of the more commonly employed Student's t-test is that the former is a non-parametric statistical hypothesis test, which makes no assumptions on the structure or the parameters of the input raw data. Since, to the best of our knowledge, there is no theoretical model designed for the stochastic simulation results generated by evolutionary algorithms or swarm intelligence algorithms, the Wilcoxon rank-sum test is more suitable than the Student's t-test, which assumes that the test population follows a normal distribution [40]. Similar statistical assessment methods have been employed to evaluate a number of different optimization algorithms [41][42].
From the simulation result comparison in Table IV and the statistical assessment results in Table V, the following key points can be observed:
1) SSA gives superior performance in the simulation of all functions in Group I, where SSA is the best performing algorithm on f2-f6 in terms of both the mean value of the results and the Wilcoxon rank-sum test result. This indicates the outstanding capability of SSA to converge fast to the global optimum while avoiding premature convergence.
2) On f1, the average result of SSA is better than that of PSO, but the Wilcoxon rank-sum test gives the opposite result. After investigating the raw simulation data we found that while PSO gives the better statistical result, SSA demonstrates higher robustness. This can also be observed from the standard deviations of these two algorithms. We consider that PSO has a faster convergence speed on this function, but that this hurts its ability to avoid premature convergence. This phenomenon is reproduced on f8, f16, and f17, where SSA outperforms the other algorithms in robustness but has slightly worse statistical assessment results.
3) For Group II functions, SSA shows an outstanding performance compared with the other algorithms, topping the mean value comparison on 4 functions and the statistical assessment on 3 functions. It also generates satisfactory results on the other functions.
4) On Group III functions, SSA maintains its superior position on 5 of the 6 functions in the statistical assessment, ranking first or second in the comparison. The advantage is enhanced in the mean value comparison, where it takes three first places. This again demonstrates the dominant position of SSA in terms of both solution quality and robustness.
In order to have a clearer view of the statistical assessment of the simulation results, we further rank the algorithms based on their performance in the Wilcoxon rank-sum test. For each function, the first algorithm is assigned a rank value of 1, the second 2, etc. For ties, an average rank is given to the algorithms involved in the tie. Take f19 as an example: DE, FA, GSO, and PSO share the same rank, and they are the third to sixth algorithms on this function, so we assign an average rank value of 4.5 to all four algorithms. We then sum up the rank values of each algorithm in each group to obtain an overall assessment of the performance of the algorithms over each category of functions. We finally sum up all rank values to evaluate the performance of the algorithms in solving general optimization problems. Similar evaluation methods have been adopted in previous metaheuristic algorithm tests [24]. The test results are presented in Table VI.
The rank summary of the statistical assessment supports our previous observations. In all three groups of functions, SSA is the best performing algorithm, and it possesses a dominating position in the overall comparison as well. From Table VI we also notice that no other algorithm has as stable a performance as SSA. The performance of DE and ABC is satisfactory in Groups I and II, but neither of them catches up with SSA on Group III functions. FA is comparable to SSA when solving difficult shifted and rotated multimodal functions, but its performance on unimodal and original multimodal functions ranks at the bottom of the list. We therefore conclude that in general SSA has both the best optimization performance and the highest stability.

B. Parameter Sensitivity Analysis

Choosing proper parameter settings of SSA for numerical and real-world optimization problems can be time-consuming.
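The arithmetic behind this pair-wise assessment is compact enough to sketch. The snippet below is an illustrative sketch, not the code used to produce the tables: it computes the Sidak-corrected per-pair significance level for the 8 compared algorithms, and the tie-averaged rank assigned to the four algorithms tied for places 3 to 6 on f19.

```python
# Illustrative sketch of the assessment bookkeeping described above:
# 8 algorithms give C(8,2) = 28 pairs, and the Sidak correction keeps
# the family-wise confidence at 95% across the per-pair tests.
from itertools import combinations

algorithms = ["SSA", "GA", "PSO", "DE", "ABC", "FA", "CS", "GSO"]
pairs = list(combinations(algorithms, 2))   # all 28 algorithm pairs
alpha_sidak = 1 - 0.95 ** (1 / 8)           # per-pair significance level

def average_rank(places):
    """Tied algorithms all receive the mean of the places they occupy."""
    return sum(places) / len(places)

print(len(pairs))                  # 28
print(round(alpha_sidak, 4))       # 0.0064
print(average_rank([3, 4, 5, 6]))  # 4.5
```

A p-value above `alpha_sidak` is then read as "the pair performs similarly"; below it, the pair's ranking values decide the better algorithm.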
The trial-and-error scheme, or a parameter sweep test, may reveal the best achievable performance over the parameter space at the expense of high computational cost. In real-world optimization problems, it may take far more time to evaluate the fitness functions than to evaluate our benchmark problems. In such cases, one execution run may take days or even months to finish, which makes the trial-and-error scheme impractical for parameter tuning. As alternatives, researchers have proposed some schemes to replace the trial-and-error parameter selection scheme. These schemes can generally be classified into three groups [43]:
1) Fixed parameter schemes select a parameter combination before the simulation using empirical or theoretical knowledge of the characteristics of the parameters. This combination remains constant throughout the whole search [24][44].
2) Deterministic parameter schemes use some pre-defined rules to change the parameter values throughout the search [24][25].
3) Adaptive parameter schemes change the parameter values by adaptively learning the impact of changing parameters on the searching performance throughout the search [45]. Some schemes encode the parameters into the solution and evolve the parameters together with the population [46].
In this paper we use the fixed parameter scheme to test the performance of SSA compared with other algorithms. We also use this scheme to perform a parameter sensitivity analysis in order to deduce some rules of thumb on choosing parameters that can consistently lead to satisfactory results on a wide range of functions with different characteristics. This test can also discover some of the features of the parameters when solving different kinds of optimization problems. We carry out extensive simulations on our benchmark functions, which cover a wide range of optimization problems. Thus, the derived rules of thumb can be expected to give generally good performance on unknown problems.
There are three parameters for SSA, namely popSize, ra, and rj. We perform a parameter sweep test on all 20 benchmark functions presented in Section IV using 144 different parameter combinations: [popSize, ra, rj] ∈ [20, 30, 40, 50, 60, 70, 80, 90, 100, 150, 200, 300] × [0.1, 0.5, 0.9] × [0.01, 0.05, 0.1, 0.3]. For each parameter combination, 100 independent runs are conducted for each function. In order to reduce the impact of the stochastic process of SSA on the performance of different combinations, we use the same 100 random seeds to generate random numbers for all the tests. This issue was also addressed in [43].
In order to reach a statistically significant conclusion, we again use the statistical assessment method introduced in Section V-A. All the raw data go through the Kruskal-Wallis one-way analysis of variance test to see whether there are combinations that perform significantly differently from the others at a 95% confidence level. If so, the Wilcoxon rank-sum test is employed to discover the significantly different parameter settings at a 95% confidence level. With the processed information we can construct partial orderings of the parameter settings. We regard a parameter setting as a significantly superior one if it is in the first rank for all the functions in one group.
The parameter sensitivity analysis results are presented in Table VII. In this table, those parameter settings that lead to significantly better performance than others with respect to the three groups of benchmark functions are marked with "+", and the performances corresponding to the three groups are separated by "/". "-" stands for significantly worse performance of the parameter setting on the corresponding group. It can be observed that medium popSize values (e.g., 40, 50, or 60), large ra values (e.g., 0.9), and small rj values (e.g., 0.05 or 0.1) generally lead to significantly better optimization performance than other parameter settings. It can also be observed that the most sensitive parameter is rj: a small change of this parameter may result in a big performance change. popSize and ra have relatively less impact but still control the performance of the algorithm. This conclusion gives rules of thumb for selecting parameters for different optimization problems. Note that we are not claiming that the provided parameter settings generate the best performance on any problem. The provided settings may act as a guideline when one starts selecting parameters for SSA. However, a proper parameter tuning process is still necessary in order to get the best achievable performance.

C. Investigation of the Effect of Other Schemes

Apart from proper parameter selection, the searching schemes incorporated into SSA also contribute to its superior performance. In order to investigate the effect of these schemes, we also perform simulations with SSA-variant algorithms that omit these schemes and compare the results with the SSA results. The two most important schemes that distinguish SSA from other evolutionary algorithms and swarm intelligence algorithms (the detailed differences will be discussed in Section VI) are the vibration propagation scheme and the spider jump away scheme. Thus we have three versions of SSA for comparison:
1) SSA is the original version of the proposed algorithm with both the vibration propagation scheme and the spider jump away scheme.
2) SSA-vbr is an SSA-variant algorithm which removes the vibration propagation scheme. In this algorithm the vibration (individual information) does not attenuate, so the social knowledge is complete.
3) SSA-jmp is an SSA-variant algorithm which removes the spider jump away scheme. In this algorithm the spiders are always cooperative no matter how closely they are located.
We perform simulations of these three algorithms on all the benchmark functions. The simulation results are processed using a similar statistical assessment method as in Section V-A. In this test, as we are more interested in the performance improvement of SSA compared with SSA-vbr and SSA-jmp, we only perform the Wilcoxon rank-sum test on the SSA/SSA-vbr and SSA/SSA-jmp algorithm pairs. This simple
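The sweep grid described above is simply the Cartesian product of the candidate values, 12 × 3 × 4 = 144 combinations. A minimal sketch follows, where `run_ssa` is a hypothetical hook standing in for an actual SSA implementation and the seed list stands in for the 100 shared seeds:

```python
# Sketch of the parameter sweep: every setting is evaluated with the
# same seed list so that differences between settings are not caused by
# different random streams. `run_ssa` is a hypothetical hook.
from itertools import product

pop_sizes = [20, 30, 40, 50, 60, 70, 80, 90, 100, 150, 200, 300]
ra_values = [0.1, 0.5, 0.9]
rj_values = [0.01, 0.05, 0.1, 0.3]
seeds = list(range(100))  # placeholder values for the 100 shared seeds

combinations = list(product(pop_sizes, ra_values, rj_values))
print(len(combinations))  # 144

def sweep(run_ssa):
    """Map each (popSize, ra, rj) setting to its 100 per-seed results."""
    return {
        setting: [run_ssa(*setting, seed) for seed in seeds]
        for setting in combinations
    }
```

The per-setting result lists produced by `sweep` are what would then be fed into the Kruskal-Wallis and Wilcoxon rank-sum tests.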
statistical test alone allows us to conclude whether there is any improvement or not.
The simulation results of SSA, SSA-vbr, and SSA-jmp are presented in Table VIII. We can see that SSA outperforms SSA-vbr and SSA-jmp on all 20 benchmark functions in terms of the mean values obtained (SSA and SSA-vbr tie on f4, where all 200 runs generate the global optimum at zero). The Wilcoxon rank-sum test result also supports this conclusion. All p-values except for the SSA/SSA-vbr pair on f4 are smaller than the significance level of 0.05, which indicates that SSA statistically significantly outperforms SSA-vbr and SSA-jmp. At the same time, we can observe that the performance of SSA-vbr is better than that of SSA-jmp. This phenomenon is strong evidence for the conclusion we made in Section V-B that the parameter rj is relatively more sensitive than ra. This result reveals the necessity of introducing these two important schemes into SSA.

VI. DISCUSSION

A number of swarm intelligence algorithms have been proposed in the past few decades. Among them, PSO and ACO are the two most widely employed and studied. Although SSA also belongs to the scope of swarm intelligence algorithms, it has many differences from the previous ones.
PSO, like SSA, was originally proposed for solving continuous optimization problems, and it was also inspired by animal behaviour. However, the most important difference between SSA and PSO is due to their different biological backgrounds. PSO was designed based on the model of coordinated group animal motions of flocks of birds or schools of fish; this model serves as the design metaphor of PSO. SSA is inspired by the social spider foraging strategy, which belongs to the scope of general social animal searching behaviour, and we use a general IS model (see Section I) as the design framework. This difference is also a major distinguishing feature of SSA from other proposed algorithms. A second difference between SSA and PSO is the information propagation method. In PSO, the information propagation method is neglected, and each particle is assumed to be aware of all the information of the system without loss. In SSA we carefully model the information propagation process through the vibrations on the spider web. This process forms a general knowledge system with information loss. Although there is still no research on how information loss impacts the social foraging strategy employed in optimization, it is possible that this lossy information system partially contributes to the performance improvement of SSA over PSO. Another difference is that in PSO, the common knowledge of the group concerns only the best particle in the system. All remaining particles do not contribute to the shared information of the group, which may lead to neglecting some valuable information of the population. In SSA, each spider generates a new piece of information and propagates it to the whole population. Last but not least, the searching behaviours of SSA and PSO are quite different. Although they both employ random walks, the target of the movement is generated using very different methods. Besides the random walk scheme, SSA also has an additional jump away scheme, which contributes to the capability of SSA to jump out of local optima in the search space.
Although both SSA and ACO draw their inspiration from social animal foraging strategies, there are still some obvious differences. The foraging frameworks adopted by the two algorithms are quite different: ACO utilizes the ant foraging features to perform optimization. Ants find food by laying down pheromone trails and collectively establish positive feedback which biases the later path selection, while spiders sense the vibrations propagated over the spider web to locate the prey. Another difference is the representation of feasible solutions. In SSA we use the positions on the spider web to represent feasible solutions; similar representations have been widely adopted in swarm intelligence algorithms. Meanwhile, ACO uses the path between the ant hive and the food sources to represent solutions to the optimization problem. Additionally, ACO was originally designed to solve combinatorial problems. Although in recent years ACO-variant algorithms have been designed mainly to solve continuous problems [47], their performance is not as good as that of the original ACO in solving combinatorial problems like the Travelling Salesman Problem. There are also the information propagation and searching pattern differences between SSA and ACO described above.
There are also some other swarm intelligence algorithms proposed to solve continuous problems, and SSA has some unique characteristics among them. In most swarm intelligence algorithms, e.g., ABC and GSO, the populations are structured into different types. Different types of individuals perform different jobs and the whole population cooperates to search the solution space. In SSA, however, all individuals (spiders) are equal: each performs all the tasks that would be executed by multiple types of individuals in other algorithms. If we put SSA into the conventional framework, it has the feature that the different types of individuals can shift very smoothly and without the guidance of the user, which may potentially contribute to the performance improvement.
Although EAs like GA and ES are also population-based algorithms, and thus inevitably share some similarities with the population-based SSA, they are quite different general-purpose metaheuristics. They are inspired by completely different biological disciplines. EAs usually employ different recombination and decomposition operators to manipulate the solutions, which imitate the regeneration of organisms.
The above comparison between SSA and some other swarm intelligence algorithms and EAs may potentially reveal the reason for the superior performance of SSA. As stated above, although we still do not know the exact impact of information loss on the optimization process, this feature of SSA may contribute to the optimum search in some complex multimodal optimization problems. The uniform structure of the population is another potential advantage of SSA. Also, the unique searching pattern and its underlying social animal foraging strategy as well as the IS foraging model contribute to the overall performance improvement of SSA.
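The pairwise test used throughout these comparisons can be sketched with the large-sample normal approximation of the rank-sum statistic. This is a minimal illustration without tie correction, not the exact routine used to produce the tables:

```python
# Wilcoxon rank-sum z-score for two samples of run results
# (no tie correction; large-sample normal approximation).
import math

def rank_sum_z(a, b):
    """z-score of the rank sum of sample `a` within the pooled sample."""
    combined = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    w = sum(rank for rank, (_, src) in enumerate(combined, start=1) if src == 0)
    n1, n2 = len(a), len(b)
    mean = n1 * (n1 + n2 + 1) / 2
    var = n1 * n2 * (n1 + n2 + 1) / 12
    return (w - mean) / math.sqrt(var)

def two_sided_p(z):
    """Two-sided p-value from the standard normal tail."""
    return math.erfc(abs(z) / math.sqrt(2))

z = rank_sum_z([1, 2, 3], [4, 5, 6])
print(round(z, 3))  # -1.964
```

A p-value below the chosen significance level for a pair such as SSA/SSA-vbr is then read, as above, as a statistically significant difference between the two samples.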
VII. CONCLUSION

In this paper we proposed a novel social spider algorithm to solve global optimization problems. This algorithm is based on the foraging behaviour of social spiders and the information-sharing foraging strategy. SSA is conceptually simple and relatively easy to implement. SSA can tackle a wide range of different continuous optimization problems and has the potential to be employed to solve real-world problems.
In order to evaluate the performance of SSA, we adopted a set of 20 benchmark functions which cover a large variety of different optimization problem types. We compared SSA with some widely used swarm intelligence algorithms and EAs, namely GA, PSO, DE, ABC, FA, CS, and GSO. These algorithms have been employed to solve a large set of different benchmark optimization functions and real-world problems, and have demonstrated outstanding performance. To reach statistically significant conclusions, we applied the Kruskal-Wallis one-way analysis of variance test and the Wilcoxon rank-sum test to process the simulation data. The results show that the performance of SSA is outstanding compared with the above listed algorithms in all three groups of functions, including unimodal, multimodal, and shifted-rotated multimodal optimization problems. This conclusion was supported by both the simulation results and the statistics of the simulation data.
We also conducted a parameter sensitivity analysis to reveal the impact of different parameters on the performance of SSA. In this test we selected 144 parameter combinations for SSA and performed simulations with these combinations individually. The results indicate that a medium population size, a large vibration attenuation rate, and a small jump away rate can potentially generate satisfactory performance on a majority of the optimization problems. This conclusion provides guidelines on the algorithm design in other applications. We also discovered the different sensitivity levels of the three parameters on the final performance. In addition, we investigated the performance influence of the different schemes incorporated in SSA. The test results verify the contribution of these schemes to the superior performance of SSA.
Future research on SSA can be divided into three categories: scheme research, algorithm research, and real-world application. The random walk scheme and the jump away scheme in the current SSA may be further improved using advanced optimization techniques and hybrid algorithms with deterministic heuristics or local search algorithms. The jump away scheme incorporated in SSA now is a reallocation scheme; however, research on how to manipulate the new positions of the jump away spiders is also an interesting topic. New schemes can also be applied in the searching process of SSA for performance improvement. In terms of algorithm research, SSA has the potential to be applied to solve combinatorial problems. We note that some other swarm intelligence algorithms like PSO and ABC, originally designed to solve continuous optimization problems, have been successfully modified to solve combinatorial problems [48][49]. Although SSA only has three parameters, it is still very interesting to develop adaptive or self-adaptive schemes for SSA to control the parameters and reduce the effort in tuning parameters. Last but not least, it would be interesting to identify real-world applications which can be addressed using SSA effectively and efficiently.

REFERENCES

[1] E.-G. Talbi, Metaheuristics: From Design to Implementation. Wiley, 2009.
[2] R. S. Parpinelli and H. S. Lopes, "New inspirations in swarm intelligence: a survey," Int. J. Bio-Inspired Computation, vol. 3, no. 1, pp. 1-16, Jan. 2011.
[3] M. Yim, Y. Zhang, and D. Duff, "Modular robots," IEEE Spectrum, vol. 39, no. 2, pp. 30-34, Aug. 2002.
[4] R. Foelix, Biology of Spiders. New York: Oxford University Press, 1996.
[5] C. F. Schaber, S. N. Gorb, and F. G. Barth, "Force transformation in spider strain sensors: white light interferometry," J. Royal Society Interface, vol. 9, no. 71, pp. 1254-1264, Jun. 2012.
[6] R. Cocroft, "The public world of insect vibrational communication," Molecular Ecology, vol. 10, pp. 2041-2043, May 2011.
[7] F. Fernández Campón, "Group foraging in the colonial spider Parawixia bistriata (Araneidae): effect of resource levels and prey size," Animal Behaviour, vol. 74, no. 5, pp. 1551-1562, Nov. 2007.
[8] J. House, K. Landis, and D. Umberson, "Social relationships and health," Science, vol. 241, no. 4865, pp. 540-545, 1988.
[9] C. W. Clark and M. Mangel, "Foraging and flocking strategies: Information in an uncertain environment," The American Naturalist, vol. 123, no. 5, pp. 626-641, 1984.
[10] C. Barnard and R. Sibly, "Producers and scroungers: A general model and its application to captive flocks of house sparrows," Animal Behaviour, vol. 29, no. 2, pp. 543-550, May 1981.
[11] G. Uetz, "Foraging strategies of spiders," Trends in Ecology and Evolution, vol. 7, no. 5, pp. 155-159, 1992.
[12] M. Dorigo, "Optimization, learning and natural algorithms," Ph.D. dissertation, Politecnico di Milano, Italy, 1990.
[13] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proc. IEEE Int. Conf. Neural Networks, Perth, WA, Australia, Nov. 1995, pp. 1942-1948.
[14] T. Liao, D. Molina, T. Stützle, M. Oca, and M. Dorigo, "An ACO algorithm benchmarked on the BBOB noiseless function testbed," in Proc. 14th Int. Conf. GECCO, Philadelphia, PA, U.S., Jul. 2012, pp. 221-228.
[15] C. Voglis, G. S. Piperagkas, K. E. Parsopoulos, D. G. Papageorgiou, and I. E. Lagaris, "MEMPSODE: comparing particle swarm optimization and differential evolution within a hybrid memetic global optimization framework," in Proc. 14th Int. Conf. GECCO, Philadelphia, PA, U.S., Jul. 2012, pp. 253-260.
[16] H. A. Abbass, "MBO: marriage in honey bees optimization - a haplometrosis polygynous swarming approach," in Proc. IEEE Congress on Evolutionary Computation (CEC), Seoul, Korea, May 2001, pp. 207-214.
[17] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony," J. Global Optim., vol. 39, no. 3, pp. 459-471, Nov. 2007.
[18] K. Krishnanand and D. Ghose, "Detection of multiple source locations using a glowworm metaphor with applications to collective robotics," in Proc. IEEE Swarm Intell. Symposium, Pasadena, CA, U.S., Jun. 2005, pp. 84-91.
[19] X.-S. Yang, Nature-Inspired Metaheuristic Algorithms. Luniver Press, 2008, ch. 10: Firefly Algorithm, pp. 81-96.
[20] K. M. Passino, "Biomimicry of bacterial foraging for distributed optimization and control," IEEE Control Syst. Mag., vol. 22, no. 3, pp. 52-67, Jun. 2002.
[21] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Trans. Evol. Comput., vol. 3, no. 2, pp. 82-102, Aug. 1999.
[22] J. J. Liang, B.-Y. Qu, P. N. Suganthan, and A. G. Hernández-Díaz, "Problem definitions and evaluation criteria for the CEC 2013 special session and competition on real-parameter optimization," Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Nanyang Technological University, Singapore, Technical Report 201212, 2013.
[23] S. He, Q. H. Wu, and J. R. Saunders, "Group search optimizer: An optimization algorithm inspired by animal searching behavior," IEEE Trans. Evol. Comput., vol. 13, no. 5, pp. 973-990, Aug. 2009.
[24] A. Y. S. Lam, V. O. K. Li, and J. J. Q. Yu, "Real-coded chemical reaction optimization," IEEE Trans. Evol. Comput., vol. 16, no. 3, pp. 339-353, 2012.
[25] W.-N. Chen, J. Zhang, Y. Lin, N. Chen, Z.-H. Zhan, H. S.-H. Chung, Y. Li, and Y.-H. Shi, "Particle swarm optimization with an aging leader and challengers," IEEE Trans. Evol. Comput., vol. 17, no. 2, pp. 241-258, Apr. 2013.
[26] J. H. Holland, Adaptation in Natural and Artificial Systems. University of Michigan Press, 1975.
[27] R. Storn and K. Price, "Differential evolution: A simple and efficient heuristic for global optimization over continuous spaces," J. Global Optim., vol. 11, no. 4, pp. 341-359, Dec. 1997.
[28] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2009.
[29] W. Gong, Z. Cai, C. X. Ling, and H. Li, "A real-coded biogeography-based optimization with mutation," Appl. Math. Comput., vol. 216, no. 9, pp. 2749-2758, Jul. 2010.
[30] C. R. Houck, J. A. Joines, and M. G. Kay, "A genetic algorithm for function optimization: A MATLAB implementation," North Carolina State Univ., Raleigh, NC, Technical Report NCSU-IE-TR-95-09, 1995.
[31] B. Birge, "PSOt - a particle swarm optimization toolbox for use with MATLAB," in Proc. IEEE Swarm Intell. Symposium, Indianapolis, IN, U.S., Apr. 2003, pp. 182-186.
[32] R. Storn, "Differential evolution homepage," accessed 1-July-2013. [Online]. Available: http://www1.icsi.berkeley.edu/storn/code.html
[33] E. Mezura-Montes, J. Velázquez-Reyes, and C. A. C. Coello, "A comparative study of differential evolution variants for global optimization," in Proc. 8th Int. Conf. GECCO, Seattle, WA, U.S., Jul. 2006, pp. 485-492.
[34] D. Karaboga, "Artificial bee colony homepage," accessed 1-July-2013. [Online]. Available: http://mf.erciyes.edu.tr/abc/
[35] X.-S. Yang, "Firefly algorithm," accessed 1-July-2013. [Online]. Available: http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm
[36] X.-S. Yang, "Cuckoo algorithm," accessed 1-July-2013. [Online]. Available: http://www.mathworks.com/matlabcentral/fileexchange/29809-cuckoo-search-cs-algorithm
[37] S. He, "Group search optimizer," accessed 1-July-2013. [Online]. Available: http://www.cs.bham.ac.uk/szh/software.xhtml
[38] M. Hollander and D. A. Wolfe, Nonparametric Statistical Methods. Wiley, 1999.
[39] Y. Maesono, "Competitors of the Wilcoxon signed rank test," Annals of the Institute of Statistical Mathematics, vol. 39, no. 1, pp. 363-375, 1987.
[40] J. F. Reed, III, "Contributions to two-sample statistics," Journal of Applied Statistics, vol. 32, no. 1, pp. 37-44, 2005.
[41] D. Shilane, J. Martikainen, S. Dudoit, and S. J. Ovaska, "A general framework for statistical performance comparison of evolutionary computation algorithms," Information Sciences, vol. 179, pp. 2870-2879, 2008.
[42] R. Wang, R. C. Purshouse, and P. J. Fleming, "Preference-inspired co-evolutionary algorithms for many-objective optimisation," IEEE Trans. Evol. Comput., vol. 17, no. 4, pp. 474-494, 2013.
[43] A. K. Qin and X. Li, "Differential evolution on the CEC-2013 single-objective continuous optimization testbed," in Proc. IEEE Congress on Evolutionary Computation (CEC), Cancun, Mexico, Jun. 2013, pp. 1099-1106.
[44] K. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization. Springer, 2005.
[45] A. Qin, V. Huang, and P. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Trans. Evol. Comput., vol. 13, no. 2, pp. 398-417, 2009.
[46] J. Vrugt, B. Robinson, and J. Hyman, "Self-adaptive multimethod search for global optimization in real-parameter spaces," IEEE Trans. Evol. Comput., vol. 13, no. 2, pp. 243-259, 2009.
[47] K. Socha and M. Dorigo, "Ant colony optimization for continuous domains," European Journal of Operational Research, vol. 185, no. 3, pp. 1155-1173, 2008.
[48] J. Kennedy and R. C. Eberhart, "A discrete binary version of the particle swarm algorithm," in Proc. IEEE Int. Conf. Syst. Man Cyber., Orlando, FL, U.S., Oct. 1997, pp. 4104-4108.
[49] Q.-K. Pan, M. F. Tasgetiren, P. Suganthan, and T. J. Chua, "A discrete artificial bee colony algorithm for the lot-streaming flow shop scheduling problem," Information Sciences, vol. 181, no. 12, pp. 2455-2468, 2011.

James J.Q. Yu Biography here.

Victor O.K. Li Biography here.
[Figure: flow chart. START: initialize parameters and the population; calculate the fitness values of the population; generate vibrations at the spiders' positions; propagate the vibrations over the web; each spider selects the best received vibration. If this vibration is stronger than the previous target, it is used as the new target; otherwise the previous target vibration is attenuated. The spider then performs a random walk towards the target vibration. If the spider shall escape from others, it is randomly assigned a new position and the previous best vibration is attenuated. If the stopping criteria are met, the best solution is output and the algorithm STOPs; otherwise the loop repeats from the fitness calculation step.]

Fig. 1. Flow chart of Social Spider Algorithm.
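The control flow of Fig. 1 can be condensed into a short sketch. The vibration-intensity, attenuation, and random-walk formulas below are simple placeholders rather than the paper's actual update equations; only the loop structure and the three parameters popSize, ra, and rj follow the figure and Table I.

```python
# Simplified sketch of the Fig. 1 loop. The intensity/attenuation and
# walk rules are placeholders, not the paper's update equations.
import random

def ssa_sketch(f, dim, bounds, pop_size=25, ra=0.9, rj=0.1, iters=200, seed=0):
    """Minimise f over bounds^dim following the control flow of Fig. 1."""
    rng = random.Random(seed)
    lo, hi = bounds

    def new_pos():
        return [rng.uniform(lo, hi) for _ in range(dim)]

    spiders = [new_pos() for _ in range(pop_size)]
    # each spider's stored target: (intensity proxy, position); lower = stronger
    targets = [(float("inf"), s[:]) for s in spiders]
    best = min(spiders, key=f)

    for _ in range(iters):
        fitness = [f(s) for s in spiders]
        source = min(range(pop_size), key=fitness.__getitem__)
        src_pos = spiders[source][:]
        for i in range(pop_size):
            # placeholder received vibration: the iteration-best position,
            # weakened with distance at a rate controlled by ra
            dist = sum(abs(a - b) for a, b in zip(spiders[i], src_pos))
            intensity = fitness[source] + (1 - ra) * dist
            if intensity < targets[i][0]:       # stronger than previous target?
                targets[i] = (intensity, src_pos[:])
            if rng.random() < rj:               # jump away from the others
                spiders[i] = new_pos()
                continue
            spiders[i] = [x + rng.random() * (t - x)  # random walk to target
                          for x, t in zip(spiders[i], targets[i][1])]
        best = min([best] + spiders, key=f)     # elitist bookkeeping
    return best
```

On a convex function such as the sphere model, this sketch steadily improves the elitist best, since every walk samples the segment between a spider and a good target position.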


[Figure: twenty box plots, panels (a) f1 through (t) f20, showing the distributions of the simulation results of SSA, GA, PSO, DE, ABC, CS, FA, and GSO on the benchmark functions; the vertical axes use logarithmic scales.]

Fig. 2. Box Plot of Simulation Results.


TABLE I
SYMBOLS USED IN SSA

Type      | Symbol  | Algorithmic Meaning                                         | Biology Meaning
----------|---------|-------------------------------------------------------------|------------------------------------------------
Function  | f       | The objective function to be optimized.                     | The spider web.
Variable  | dim     | The dimension of the objective function, i.e., the length of a feasible solution to the objective function. | The dimension of the artificial spider web.
Variable  | pop     | The population of spiders for optimization.                 |
Parameter | popSize | The size of the population, i.e., the number of spiders in the population. | The number of spiders on the web.
Parameter | ra      | Vibration attenuation rate.                                 | Vibration attenuation rate on the spider web.
Parameter | rj      | Spider jump away rate.                                      | Probability of spiders to jump away from others.

TABLE II
BENCHMARK FUNCTIONS
(For every function, n = 30 and f_min = 0. Formulas are given in LaTeX notation. Row format: function | search space | name.)

Group I
f1(x) = \sum_{i=1}^{n} x_i^2 | [-100, 100]^n | Sphere Model
f2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| | [-10, 10]^n | Schwefel's Problem 2.22
f3(x) = x_1^2 + 10^6 \sum_{i=2}^{n} x_i^2 | [-10, 10]^n | Cigar Function
f4(x) = \sum_{i=1}^{n} \lfloor x_i + 0.5 \rfloor^2 | [-100, 100]^n | Step Function
f5(x) = \sum_{i=1}^{n} i x_i^4 + rand() | [-1.28, 1.28]^n | Quadratic Function with Noise+
f6(x) = \sum_{i=1}^{n-1} [x_i^2 + 2 x_{i+1}^2 - 0.3 \cos(3 \pi x_i) - 0.4 \cos(4 \pi x_{i+1}) + 0.7] | [-100, 100]^n | Bohachevsky Function

Group II
f7(x) = -20 \exp(-0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_i^2}) - \exp(\frac{1}{n} \sum_{i=1}^{n} \cos(2 \pi x_i)) + 20 + e | [-32, 32]^n | Ackley Function
f8(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos(\frac{x_i}{\sqrt{i}}) + 1 | [-600, 600]^n | Griewank Function
f9(x) = \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 (1 + 10 \sin^2(\pi y_{i+1})) + (y_n - 1)^2 (1 + \sin^2(2 \pi y_n)), with y_i = 1 + \frac{1}{4}(x_i + 1) | [-10, 10]^n | Levy Function
f10(x) = \frac{1}{10} [\sin^2(3 \pi x_1) + \sum_{i=1}^{n-1} (x_i - 1)^2 (1 + \sin^2(3 \pi x_{i+1})) + (x_n - 1)^2 (1 + \sin^2(2 \pi x_n))] + \sum_{i=1}^{n} u(x_i, 5, 100, 4) | [-50, 50]^n | Penalized Function No. 1
    where u(x_i, a, k, m) = k (x_i - a)^m for x_i > a; 0 for -a \le x_i \le a; k (-x_i - a)^m for x_i < -a
f11(x) = \frac{\pi}{n} [10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 (1 + 10 \sin^2(\pi y_{i+1})) + (y_n - 1)^2] + \sum_{i=1}^{n} u(x_i, 10, 100, 4), with y_i = 1 + \frac{1}{4}(x_i + 1) | [-50, 50]^n | Penalized Function No. 2
f12(x) = \sum_{i=1}^{n} (x_i^2 - 10 \cos(2 \pi x_i) + 10) | [-5.12, 5.12]^n | Rastrigin Function
f13(x) = \sum_{i=1}^{n-1} (100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2) | [-30, 30]^n | Rosenbrock Function
f14(x) = [\frac{1}{n-1} \sum_{i=1}^{n-1} (\sqrt{y_i} + \sqrt{y_i} \sin^2(50 y_i^{0.2}))]^2, with y_i = \sqrt{x_i^2 + x_{i+1}^2} | [-100, 100]^n | Schaffer Function

Group III (z = M(x - o) throughout)
f15(x) = \sum_{i=1}^{n} |z_i| + \prod_{i=1}^{n} |z_i| | [-10, 10]^n | SR Schwefel's Problem 2.22*
f16(x) = \frac{1}{4000} \sum_{i=1}^{n} z_i^2 - \prod_{i=1}^{n} \cos(\frac{z_i}{\sqrt{i}}) + 1 | [-600, 600]^n | SR Griewank Function*
f17(x) = \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 (1 + 10 \sin^2(\pi y_{i+1})) + (y_n - 1)^2 (1 + \sin^2(2 \pi y_n)), with y_i = 1 + \frac{1}{4}(z_i + 1) | [-10, 10]^n | SR Levy Function*
f18(x) = \sum_{i=1}^{n} (z_i^2 - 10 \cos(2 \pi z_i) + 10) | [-5.12, 5.12]^n | SR Rastrigin Function*
f19(x) = \sum_{i=1}^{n-1} (100 (z_{i+1} - z_i^2)^2 + (z_i - 1)^2) | [-30, 30]^n | SR Rosenbrock Function*
f20(x) = [\frac{1}{n-1} \sum_{i=1}^{n-1} (\sqrt{y_i} + \sqrt{y_i} \sin^2(50 y_i^{0.2}))]^2, with y_i = \sqrt{z_i^2 + z_{i+1}^2} | [-100, 100]^n | SR Schaffer Function*

+ rand() is a random number uniformly generated in the range [0, 1).
* SR stands for Shifted and Rotated. o is a shifting vector and M is a transformation matrix; both can be obtained from [22].
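As a concrete illustration of the definitions in Table II, the sketch below implements a few of the benchmarks in plain Python, together with the boundary penalty u(x_i, a, k, m) shared by the penalized functions f10 and f11. This is an illustrative re-implementation from the formulas above, not the authors' code, and the function names are my own.

```python
import math

def sphere(x):
    # f1: Sphere Model, global minimum 0 at x = (0, ..., 0)
    return sum(xi ** 2 for xi in x)

def ackley(x):
    # f7: Ackley Function
    n = len(x)
    rms = math.sqrt(sum(xi ** 2 for xi in x) / n)
    mean_cos = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return -20 * math.exp(-0.2 * rms) - math.exp(mean_cos) + 20 + math.e

def griewank(x):
    # f8: Griewank Function
    s = sum(xi ** 2 for xi in x) / 4000
    p = 1.0
    for i, xi in enumerate(x, start=1):
        p *= math.cos(xi / math.sqrt(i))
    return s - p + 1

def u(xi, a, k, m):
    # boundary penalty term used by f10 and f11
    if xi > a:
        return k * (xi - a) ** m
    if xi < -a:
        return k * (-xi - a) ** m
    return 0.0
```

All three benchmarks attain their minimum value 0 at the origin, which gives an easy sanity check, e.g. sphere([0.0] * 30) == 0.0.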

TABLE III
SIMULATION RESULTS OF SSA

SSA
Function
Mean Std. Best Median Worst
f1 1.6186e-32 1.0517e-32 5.1057e-35 1.5297e-32 3.7156e-32
f2 1.3793e-17 7.3177e-18 1.1512e-18 1.3011e-17 2.9631e-17
f3 2.3226e-29 1.7107e-29 1.4290e-31 1.9226e-29 5.8041e-29
f4 0.0000e+00 0.0000e+00 0.0000e+00 0.0000e+00 0.0000e+00
f5 4.8837e-03 1.0736e-03 2.4026e-03 5.0144e-03 6.2395e-03
f6 0.0000e+00 0.0000e+00 0.0000e+00 0.0000e+00 0.0000e+00
f7 1.3000e-13 8.3556e-14 1.5399e-14 1.0866e-13 3.3514e-13
f8 1.6209e-19 2.9188e-20 1.0842e-19 1.6263e-19 2.1684e-19
f9 2.1945e-32 5.1712e-33 1.4997e-32 2.1930e-32 3.2715e-32
f10 2.8745e-31 2.2674e-31 1.7195e-32 2.3906e-31 8.1099e-31
f11 4.7916e-30 6.2774e-30 1.8044e-32 2.1485e-30 2.4376e-29
f12 3.2436e+01 4.2338e+00 1.9838e+01 3.2854e+01 3.7810e+01
f13 1.1092e+01 4.7899e+00 1.2542e+00 1.2411e+01 1.7681e+01
f14 4.6586e-03 3.9105e-03 1.8089e-06 3.9915e-03 1.3157e-02
f15 4.4522e+00 7.8705e-01 2.1816e+00 4.5635e+00 5.4082e+00
f16 5.9628e-15 2.0881e-15 1.4640e-15 5.8443e-15 9.5542e-15
f17 5.2174e-24 5.7738e-24 3.3998e-27 3.0967e-24 2.2254e-23
f18 5.7430e+01 8.1881e+00 3.2834e+01 5.9698e+01 6.7658e+01
f19 2.3049e+01 2.1401e+00 1.5032e+01 2.3821e+01 2.5149e+01
f20 2.1660e-02 1.2571e-02 2.7821e-04 2.1166e-02 4.5124e-02
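Each row of Table III summarizes the final objective values over the independent runs. A minimal sketch of how the five columns can be computed, assuming `results` holds the best value found in each run of a minimization problem (the sample standard deviation is assumed here):

```python
import statistics

def summarize(results):
    # Mean / Std. / Best / Median / Worst, as reported per benchmark function.
    # For minimization, the best run is the minimum and the worst is the maximum.
    return {
        "Mean": statistics.mean(results),
        "Std.": statistics.stdev(results),
        "Best": min(results),
        "Median": statistics.median(results),
        "Worst": max(results),
    }
```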

TABLE IV
COMPARISON OF SSA AND OTHER ALGORITHMS

Function / Statistic | SSA | GA | PSO | DE | ABC | FA | CS | GSO

f1 (p-value+ = 9.2703e-163)
  Mean:      1.6186e-32 6.8150e-09 8.1429e-23 5.7379e-23 6.9435e-16 7.6896e-04 3.2867e-09 1.6481e-09
  Std. Dev.: 1.0517e-32 2.6344e-09 8.1835e-23 5.4623e-23 1.2791e-16 1.6383e-04 1.8255e-09 4.1349e-09
f2 (p-value+ = 2.1131e-153)
  Mean:      1.3793e-17 5.4192e-02 4.8868e-07 1.1678e-11 1.7334e-15 2.0743e-02 2.2554e-03 9.9347e-06
  Std. Dev.: 7.3177e-18 8.8430e-02 4.9112e-07 5.9390e-12 2.4011e-16 1.4435e-02 1.0986e-03 8.7398e-06
f3 (p-value+ = 4.7705e-160)
  Mean:      2.3226e-29 3.7126e-01 1.0779e+03 4.0695e-18 6.8511e-16 9.1086e+01 2.5335e-04 1.6283e+00
  Std. Dev.: 1.7107e-29 7.2797e-01 1.0833e+03 4.3340e-18 1.1948e-16 2.6942e+01 1.4650e-04 3.0962e+00
f4 (p-value+ = 9.8282e-50)
  Mean:      0.0000e+00 6.1000e-01 3.3002e-01 0.0000e+00 0.0000e+00 0.0000e+00 0.0000e+00 4.0000e-02
  Std. Dev.: 0.0000e+00 7.8607e-01 3.3166e-01 0.0000e+00 0.0000e+00 0.0000e+00 0.0000e+00 1.9596e-01
f5 (p-value+ = 3.5112e-155)
  Mean:      4.8837e-03 1.9698e+00 2.5199e+01 6.3510e-03 1.7375e-01 1.6558e-02 1.7591e-02 6.8862e-02
  Std. Dev.: 1.0736e-03 8.2113e-01 2.5324e+01 1.7256e-03 3.2898e-02 1.4563e-02 4.8188e-03 3.4288e-02
f6 (p-value+ = 1.6954e-140)
  Mean:      0.0000e+00 3.8550e+00 4.6844e-01 0.0000e+00 4.4409e-18 6.3255e-01 5.2528e-01 9.4351e-03
  Std. Dev.: 0.0000e+00 2.0668e+00 6.1064e-01 0.0000e+00 2.1756e-17 7.8789e-01 5.6932e-01 6.5779e-02
f7 (p-value+ = 2.9529e-158)
  Mean:      1.3000e-13 5.6722e-05 2.3919e-09 2.3620e-12 5.8371e-14 6.6434e-03 1.0809e+00 9.8148e-06
  Std. Dev.: 8.3556e-14 1.2822e-05 2.4039e-09 1.0608e-12 8.6788e-15 7.6209e-04 8.6267e-01 1.2967e-05
f8 (p-value+ = 8.5670e-126)
  Mean:      1.6209e-19 3.7119e-10 8.1129e-03 2.2188e-04 3.2225e-11 1.8924e-03 1.0494e-04 2.6703e-02
  Std. Dev.: 2.9188e-20 2.2458e-10 8.1534e-03 1.2617e-03 2.3664e-10 8.2919e-04 2.0042e-04 2.7549e-02
f9 (p-value+ = 3.3122e-163)
  Mean:      2.1945e-32 4.5492e-10 2.9160e-16 2.8334e-25 6.6181e-16 6.1718e-07 1.4246e-03 6.3460e-13
  Std. Dev.: 5.1712e-33 2.2631e-10 2.9305e-16 3.0374e-25 9.3620e-17 1.4538e-07 1.1765e-02 1.1462e-12
f10 (p-value+ = 6.0028e-111)
  Mean:      2.8745e-31 7.4230e-03 2.2047e-02 6.1402e-24 2.1855e-12 3.0842e-04 2.2907e-04 6.6592e-04
  Std. Dev.: 2.2674e-31 9.1295e-03 2.2157e-02 5.3827e-24 7.7601e-12 1.5625e-03 2.6535e-04 2.6358e-03
f11 (p-value+ = 3.6750e-162)
  Mean:      4.7916e-30 7.2568e-03 4.3000e-01 5.2051e-24 6.6371e-16 3.0179e-06 2.0087e-01 3.0510e-12
  Std. Dev.: 6.2774e-30 2.6451e-02 4.3215e-01 5.0195e-24 1.1808e-16 7.1811e-07 2.3755e-01 7.1578e-12
f12 (p-value+ = 7.2468e-157)
  Mean:      3.2436e+01 7.6512e+00 7.5745e+00 1.6225e+02 2.2382e-15 3.2943e+01 6.4983e+01 1.8929e+00
  Std. Dev.: 4.2338e+00 3.0786e+00 7.6123e+00 1.1805e+01 6.9420e-15 1.1565e+01 9.2798e+00 1.5826e+00
f13 (p-value+ = 2.5420e-127)
  Mean:      1.1092e+01 1.4781e+01 2.8118e+01 1.6402e+01 1.5035e+00 5.9410e+02 1.2948e+08 5.6297e+01
  Std. Dev.: 4.7899e+00 2.2299e+01 2.8258e+01 1.2695e+01 2.0043e+00 1.6087e+03 1.0120e+09 3.7568e+01
f14 (p-value+ = 2.2519e-126)
  Mean:      4.6586e-03 3.1691e-05 6.5088e-02 6.5491e+00 1.8088e-03 8.1934e-03 9.4693e-02 1.1056e-03
  Std. Dev.: 3.9105e-03 6.6392e-05 6.5412e-02 1.1718e+00 1.0109e-03 2.1510e-02 2.8809e-02 2.2111e-03
f15 (p-value+ = 5.7683e-155)
  Mean:      4.4522e+00 3.1990e+01 6.5780e-01 2.0712e-09 2.0378e+01 1.9008e-02 6.0074e+00 1.6142e+01
  Std. Dev.: 7.8705e-01 1.2523e+00 9.8808e-01 2.4517e-09 2.0727e+01 1.4320e-02 5.3251e+00 1.4919e+01
f16 (p-value+ = 1.4636e-128)
  Mean:      5.9628e-15 2.6692e+00 1.0432e-02 7.3960e-05 7.3583e-11 1.8573e-03 1.0367e-03 8.2177e-03
  Std. Dev.: 2.0881e-15 4.5269e-02 1.6043e-02 7.3590e-04 2.1860e-10 1.3790e-03 1.3265e-03 1.2332e-02
f17 (p-value+ = 3.9711e-160)
  Mean:      5.2174e-24 7.0841e+00 2.7163e-02 8.6993e-23 3.5888e+01 6.2336e-07 1.7457e+01 7.2391e-01
  Std. Dev.: 5.7738e-24 7.2264e-05 1.5446e-01 9.9782e-23 5.1643e+00 1.3662e-07 6.8445e+00 1.8788e+00
f18 (p-value+ = 1.1481e-153)
  Mean:      5.7430e+01 4.2661e+02 6.5863e+01 1.8324e+02 1.6630e+02 3.0725e+01 1.1532e+02 7.7452e+01
  Std. Dev.: 8.1881e+00 8.3767e+01 2.6207e+01 9.9285e+00 1.4618e+01 1.1185e+01 1.8382e+01 2.0511e+01
f19 (p-value+ = 2.4963e-109)
  Mean:      2.3049e+01 1.1908e+06 3.0758e+02 1.1821e+03 2.6697e+01 1.4113e+03 5.1723e+07 8.0138e+02
  Std. Dev.: 2.1401e+00 5.8639e+04 7.0726e+02 3.3350e+03 2.6906e+00 2.8898e+03 4.9086e+08 1.5719e+03
f20 (p-value+ = 2.0179e-138)
  Mean:      2.1660e-02 1.7862e+00 2.1125e-01 8.3706e+00 2.6016e-01 8.3555e-03 3.7660e-01 2.9258e-01
  Std. Dev.: 1.2571e-02 5.6554e-01 4.1501e-01 1.3529e+00 9.7776e-02 2.0417e-02 1.3835e-01 3.5404e-01
+ This p-value is generated by the Kruskal-Wallis one-way analysis of variance.

TABLE V
PAIR-WISE WILCOXON SIGNED RANK TEST RESULTS
(≈ indicates no statistically significant difference between the two algorithms.)

Function | Wilcoxon Signed Rank Test Order

f1  PSO < SSA < DE < ABC < GSO < CS < GA < FA
f2  SSA < ABC < PSO ≈ DE < GSO < CS < FA < GA
f3  SSA < DE < ABC < CS < GA < GSO < FA < PSO
f4  ABC ≈ CS ≈ DE ≈ FA ≈ SSA ≈ GSO ≈ PSO < GA
f5  SSA < DE < CS ≈ FA < GSO < ABC < GA < PSO
f6  ABC ≈ DE ≈ SSA < GSO < CS ≈ FA ≈ PSO < GA
f7  ABC ≈ PSO < SSA < DE < GSO < GA < FA < CS
f8  DE < SSA < ABC < GA < CS < FA < PSO < GSO
f9  SSA < PSO < DE < ABC < GSO < GA < FA < CS
f10 SSA < DE < ABC < GSO < FA < CS < GA ≈ PSO
f11 SSA < DE < ABC < GSO < GA < FA < CS < PSO
f12 ABC < GSO < GA < FA ≈ SSA < PSO < CS < DE
f13 ABC < GA ≈ SSA < DE < PSO < GSO < FA < CS
f14 GA < GSO < ABC < FA ≈ SSA < PSO < CS < DE
f15 DE < FA < PSO < CS ≈ SSA < GSO ≈ ABC < GA
f16 DE < SSA < ABC < CS < FA < GSO ≈ PSO < GA
f17 PSO < SSA < DE < FA ≈ GSO < GA < CS < ABC
f18 FA < PSO ≈ SSA < GSO < CS < ABC < DE < GA
f19 SSA < ABC < DE ≈ FA ≈ GSO ≈ PSO < CS < GA
f20 FA < SSA < PSO < ABC ≈ GSO < CS < GA < DE
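Each ordering above is built from pair-wise Wilcoxon signed rank tests on the paired per-run results of two algorithms. A minimal sketch of the test statistic W = min(W+, W-), which ranks absolute differences, averages tied ranks, and drops zero differences (the p-value step is omitted; `x` and `y` are assumed to hold the paired per-run results):

```python
def wilcoxon_w(x, y):
    # paired Wilcoxon signed-rank statistic W = min(W+, W-)
    d = [a - b for a, b in zip(x, y) if a != b]   # zero differences are dropped
    absd = sorted((abs(v), i) for i, v in enumerate(d))
    ranks = [0.0] * len(d)
    i = 0
    while i < len(absd):
        j = i
        while j + 1 < len(absd) and absd[j + 1][0] == absd[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied 1-based ranks
        for k in range(i, j + 1):
            ranks[absd[k][1]] = avg
        i = j + 1
    w_pos = sum(r for r, v in zip(ranks, d) if v > 0)
    w_neg = sum(r for r, v in zip(ranks, d) if v < 0)
    return min(w_pos, w_neg)
```

For instance, wilcoxon_w([1, 2, 3, 4, 5], [2, 4, 2, 6, 5]) returns 1.5: the zero difference is discarded, the two pairs of tied absolute differences get ranks 1.5 and 3.5, and the positive side is the smaller sum.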

TABLE VI
RANK SUMMARY OF STATISTICAL ASSESSMENT RESULTS

Function SSA GA PSO DE ABC FA CS GSO


f1 2 7 1 3 4 8 6 5
f2 1 8 3.5 3.5 2 7 6 5
f3 1 5 8 2 3 7 4 6
f4 4 8 4 4 4 4 4 4
f5 1 7 8 2 6 3.5 3.5 5
f6 2 8 6 2 2 6 6 4
Sum 11 43 30.5 16.5 21 35.5 29.5 29
Order SSA < DE < ABC < GSO < CS < PSO < FA < GA
f7 3 6 1.5 4 1.5 7 8 5
f8 2 4 7 1 3 6 5 8
f9 1 6 2 3 4 7 8 5
f10 1 7.5 7.5 2 3 5 6 4
f11 1 5 8 2 3 6 7 4
f12 4.5 3 6 8 1 4.5 7 2
f13 2.5 2.5 5 4 1 7 8 6
f14 4.5 1 6 8 3 4.5 7 2
Sum 19.5 35 43 32 19.5 47 56 36
Order ABC ≈ SSA < DE < GA < GSO < PSO < FA < CS
f15 4.5 8 3 1 6.5 2 4.5 6.5
f16 2 8 6.5 1 3 5 4 6.5
f17 2 6 1 3 8 4.5 7 4.5
f18 2.5 8 2.5 7 6 1 5 4
f19 1 8 4.5 4.5 2 4.5 7 4.5
f20 2 7 3 8 4.5 1 6 4.5
Sum 14 45 20.5 24.5 30 18 33.5 30.5
Order SSA < FA < PSO < DE < ABC < GSO < CS < GA
Total SSA GA PSO DE ABC FA CS GSO
Sum 44.5 123 94 73 70.5 100.5 119 95.5
Order SSA < ABC < DE < PSO < GSO < FA < CS < GA
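The rank sums and order lines in Table VI are plain arithmetic over the per-function ranks. As a check, the Group I summary (f1-f6) can be reproduced from the ranks listed in the table:

```python
# Per-algorithm ranks on the Group I benchmarks (f1-f6), copied from Table VI.
ranks = {
    "SSA": [2, 1, 1, 4, 1, 2],
    "GA":  [7, 8, 5, 8, 7, 8],
    "PSO": [1, 3.5, 8, 4, 8, 6],
    "DE":  [3, 3.5, 2, 4, 2, 2],
    "ABC": [4, 2, 3, 4, 6, 2],
    "FA":  [8, 7, 7, 4, 3.5, 6],
    "CS":  [6, 6, 4, 4, 3.5, 6],
    "GSO": [5, 5, 6, 4, 5, 4],
}

rank_sums = {alg: sum(r) for alg, r in ranks.items()}
# A smaller rank sum means better overall performance on the group.
order = sorted(rank_sums, key=rank_sums.get)
```

This reproduces Sum = 11 for SSA and the Group I ordering SSA < DE < ABC < GSO < CS < PSO < FA < GA.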

TABLE VII
SSA PARAMETER SENSITIVITY ANALYSIS RESULTS

popSize:             20    30    40    50    60    70    80    90    100   150   200   300

ra = 0.1, rj = 0.01: -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.1, rj = 0.05: +/-/- +/-/- +/-/+ +/-/+ +/-/+ +/-/+ -/-/+ -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.1, rj = 0.1:  -/-/- -/-/- -/+/+ -/-/+ -/+/+ -/-/- -/+/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.1, rj = 0.3:  -/-/- -/-/- -/+/- -/-/+ -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.5, rj = 0.01: -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.5, rj = 0.05: +/-/- +/-/- +/-/+ +/+/+ +/-/+ +/-/+ -/-/- -/-/+ -/-/- -/-/- -/-/- -/-/-
ra = 0.5, rj = 0.1:  -/-/- -/-/- -/+/+ -/+/+ -/+/+ -/+/- -/+/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.5, rj = 0.3:  -/-/- -/+/- -/+/- -/-/+ -/+/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.9, rj = 0.01: -/-/- -/-/- -/-/- +/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.9, rj = 0.05: +/-/- +/+/- +/+/+ +/+/+ +/+/+ +/+/+ -/+/+ -/+/+ -/+/+ -/-/- -/-/- -/-/-
ra = 0.9, rj = 0.1:  +/-/- -/+/- -/+/+ +/+/+ +/+/+ -/+/- -/+/- -/-/- -/-/- -/-/- -/-/- -/-/-
ra = 0.9, rj = 0.3:  -/-/- -/+/- -/+/- -/+/+ -/+/- -/-/- -/-/+ -/-/- -/-/- -/-/- -/-/- -/-/-

TABLE VIII
COMPARISON OF SSA WITH SSA-vbr AND SSA-jmp

Function / Statistic | SSA | SSA-vbr | SSA-jmp

f1   Mean:      1.6186e-32  7.5982e-30  1.0247e+04
     Std. Dev.: 1.0517e-32  4.8058e-29  3.9729e+03
     p-value+:  -           1.2343e-15  3.8939e-18
f2   Mean:      1.3793e-17  2.9822e-02  7.1596e+01
     Std. Dev.: 7.3177e-18  2.9672e-01  2.1250e+01
     p-value+:  -           5.9886e-17  3.8955e-18
f3   Mean:      2.3226e-29  3.2121e+00  1.0240e+08
     Std. Dev.: 1.7107e-29  5.2661e+00  3.8158e+07
     p-value+:  -           3.8963e-18  3.8933e-18
f4   Mean:      0.0000e+00  0.0000e+00  1.0359e+04
     Std. Dev.: 0.0000e+00  0.0000e+00  3.4085e+03
     p-value+:  -           1.0000e+00  3.8902e-18
f5   Mean:      4.8837e-03  1.5121e-02  5.4745e+00
     Std. Dev.: 1.0736e-03  5.3973e-03  2.9087e+00
     p-value+:  -           4.5308e-18  3.8966e-18
f6   Mean:      0.0000e+00  8.5475e-03  2.7865e+04
     Std. Dev.: 0.0000e+00  5.7840e-02  9.3677e+03
     p-value+:  -           3.1250e-02  3.8933e-18
f7   Mean:      1.3000e-13  7.2151e-08  1.9004e+01
     Std. Dev.: 8.3556e-14  6.2557e-07  1.4653e+00
     p-value+:  -           1.3103e-14  3.8913e-18
f8   Mean:      1.6209e-19  2.9428e-02  9.2798e+01
     Std. Dev.: 2.9188e-20  3.4996e-02  3.2780e+01
     p-value+:  -           1.1166e-15  3.8937e-18
f9   Mean:      2.1945e-32  1.0441e-29  4.3036e+01
     Std. Dev.: 5.1712e-33  7.0947e-29  1.7863e+01
     p-value+:  -           5.1617e-16  3.8948e-18
f10  Mean:      2.8745e-31  8.0267e-27  2.5753e+07
     Std. Dev.: 2.2674e-31  7.5469e-26  2.0365e+07
     p-value+:  -           1.2371e-16  3.8946e-18
f11  Mean:      4.7916e-30  1.0367e-03  5.1778e+06
     Std. Dev.: 6.2774e-30  1.0315e-02  4.4233e+06
     p-value+:  -           8.7411e-17  3.8941e-18
f12  Mean:      3.2436e+01  4.6690e+01  2.2035e+02
     Std. Dev.: 4.2338e+00  1.0501e+01  3.5697e+01
     p-value+:  -           2.6788e-16  3.8928e-18
f13  Mean:      1.1092e+01  3.8639e+01  1.2318e+09
     Std. Dev.: 4.7899e+00  2.5945e+01  8.6285e+08
     p-value+:  -           1.9045e-16  3.8963e-18
f14  Mean:      4.6586e-03  1.4267e-01  8.1084e+00
     Std. Dev.: 3.9105e-03  2.2850e-01  3.5511e+00
     p-value+:  -           2.3413e-17  3.8966e-18
f15  Mean:      4.4522e+00  9.7472e+00  8.4798e+01
     Std. Dev.: 7.8705e-01  1.2617e+01  1.2918e+01
     p-value+:  -           1.0580e-15  3.8966e-18
f16  Mean:      5.9628e-15  1.0873e-02  9.7567e+01
     Std. Dev.: 2.0881e-15  1.2371e-02  3.4543e+01
     p-value+:  -           7.1167e-18  3.8924e-18
f17  Mean:      5.2174e-24  1.8819e+00  4.6920e+01
     Std. Dev.: 5.7738e-24  4.5739e+00  1.4798e+01
     p-value+:  -           1.3975e-17  3.8950e-18
f18  Mean:      5.7430e+01  1.0433e+02  2.4612e+02
     Std. Dev.: 8.1881e+00  2.1145e+01  3.8021e+01
     p-value+:  -           3.8898e-18  3.8900e-18
f19  Mean:      2.3049e+01  3.1019e+01  1.2359e+09
     Std. Dev.: 2.1401e+00  1.3710e+01  8.8791e+08
     p-value+:  -           2.9509e-15  3.8966e-18
f20  Mean:      2.1660e-02  6.2229e-01  9.2919e+00
     Std. Dev.: 1.2571e-02  6.8248e-01  3.9432e+00
     p-value+:  -           5.9433e-18  3.8966e-18

+ This p-value is generated by the Wilcoxon signed rank test.