
Proceedings of the 2006 IEEE International Conference on Information Acquisition
August 20-23, 2006, Weihai, Shandong, China

An AND-OR Fuzzy Neural Network and Ship Application

Jianghua Sui, Guang Ren
Marine Engineering College, Dalian Maritime University
Dalian, Liaoning Province, 116026, China
suijhg@tom.com, rengg@dlmu.edu.cn
Abstract - A novel multilayer feed-forward AND-OR fuzzy neural network (AND-OR FNN) and a piecewise optimization approach are proposed in this paper. The equivalence between the architecture of the AND-OR FNN and fuzzy weighted Mamdani inference is proved. The main superiority is shown not only in reducing the input space through the special inner structure of the neurons, but also in auto-extracting the rule base through the structure optimization of the network. The optimization procedure consists of two phases: first the blueprint of the network is reduced by GA (Genetic Algorithm) and PA (Pruning Algorithm); in the second phase, the parameters are refined by ACO (Ant Colony Optimization). An AND-OR FNN ship controller is designed based on input-output data to validate the method. Simulation results demonstrate that the size of the rule base is decreased remarkably, the performance is much better than ordinary fuzzy control, and the approach is practicable, simple and effective.
I. INTRODUCTION

Fuzzy neural networks combine the theories of fuzzy logic and neural networks, including learning, association, identification, self-adaptation and fuzzy information processing. Logic neurons have always received much attention as important components of neural networks, and there are many achievements from model design to algorithm study. Glorennec [1] proposed a general artificial neuron to realize Lukasiewicz logic operations. Yager [2] employed a group of OWA fuzzy aggregation operators to form the OWA neuron. Pedrycz and Rocha [3] proposed aggregation neurons and referential neurons by integrating fuzzy logic and neural networks and discussed the relation between the resulting network structure and the practical problem. Pedrycz [4-5] constructed knowledge-based networks from AND and OR neurons to solve classification and pattern recognition problems. Bailey [6] extended the single hidden layer to two hidden layers to improve complex modeling. Pedrycz and Reformat designed a fuzzy neural network built from AND and OR neurons to model housing prices in Boston [7].

We consider a multi-input single-output (MISO) fuzzy logic-driven control system based on Pedrycz's work. Pedrycz [7] transformed the T norm and S norm into product and probabilistic-sum operators, forming a continuous and smooth function to be optimized by GA and BP; but there is no exact symbolic expression for every node because the structure is uncertain. In this paper, the AND-OR FNN (AND-OR fuzzy neural network) is first defined; the in-degree and out-degree of a neuron and the connectivity of a layer are defined, and Zadeh operators are employed in order to infer the symbolic expression of every layer, which forms a continuous but non-smooth function. The equivalence between the architecture of the AND-OR FNN and the fuzzy weighted Mamdani inference is proved in order to utilize the AND-OR FNN to optimize fuzzy rules. The piecewise optimization of the AND-OR FNN consists of two phases: first the blueprint of the network is reduced by GA and PA; in the second phase the parameters are refined by ACS (Ant Colony System). Finally this approach is applied to design an AND-OR FNN ship controller. Simulation results show that the performance is much better than an ordinary fuzzy controller.

II. FUZZY NEURONS AND TOPOLOGY OF THE AND-OR FNN

A. Logic-driven AND and OR fuzzy neurons

The AND and OR fuzzy neurons are two fundamental classes of logic-driven fuzzy neurons. The basic formulas governing the functioning of these elements are constructed with the aid of a T norm and an S norm (see Fig. 1 and Fig. 2). The following definitions of the two fuzzy neurons show their inherent capability of reducing the input space.

Definition 1. Let X = [x_1, x_2, ..., x_n] be the input variables and W = [w_1, w_2, ..., w_n] be adjustable connections (weights), all confined to the unit interval. Then

y = T_{i=1}^{n} ( w_i S x_i )   (1)

is the AND fuzzy neuron, which realizes the T-S norm composition shown in Fig. 1.

Definition 2. Let X = [x_1, x_2, ..., x_n] be the input variables and W = [w_1, w_2, ..., w_n] be adjustable connections (weights), confined to the unit interval. Then

y = S_{i=1}^{n} ( w_i T x_i )   (2)

is the OR fuzzy neuron, which realizes the S-T norm composition shown in Fig. 2.

AND and OR neurons are both mappings [0,1]^n → [0,1]; with Zadeh operators (T = min, S = max) their expressions become (3) and (4).

Fig. 1 AND neuron. Fig. 2 OR neuron.



y = ∧_{i=1}^{n} ( w_i ∨ x_i )   (3)

y = ∨_{i=1}^{n} ( w_i ∧ x_i )   (4)

Owing to the special composition of the neurons, for binary inputs and connections their functions are equal to the standard gates of a digital circuit. For an AND neuron, the closer to 0 the connection w_i is, the more important the corresponding input x_i is to the output. For an OR neuron, the closer to 1 the connection w_i is, the more important the corresponding input x_i is to the output. Thus the values of the connections become the criterion for eliminating irrelevant input variables and reducing the input space.
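To make equations (1)-(4) concrete, the following is a minimal sketch of the two neurons under Zadeh operators (T = min, S = max); NumPy and the function names are our own choice, not part of the paper.

```python
import numpy as np

def and_neuron(x, w):
    """AND fuzzy neuron, eq. (3): y = min_i( max(w_i, x_i) ).
    A connection w_i close to 0 lets x_i act on the output (relevant input);
    w_i close to 1 masks x_i completely (irrelevant input)."""
    return float(np.min(np.maximum(w, x)))

def or_neuron(x, w):
    """OR fuzzy neuron, eq. (4): y = max_i( min(w_i, x_i) ).
    Here the roles are reversed: w_i close to 1 keeps x_i relevant."""
    return float(np.max(np.minimum(w, x)))

x = np.array([0.8, 0.3, 0.6])
print(and_neuron(x, np.array([0.0, 0.0, 1.0])))  # third input masked -> min(0.8, 0.3) = 0.3
print(or_neuron(x, np.array([1.0, 0.0, 0.0])))   # only the first input passes -> 0.8
```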
B. Several notations about AND, OR neurons

Definition 3. Let w_i be the connection value and x_i be the ith input variable. Then

RD(x_i) = w_i ∈ [0,1]   (5)

is the relevant degree between the input x_i and the neuron's output. For an AND neuron, if RD(x_i) = 0 then x_i is a more important feature for the output; if RD(x_i) = 1 then x_i is an irrelevant feature for the output and can be cancelled. For an OR neuron, vice versa. The relevant degree vector (RDV) and relevant degree matrix (RDM) are the vector and matrix made up of the connections, respectively, according to the dimensions of X; they become the threshold used to block irrelevant input variables and reduce the input space.

Definition 4. In-degree comes from directed graphs in graph theory, and a neural network is one kind of directed graph. The in-degree of the ith AND neuron, d+(AND_i), is the number of its input variables; the in-degree of the jth OR neuron, d+(OR_j), is the number of its input variables.

Definition 5. The out-degree of the ith AND neuron, d-(AND_i), is the number of its output variables; the out-degree of the jth OR neuron, d-(OR_j), is the number of its output variables.
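As a small illustration of Definitions 3-5, the sketch below reads relevant degrees and in-/out-degrees off the connection matrices; the 0.5 threshold deciding when a connection counts as active, and the convention of reading out-degrees column-wise from the next layer's RDM, are assumptions for illustration only.

```python
import numpy as np

def and_in_degrees(rdm_and, eps=0.5):
    """Definition 4 for AND neurons: a connection counts as active
    (relevant input) when w is close to 0, cf. RD(x_i) = w_i in eq. (5)."""
    return (rdm_and < eps).sum(axis=1)

def or_in_degrees(rdm_or, eps=0.5):
    """Definition 4 for OR neurons: a connection is active when v is close to 1."""
    return (rdm_or > 1.0 - eps).sum(axis=1)

def and_out_degrees(rdm_or, eps=0.5):
    """Definition 5 (assumed reading): out-degree of an AND neuron = number of
    OR neurons that actively use it, i.e. column count of v close to 1 in RDM(OR)."""
    return (rdm_or > 1.0 - eps).sum(axis=0)

rdm_and = np.array([[0.1, 0.9, 0.2],    # AND neuron 1 uses inputs 1 and 3
                    [0.8, 0.0, 0.9]])   # AND neuron 2 uses input 2 only
print(and_in_degrees(rdm_and))          # [2 1]
```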
C. The architecture of AND-OR FNN

This feed-forward AND-OR FNN consists of five layers (Fig. 3, MISO case), i.e. the input layer, the fuzzification layer, the double hidden layers (one consisting of AND neurons, the other of OR neurons) and the defuzzification output layer, corresponding respectively to the four parts of an FLS (Fuzzy Logic System): fuzzy generator, fuzzy inference, rule base and fuzzy eliminator. Here the fuzzy inference and the fuzzy rule base are integrated into the double hidden layers: the inference mechanism behaves as the inference function of the hidden neurons. Thus the rule base can be auto-generated by training the AND-OR FNN on input-output data.

Fig. 3 Architecture of AND-OR FNN (input, fuzzification, AND, OR and defuzzification layers).

Both W and V are the two connection matrices; they also serve as the relevant degree matrices (RDM) introduced above. The vector H holds the membership functions of the consequents. The numbers of neurons in the successive layers are n, n×t, m, s and 1, respectively (t is the number of fuzzy partitions).

Definition 6. The connectivity of a layer is the maximum in-degree over the neurons in this layer, including the double hidden layers:

Con(AND) = max_i d+(AND_i)   (6)
Con(OR) = max_j d+(OR_j)   (7)

where AND_i is the ith AND neuron and d+(AND_i) is its in-degree; OR_j is the jth OR neuron and d+(OR_j) is its in-degree.

Remark 1. Con(AND) ≤ n (n is the number of input variables); Con(OR) ≤ m (m is the number of AND neurons).

D. The exact expression of every node in the AND-OR FNN

According to the definitions and the physical structure background, the AND-OR FNN model is derived as in Fig. 3. The functions of the nodes in each layer are described as follows.

Layer 1 (input layer): For every node i in this layer, the input and the output are related by

O^1_i = x_i   (8)

where O^1_i denotes the output of the ith neuron in layer 1, i = 1, 2, ..., n. Here the signal only transfers to the next layer without processing.

Layer 2 (fuzzification layer): In this layer, each neuron represents the membership function of a linguistic value (small, ..., very large) A_ij. A Gaussian is adopted as the membership function:

O^2_ij = μ_{A_ij}(x_i) = exp( -(x_i - m_ij)^2 / σ_ij^2 )   (9)

where i = 1,2,...,n, j = 1,2,...,t, and m_ij and σ_ij are the modal value and the spread of the jth fuzzy partition of the ith input variable.

Layer 3 (AND hidden layer): This layer is composed of AND neurons. Based on the definitions above, its function can be expressed as follows.
O^3_k = T_{i=1}^{d+(AND_k)} ( w_kr S O^2_ij ) = ∧_{i=1}^{d+(AND_k)} ( w_kr ∨ μ_{A_ij}(x_i) )   (10)

where k = 1,2,...,t_n, j = 1,2,...,t, i = 1,2,...,n, r = (i-1)×t + j, d+(AND_k) is the in-degree of the kth AND neuron, and t_n is the total number of AND neurons in this layer.

Note 2. If d+(AND_k) ≥ 2, the indices i of its inputs must be different; that is, the inputs of one AND neuron must come from different input variables x_i.

Layer 4 (OR hidden layer): This layer is composed of OR neurons. Based on the definitions above, its function can be expressed as follows.

O^4_l = S_{k=1}^{d+(OR_l)} ( v_lk T O^3_k ) = ∨_{k=1}^{d+(OR_l)} ( v_lk ∧ O^3_k )   (11)

where k = 1,2,...,t_n, l = 1,2,...,s, d+(OR_l) is the in-degree of the lth OR neuron, and s is the total number of OR neurons.

Layer 5 (defuzzification layer): There is only one node in this layer, which includes a forward computation and backward training. The center-of-gravity method is adopted for the forward computation:

z = O^5 = ( Σ_{l=1}^{s} O^4_l h_l ) / ( Σ_{l=1}^{s} O^4_l )   (12)

where l = 1,2,...,s and h_l is the representative value of the membership function of the lth consequent. The backward direction only imports the training data; the two directions are time-shared. Its function can be expressed as

O^5 = μ_{B_l}(z_d) = exp( -(z_d - m_l)^2 / σ_l^2 )   (13)
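The following sketch strings equations (8)-(12) together into one forward pass, under the assumption of fully stored connection matrices (masked connections simply drop out of the min/max); the array shapes and variable names are ours, not the paper's.

```python
import numpy as np

def forward(x, m, sigma, W, V, h):
    """One forward pass through the five-layer AND-OR FNN (eqs. 8-12).
    x        : (n,)       crisp inputs                      -- layer 1
    m, sigma : (n, t)     Gaussian modal values / spreads   -- layer 2, eq. (9)
    W        : (tn, n*t)  AND connections                   -- layer 3, eq. (10)
    V        : (s, tn)    OR connections                    -- layer 4, eq. (11)
    h        : (s,)       representative consequent values  -- layer 5, eq. (12)
    """
    # Layer 2: membership degrees, flattened so column r = (i-1)*t + j
    mu = np.exp(-((x[:, None] - m) ** 2) / sigma ** 2).ravel()   # (n*t,)
    # Layer 3: AND neurons, O3_k = min_r max(w_kr, mu_r)
    o3 = np.min(np.maximum(W, mu[None, :]), axis=1)              # (tn,)
    # Layer 4: OR neurons, O4_l = max_k min(v_lk, O3_k)
    o4 = np.max(np.minimum(V, o3[None, :]), axis=1)              # (s,)
    # Layer 5: center-of-gravity defuzzification
    return float(np.dot(o4, h) / (np.sum(o4) + 1e-12))

# toy example: n = 2 inputs, t = 3 partitions, tn = 9 AND neurons, s = 3 OR neurons
rng = np.random.default_rng(0)
n, t, tn, s = 2, 3, 9, 3
z = forward(rng.random(n),
            m=rng.random((n, t)), sigma=np.full((n, t), 0.5),
            W=rng.random((tn, n * t)), V=rng.random((s, tn)),
            h=np.array([-1.0, 0.0, 1.0]))
print(z)
```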
III. FUNCTIONAL EQUIVALENCE BETWEEN THE FUZZY WEIGHTED MAMDANI INFERENCE AND THE AND-OR FNN

This section demonstrates the functional equivalence between the AND-OR FNN and the fuzzy weighted Mamdani inference system. Although the two models are motivated from different origins (the AND-OR FNN from physiology, fuzzy inference systems from cognitive science), their functional equivalence under minor restrictions is illustrated.

A. Fuzzy weighted Mamdani inference

The fuzzy weighted Mamdani inference system [9] utilizes local weights and global weights to avoid a serious shortcoming: that all propositions in the antecedent part are assumed to have equal importance, and that a number of rules executed in an inference path leading to a specified goal, or the same rule employed in various inference paths leading to distinct final goals, may have relative degrees of importance.

Assume the fuzzy IF-THEN rules with consequent B_l are represented as:

Rule: if x_1 is A_1j and ... and x_n is A_nj, then y is B_l   (14)

where x_1, ..., x_n are the input variables, y is the output, and A and B are the fuzzy sets of the input and output, respectively; w_ij are the local weights of the antecedent part and v_1, ..., v_s are the global weights of the rules; i = 1,2,...,n, j = 1,2,...,t, l = 1,2,...,s; t is the total number of antecedent fuzzy sets and s is the total number of consequent fuzzy sets. The rules sharing the same consequent B_l are collected to form s complex rules as follows:

Rule 1: if x_1 is A_1j' and ... and x_n is A_nj', then y is B_1; or ... or if x_1 is A_1j'' and ... and x_n is A_nj'', then y is B_1;
...
Rule s: if x_1 is A_1k' and ... and x_n is A_nk', then y is B_s; or ... or if x_1 is A_1k'' and ... and x_n is A_nk'', then y is B_s   (15)

where i', i'', j', j'', k', k'', l', l'' ∈ [1, t].

From the view of general fuzzy inference, the firing strength (or weight) of a single rule is usually obtained by the min or multiplication operator; for a complex rule, the firing strength Λ_l is the union of all the single-rule weights, which can be expressed as

Λ_l = ∨( μ_{A_1j'×...×A_nj'}(x_1,...,x_n), ..., μ_{A_1j''×...×A_nj''}(x_1,...,x_n) ) = ∨_{p=1}^{λ_l} ∧_{i=1}^{T_i} μ_{A_ij}(x_i)   (16)

where λ_l is the number of single rules with the same consequent B_l and T_i is the number of antecedent parts in a rule.

From the view of fuzzy weighted Mamdani inference, the local weights and global weights are incorporated by a <∨, ∧> mode as follows:

Λ_l = ∨_{p=1}^{λ_l} [ v_lp ∧ ( ∧_{i=1}^{T_i} ( w_ij ∨ μ_{A_ij}(x_i) ) ) ]   (17)

In general, the center-of-gravity method is adopted for defuzzification:

y = ( Σ_{l=1}^{s} Λ_l h_l ) / ( Σ_{l=1}^{s} Λ_l )   (18)

where h_l is the representative value of the consequent fuzzy set B_l.

B. Functional equivalence and its implication

From (8)-(18), it is obvious that the functional equivalence between an AND-OR FNN and a fuzzy weighted Mamdani inference system can be established if the following hold:
1) The number of AND neurons is equal to the number of fuzzy IF-THEN rules.
2) The number of OR neurons is equal to the number of fuzzy complex IF-THEN rules.

3) The T-norm and S-norm used to compute each rule's firing strength are min and max, respectively.
4) Both the AND-OR FNN and the fuzzy inference system under consideration use the same center-of-gravity method to derive their overall outputs.

Under these conditions, when RDM(w_ij) = 0 and RDM(v_lj) = 1, the AND-OR FNN is completely connected, which means every part of the antecedent is equally important to the output and the network is equal to the general fuzzy inference. But this falls short of practice. When w_ij ∈ [0,1] and v_lj ∈ [0,1], every part of the antecedent has a different importance to the output and every rule has a different importance to the final output.
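A hedged sketch of the weighted-Mamdani side of the equivalence, eqs. (16)-(18): local weights enter through max(w, μ), global weights through min(v, ·), and the same center-of-gravity formula closes the computation. The data structures used here to group single rules into complex rules are our own invention, not the paper's notation.

```python
import numpy as np

def weighted_mamdani(mu, rules, groups, h):
    """Fuzzy weighted Mamdani inference, eqs. (16)-(18), with Zadeh operators.
    mu     : dict  (i, j) -> membership value mu_Aij(x_i)
    rules  : list of (antecedent, local_weights); antecedent is a list of (i, j)
    groups : list of (rule_indices, global_weights), one complex rule per consequent B_l
    h      : (s,) representative values of the consequent sets
    """
    # firing strength of each single rule: min_i max(w_ij, mu_Aij(x_i))
    single = [min(max(w, mu[ij]) for ij, w in zip(ant, ws)) for ant, ws in rules]
    # complex-rule strength, eq. (17): Lambda_l = max_p min(v_lp, single_p)
    lam = np.array([max(min(v, single[p]) for p, v in zip(idx, vs))
                    for idx, vs in groups])
    # center-of-gravity defuzzification, eq. (18)
    return float(np.dot(lam, h) / (lam.sum() + 1e-12))

mu = {(1, 1): 0.7, (1, 2): 0.2, (2, 1): 0.4, (2, 2): 0.9}
rules = [([(1, 1), (2, 1)], [0.0, 0.1]),     # rule 1: x1 is A11 and x2 is A21
         ([(1, 2), (2, 2)], [0.0, 0.0])]     # rule 2: x1 is A12 and x2 is A22
groups = [([0], [1.0]), ([1], [1.0])]        # each consequent collects one rule here
print(weighted_mamdani(mu, rules, groups, h=np.array([-1.0, 1.0])))
```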
IV. OPTIMIZATION FOR THE AND-OR FNN

In general, gradient-based optimization is often chosen for ANNs. Pedrycz [8] has proved that the output of the AND neuron and its derivative are heavily affected by increasing values of n. Moreover, in this paper the final output of the AND-OR FNN is not a smooth function, so gradient-based optimization is clearly not preferred. A far more comprehensive optimization policy is proposed: a hybrid learning methodology comprising three fundamental techniques, namely genetic optimization, a pruning algorithm and an ant colony system.

A. Structure optimization

The key part is the structure design. The AND-OR FNN could be over-parameterized and in this case produce an over-trained net, which leads to worse generalization capacity; that is why unnecessary weights and parameters should be eliminated. There is no definitive method for structure optimization. In this paper, GA (Genetic Algorithm) and PA (Pruning Algorithm) are adopted to optimize the structure, avoiding local optima and reducing the number of connections.

a. GA optimization

GA (Genetic Algorithm) is a very effective method for solving the structure optimization problem; its superiority lies in robust, random, global and parallel processing [9]. Structure optimization is transferred to species evolution by GA to obtain the optimal solution.

1) Structure code mode

Parameter encoding is the first phase of GA optimization. Parameters are often transformed to unsigned binary strings. Here the structure parameters of the two hidden layers are considered, focusing on the connection matrices as follows:

RDM(AND) = [w_ij]_{m×(n×t)}   (19)
RDM(OR) = [v_lj]_{s×m}   (20)

where i, j, m, n, t and s are the same as before. Lining up the two matrices into a binary string as the initial population represents the structure status, which exactly suits GA. In the initialization, RDM(w_ij) = 0 and RDM(v_lj) = 1, so the AND-OR FNN starts completely connected. The authoritative opinion of experts can be put into the initial population as seeds to reach the best initialization.

2) Objective function

Proceeding with the numeric data, we carry out the genetic optimization of the network. The parameters of the GA can be chosen experimentally. The objective is expressed as the sum of squared errors between target values (data) and outputs of the network. This error has to be minimized, which stands in contrast with the classic use of fitness functions in GA, whose values are maximized; taking the reciprocal, as in (21), brings it in line with fitness maximization.

fitness = 1 / Σ (z_d - z)^2   (21)
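A sketch of the structure encoding of eqs. (19)-(20) and the fitness of eq. (21); the chromosome layout and the assumed `net_output(x, W, V)` callback are illustrative choices, not the paper's exact implementation.

```python
import numpy as np

def decode(chromosome, shape_and, shape_or):
    """Split one binary chromosome into the two structure matrices of eqs. (19)-(20).
    A 1 in RDM(AND) masks a connection (w close to 1); a 0 in RDM(OR) does the same."""
    n_and = shape_and[0] * shape_and[1]
    rdm_and = np.asarray(chromosome[:n_and], dtype=float).reshape(shape_and)
    rdm_or = np.asarray(chromosome[n_and:], dtype=float).reshape(shape_or)
    return rdm_and, rdm_or

def fitness(chromosome, data, net_output, shape_and, shape_or):
    """Eq. (21): reciprocal of the sum of squared errors over the training set.
    `net_output(x, W, V)` is an assumed callback wrapping the network's forward pass."""
    W, V = decode(chromosome, shape_and, shape_or)
    sse = sum((zd - net_output(x, W, V)) ** 2 for x, zd in data)
    return 1.0 / (sse + 1e-12)

# completely connected seed, matching the paper's initialization RDM(w)=0, RDM(v)=1
shape_and, shape_or = (9, 6), (3, 9)          # 9 AND neurons, n*t = 6, 3 OR neurons
seed = np.concatenate([np.zeros(9 * 6, dtype=int), np.ones(3 * 9, dtype=int)])
W0, V0 = decode(seed, shape_and, shape_or)
print(W0.shape, V0.shape)                      # (9, 6) (3, 9)
```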
b. Pruning algorithm

The pruning of the network is a process of removing unnecessary parameters and nodes during training without losing generalization capacity. The best architecture is sought using GA in conjunction with pruning algorithms. There are three situations:

1) For an AND neuron, a connection equal or close to zero means that the corresponding input impacts the output of the neuron; the connection can be pruned away if its value is close to one. For an OR neuron the reverse situation occurs: a connection close to one implies that the specific input is essential and affects the output of the neuron, and the connection can be pruned away if its value is close to zero.

2) For RDM(AND), the number of zeros in every row gives the in-degree of that AND neuron; the node can be pruned away if the row is full of ones. For RDM(OR), the number of ones in every row gives the in-degree of that OR neuron; the node can be pruned away if the row is full of zeros.

3) For RDM(AND), identical rows in the matrix mean the corresponding nodes are redundant and can be pruned away. For RDM(OR), identical rows mean the nodes are in conflict and need to be pruned away.
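The three pruning situations could be applied to the decoded matrices roughly as follows; the closeness threshold `eps` is an assumption, and only the AND-side node pruning is shown in full.

```python
import numpy as np

def prune(rdm_and, rdm_or, eps=0.1):
    """Pruning situations of Section IV.A.b (eps is assumed):
    1) connections: for AND neurons drop w close to 1, for OR neurons drop v close to 0;
    2) nodes: an AND row of all ones has in-degree 0 and is removed;
    3) duplicate AND rows are redundant and only one copy is kept."""
    rdm_and = rdm_and.copy()
    rdm_or = rdm_or.copy()
    rdm_and[rdm_and > 1.0 - eps] = 1.0          # situation 1, AND side
    rdm_or[rdm_or < eps] = 0.0                  # situation 1, OR side
    keep, seen = [], set()                      # situations 2 and 3 on AND neurons
    for k, row in enumerate(rdm_and):
        key = tuple(np.round(row, 3))
        if np.all(row == 1.0) or key in seen:   # dead or duplicate node
            continue
        seen.add(key)
        keep.append(k)
    return rdm_and[keep], rdm_or[:, keep]

W = np.array([[0.0, 0.95, 0.2], [0.0, 0.95, 0.2], [1.0, 1.0, 1.0]])
V = np.array([[0.9, 0.05, 0.8]])
Wp, Vp = prune(W, V)
print(Wp.shape, Vp.shape)   # one AND neuron survives: (1, 3) (1, 1)
```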
B. Ant colony optimization as parametric refinement

As is well known, most neural networks are trained with the BP algorithm, whose main drawbacks are easily getting stuck in local minima and requiring long training times; it also requires the output function to be smooth, which the ∨, ∧ operators cannot satisfy at all. Therefore the ACO (Ant Colony Optimization) algorithm [10], proposed by Dorigo et al. (1991), is adopted here to optimize the connection values of the AND-OR FNN. Its main features are high-precision solutions, fast search speed, convergence to the global optimum and greedy heuristic search. The basic idea is as follows: assume there are m parameters, consisting of the connections retained by GA and PA.

First, all the parameters are sequenced and each is given N random candidate values in [0, 1]; for parameter p_i (1 ≤ i ≤ m) these candidates compose a set denoted I_{p_i}. Then the number of ants is set to h. The ants start from a random connection and search for the optimal solution, choosing the next connection value from every set according to the amount of pheromone:

Prob(τ_j(I_{p_i})) = τ_j(I_{p_i}) / Σ_{g=1}^{N} τ_g(I_{p_i})   (22)

where τ_j(I_{p_i}) is the amount of pheromone on the jth candidate p_j(I_{p_i}).

When an ant has chosen a connection value in every set, it arrives at the food source and returns to its nest through the path it has just traversed; the amount of pheromone corresponding to the chosen connections is then adjusted by the following rules:

τ_j(I_{p_i})(t + m) = ρ·τ_j(I_{p_i})(t) + Σ_{k=1}^{h} Δτ_j^k(I_{p_i})   (23)

where ρ (0 < ρ < 1) is the pheromone evaporation coefficient and Δτ_j^k(I_{p_i}) is the pheromone the kth ant leaves on the jth candidate in set I_{p_i}, described as follows:

Δτ_j^k(I_{p_i}) = Q / e_k  if p_j(I_{p_i}) is selected by the kth ant, and 0 otherwise   (24)

where Q is a constant adjustment factor and e_k = |z_d - z| is the network output error obtained with the group of connection values selected by the kth ant. Obviously, the smaller the error, the greater the increment of the corresponding amount of pheromone.

Finally, all the ants find the best solution for the connection values.
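A compact sketch of the ACO refinement described by eqs. (22)-(24); the candidate discretization, the iteration budget and the `error` callback are assumptions, and the paper's exact update schedule may differ.

```python
import numpy as np

def aco_refine(m, error, N=30, ants=30, rho=0.65, Q=10.0, iters=50, seed=0):
    """Ant colony refinement of m connection values, eqs. (22)-(24).
    Each parameter p_i gets N candidate values in [0, 1]; an ant picks one value
    per parameter with probability proportional to pheromone (eq. 22); pheromone
    is evaporated and reinforced by Q / e_k (eqs. 23-24). `error(values)` is the
    network output error for one complete set of connection values (assumed given)."""
    rng = np.random.default_rng(seed)
    candidates = rng.random((m, N))          # the sets I_{p_i}
    tau = np.ones((m, N))                    # pheromone table
    best, best_err = None, np.inf
    for _ in range(iters):
        delta = np.zeros_like(tau)
        for _ in range(ants):
            # eq. (22): roulette-wheel choice of one candidate per parameter
            probs = tau / tau.sum(axis=1, keepdims=True)
            idx = np.array([rng.choice(N, p=probs[i]) for i in range(m)])
            values = candidates[np.arange(m), idx]
            e = error(values)
            delta[np.arange(m), idx] += Q / (e + 1e-12)   # eq. (24)
            if e < best_err:
                best, best_err = values, e
        tau = rho * tau + delta              # eq. (23), applied after all ants return
    return best, best_err

# toy run: pretend the "network error" is the distance to a hidden target vector
target = np.linspace(0.1, 0.9, 19)
sol, err = aco_refine(19, lambda v: float(np.abs(v - target).sum()), iters=10)
print(round(err, 3))
```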
V. SHIP CONTROL

According to the principle of manual steering, this paper makes full use of the advantages of fuzzy logic and neural networks, and an AND-OR FNN controller is presented to adapt to changes of the navigation environment and obtain information about the ship's maneuverability automatically. The structure and parameters of the AND-OR FNN controller are learned by GA, PA and ACO. Fuzzy rules can be auto-obtained from a group of sample data generated by a fuzzy control system. Fig. 4 shows a block diagram of the AND-OR FNN autopilot. Test results show that an effective method has been provided for the improvement of ship steering control.

Fig. 4 AND-OR FNN autopilot for ship course-keeping.

The two input variables of the AND-OR FNN ship controller are the heading error Δψ = ψ_r(k+1) - ψ(k) and the yaw rate y(k) = dψ/dt. The control action generated by the autopilot is the commanded rudder angle δ(k). The ranges of values (universes of discourse) for the autopilot inputs are (-20°, 20°) and (-2.5°/sec, 2.5°/sec), respectively. It is usually required that the rudder should move from 35° port to 35° starboard within 30 s.

A. The source of the experimental data set

In order to obtain complete and exact training data, a fuzzy control system for a tanker is simulated with a nonlinear Nomoto model as the controlled plant (gains K = 0.16 sec^-1, T = 100 sec). The fuzzy controller has two inputs, the error in the ship heading Δψ(k) and the change in that error y(k). Every input variable is fuzzified into 11 triangular membership functions, thus forming 121 rules. The output of the fuzzy controller is the rudder input δ. We treat the tanker as a continuous-time system controlled by the fuzzy controller, which is implemented on a digital computer with a sampling interval of T = 0.1 sec. From the view of the maneuvering indices, it satisfies the indices of a 15000 tdw tanker [11]; a fuzzy closed-loop control system is designed with a fuzzy rule base from experience. The positive direction is defined such that a positive rudder angle turns the ship to the right. Center of gravity is adopted as the defuzzification method.

The reference input heading ψ_r is increased by 45 degrees every 2000 sec. The digital simulation of the closed loop is completed with a sampling period h = 10 sec, sampling the state variables Δψ, y and the controlled variable δ. The ship heading and the desired ship heading are shown in Fig. 5, and the resulting rudder angle change data sequence is obtained from the conventional fuzzy control.
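For orientation only, a heavily simplified sketch of how such a training set could be logged: a first-order Nomoto approximation and a placeholder PD rudder law stand in for the paper's nonlinear tanker model and 121-rule fuzzy controller, so the model, gains and numbers below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def generate_samples(K=0.16, T=100.0, dt=0.1, t_end=8000.0, log_every=10.0):
    """Log (heading error, yaw rate, rudder angle) triples from a closed-loop run.
    First-order Nomoto model: T * r_dot + r = K * delta. The PD rudder law below
    is only a placeholder for the paper's 121-rule fuzzy controller."""
    psi, r = 0.0, 0.0                       # heading [rad], yaw rate [rad/s]
    samples, t_next = [], 0.0
    for step in range(int(t_end / dt)):
        t = step * dt
        psi_ref = np.deg2rad(45.0) * (1 + int(t // 2000))   # +45 deg every 2000 s
        err = psi_ref - psi
        delta = np.clip(2.0 * err - 80.0 * r,               # placeholder gains
                        -np.deg2rad(35), np.deg2rad(35))    # rudder limits
        r += dt * (K * delta - r) / T                       # Nomoto dynamics
        psi += dt * r
        if t >= t_next:                                     # sample every 10 s
            samples.append((err, r, delta))
            t_next += log_every
    return np.array(samples)

data = generate_samples()
print(data.shape)        # roughly (800, 3): one (d_psi, y, delta) row per 10 s
```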

Fig. 5 Ship heading (solid) and desired ship heading.

B. The structure design and simulation results

In this section, an AND-OR FNN ship controller is constructed. Firstly, Δψ(k) and y(k) are the two input variables and δ(k) is the output variable; all are fuzzified into 3 Gaussian membership functions with an overlap degree of 0.5, so there are 9 rules. According to the previous analysis, there are 9 AND neurons and 3 OR neurons in the AND-OR FNN. The structure parameters are initialized with all connections present to form the structure string, that is, RDM(w_ij) = 0, RDM(v_lj) = 1. The input-output data {Δψ(k), y(k), δ(k), k = 1,2,...} is used to train and test. The structure and parameters are optimized by GA, PA and ACO; some parameters are chosen experimentally, including population size = 100, crossover rate = 0.7, mutation rate = 0.01 and tournament selection, and the ACO parameters are m = 19, p_i ∈ [0,1], N = 30, ρ = 0.65, h = 30, Q = 10. The resulting structure is transformed into the AND-OR FNN shown in Fig. 6. Obviously there are 5 AND neurons and 3 OR neurons left, that is, 5 fuzzy rules and 3 complex rules. The resulting performance is compared with the fuzzy control system. The test data set comes from a different part of the same sequence than the training data introduced above. Fig. 7 shows the error comparison between the test results and the objective data. The simulation results illustrate the effectiveness of the proposed method.

Fig. 6 The AND-OR FNN ship controller structure.

Fig. 7 The performance index compared.
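To illustrate how the surviving connections can be read back as the auto-extracted rule base, a hedged sketch follows; the thresholds, linguistic labels and consequent names are our own choices, not the ones used in the paper.

```python
import numpy as np

def extract_rules(rdm_and, rdm_or, labels, consequents, eps=0.1):
    """Read the trained structure back as fuzzy rules: for each OR neuron l, every
    AND neuron k with v_lk close to 1 contributes one rule whose antecedent is the
    set of fuzzified inputs with w_kr close to 0 (eps and the labels are assumed)."""
    t = len(labels[0])                        # fuzzy partitions per input variable
    rules = []
    for l, v_row in enumerate(rdm_or):
        for k in np.where(v_row > 1.0 - eps)[0]:
            ants = [f"x{r // t + 1} is {labels[r // t][r % t]}"
                    for r in np.where(rdm_and[k] < eps)[0]]
            rules.append(f"IF {' AND '.join(ants)} THEN delta is {consequents[l]}")
    return rules

labels = [["N", "Z", "P"], ["N", "Z", "P"]]               # 3 Gaussian sets per input
W = np.array([[0.0, 1.0, 1.0, 0.0, 1.0, 1.0],             # x1 is N and x2 is N
              [1.0, 0.0, 1.0, 1.0, 0.0, 1.0]])            # x1 is Z and x2 is Z
V = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
print("\n".join(extract_rules(W, V, labels, ["port", "zero", "starboard"])))
```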


VI. CONCLUSIONS

In this paper, we have proposed a novel AND-OR FNN and a piecewise optimization approach. The symbolic expressions of every node in the AND-OR FNN are deduced in detail, and the input space is reduced by the special inner structure of the AND and OR neurons. The equivalence to the fuzzy weighted Mamdani inference is proved, so auto-extracting the fuzzy rule base is equivalent to optimizing the architecture of the AND-OR FNN. This novel approach has been validated by the design of an AND-OR FNN ship controller. The simulation results illustrate that the approach is practicable, simple and effective and that the performance is much better than an ordinary fuzzy controller.

ACKNOWLEDGEMENTS

This paper is supported by the Doctoral Foundation of Education Committees (2003151005) and the Ministry of Communication of P. R. China (200332922505).

REFERENCES

[1] P. Y. Glorennec, "Neuro-fuzzy logic," in Proc. IEEE-FUZZ, New Orleans, pp. 512-518, 1996.
[2] R. Yager, "OWA neurons: A new class of fuzzy neurons," in Proc. IEEE-FUZZ, San Diego, pp. 2316-2340, 1992.
[3] W. Pedrycz and A. F. Rocha, "Fuzzy-set based models of neurons and knowledge-based networks," IEEE Trans. on Fuzzy Systems, vol. 1, no. 4, pp. 254-266, 1993.
[4] K. Hirota and W. Pedrycz, "Knowledge-based networks in classification problems," Fuzzy Sets and Systems, vol. 159, no. 3, pp. 271-279, 1993.
[5] W. Pedrycz and G. Succi, "Genetic granular classifiers in modeling software quality," The Journal of Systems and Software, vol. 76, pp. 277-285, 2005.
[6] S. A. Bailey and Y. H. Chen, "A two layer network using the OR/AND neuron," in Proc. IEEE-FUZZ, Anchorage, pp. 1566-1571, 1998.
[7] W. Pedrycz and M. Reformat, "Genetically optimized logic models," Fuzzy Sets and Systems, vol. 150, no. 2, pp. 351-371, 2005.
[8] D. S. Yeung and E. C. C. Tsang, "Weighted fuzzy production rules," Fuzzy Sets and Systems, vol. 88, pp. 299-313, 1997.
[9] S. L. Zeng, L. J. Song and W. H. Xu, "Neural network structure optimization based on genetic algorithm," Journal of Jishou University, vol. 26, no. 3, pp. 118-120, 2005.
[10] H. B. Duan, Ant Colony Algorithm: Theory and Applications. Science Press, 2005.
[11] G. X. Yang, "Study on ship motion hybrid intelligent control and its interacting simulation based on virtual reality," PhD Thesis, Dalian Maritime University, Dalian.

