
NEURO CONTROL

AIM:

To design a neural network for a simple transfer function and to train it so that the error is minimized.

THEORY:
An artificial neural network is a system based on the operation of biological neural networks; in other words, it is an emulation of a biological neural system. Artificial neural networks have different architectures, which consequently require different types of algorithms. An artificial neural network is built with a systematic, step-by-step procedure to optimize a performance criterion or to follow some implicit internal constraint, which is commonly referred to as the learning rule.
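
As a simple illustration of a learning rule, the sketch below applies one gradient-descent (delta-rule) update to a single linear neuron; the input, weights, target and learning rate are arbitrary example values, not data from this experiment.

% Sketch: one delta-rule weight update for a single linear neuron
x = [0.5; 1.0];          % example input vector (assumed values)
w = [0.1; -0.2];         % initial weights (assumed values)
b = 0;                   % bias
target = 0.8;            % desired output for this input
eta = 0.05;              % learning rate (assumed value)
y = w'*x + b;            % neuron output (linear activation)
e = target - y;          % error used by the learning rule
w = w + eta*e*x;         % weight update that reduces the squared error
b = b + eta*e;           % bias update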

MATHEMATICAL MODEL:

Mathematically, this process is described in the figure

The strength of the connection between an input and a neuron is given by the value of the weight. Negative weight values reflect inhibitory connections, while positive values designate excitatory connections. The next two components model the actual activity within the neuron cell. An adder sums all the inputs modified by their respective weights; this operation is referred to as a linear combination. Finally, an activation function controls the amplitude of the neuron's output. An acceptable output range is usually between 0 and 1, or between -1 and 1.
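
A minimal sketch of this neuron model is given below, assuming a two-input neuron with a logistic sigmoid activation; the inputs and weights are arbitrary example values.

% Sketch: a single artificial neuron (assumed example values)
x = [0.6; -0.3];         % inputs
w = [0.8; -0.5];         % weights (positive = excitatory, negative = inhibitory)
b = 0.1;                 % bias
v = w'*x + b;            % adder: linear combination of the weighted inputs
y = 1/(1 + exp(-v));     % activation function (logistic sigmoid), output between 0 and 1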

NEURAL NETWORK TOPOLOGIES

• Feed-forward neural networks: Here the data flow from the input to the output units is strictly feed-forward. The data processing can extend over multiple (layers of) units, but no feedback connections are present (a sketch contrasting the two topologies follows this list).
• Recurrent neural networks: These contain feedback connections. Contrary to feed-forward networks, the dynamical properties of the network are important. In some cases, the activation values of the units undergo a relaxation process such that the network evolves to a stable state in which these activations no longer change.
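
The following sketch contrasts the two topologies using Neural Network Toolbox functions; the hidden-layer sizes and feedback delays are arbitrary choices for illustration.

% Sketch: creating the two topologies (example layer sizes)
ffnet = feedforwardnet(10);     % strictly feed-forward: data flow from input to output only
recnet = layrecnet(1:2, 10);    % layer-recurrent: feedback connections with delays of 1 and 2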

TRAINING OF NETS

A neural network has to be configured such that the application of a set of inputs produces the desired set of outputs. This is called training. We can categorise the learning situations into distinct sorts. These are:

• Supervised learning: The network is trained by providing it with input and matching output patterns. These input-output pairs can be provided by an external teacher (a minimal sketch of this paradigm follows this list).
• Unsupervised learning: Here an output unit is trained to respond to clusters of patterns within the input. Unlike the supervised learning paradigm, there is no a priori set of categories into which the patterns are to be classified; rather, the system must develop its own representation of the input stimuli.
• Reinforcement learning: This type of learning may be considered an intermediate form of the above two. Here the learning machine performs some action on the environment and receives a feedback response from the environment.
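
A minimal sketch of the supervised paradigm is given below; the data, the network size and the use of the toolbox functions train and sim are assumptions for illustration, not the settings used in this experiment.

% Sketch: supervised training on input/target pairs (example data)
P = [0 0 1 1; 0 1 0 1];         % inputs, one sample per column
T = [0 1 1 0];                  % matching target outputs (the "external teacher")
net = feedforwardnet(5);        % small feed-forward network (assumed size)
net = train(net, P, T);         % adjust the weights to reduce the output error
Y = sim(net, P);                % network response after training
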
ADVANTAGES OF NEURAL NETS:

• A neural network can perform tasks that a linear program cannot.
• When an element of the neural network fails, the network can continue to operate because of its parallel nature.
• A neural network learns and does not need to be reprogrammed.
• It can be applied to a wide range of applications and is comparatively straightforward to implement.

DISADVANTAGES

• The neural network needs training before it can operate.
• The architecture of a neural network differs from that of conventional microprocessors and therefore needs to be emulated.
• Large neural networks require long processing times.

PROCEDURE:
• Open a model file and connect all the blocks as shown in the diagram.

• Obtain the response.

• Use the arrays from the workspace and create a neural network.

• Plot the closed loop response.

• Plot a comparison of the neural net and closed-loop responses.

• Using nntool, train the net to give the correct response.

• Then export the error of the network and plot it.

CODE

% The variables in (input) and out (closed-loop output) are taken from the
% workspace after running the Simulink model; t is the corresponding time vector.
net = newff(in, out, 13);       % feed-forward net with 13 hidden neurons
y = sim(net, in);               % neural net response to the same input
plot(t, out)                    % closed-loop response
plot(t, out, t, y)              % comparison of closed loop and neural net
a = in'                         % input array in row form for nntool
b = out'                        % output array in row form for nntool
nntool                          % train the network interactively and export the results
plot(network1_errors)           % error of the trained network exported from nntool
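
The nntool GUI training can also be carried out from the command line; the sketch below assumes the row-form arrays a and b created above and an arbitrary limit of 1000 epochs.

% Sketch: command-line alternative to the nntool GUI (assumed settings)
net = newff(a, b, 13);          % feed-forward net, 13 hidden neurons
net.trainParam.epochs = 1000;   % maximum number of training epochs (assumed value)
net = train(net, a, b);         % train the network (default performance: mean squared error)
yhat = sim(net, a);             % trained network response
e = b - yhat;                   % network error
plot(e)                         % error plot corresponding to the exported nntool error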

OUTPUT:
Input array a:

>> a = in'

a =

Columns 1 through 12

1 1 1 1 1 1 1 0 0 1 1 0

Columns 13 through 24

0 1 1 1 0 0 0 1 1 1 1 1

Columns 25 through 36

0 0 0 1 1 1 0 0 0 0 1 1

Columns 37 through 48

1 1 0 0 0 1 1 1 1 1 1 0

Columns 49 through 60

0 0 0 1 1 1 1 0 0 0 0 1

Columns 61 through 72

1 1 1 0 0 0 0 0 0 1 1 1

Columns 73 through 84

1 0 0 1 1 1 0 0 0 0 1 1
Columns 85 through 96

1 1 0 0 0 0 1 1 1 1 0 0

Columns 97 through 108

0 1 1 1 1 1 0 0 0 0 1 1

Columns 109 through 120

0 0 0 1 1 0 0 1 1 1 0 0

Columns 121 through 132

1 1 1 1 1 0 0 0 1 1 1 1

Columns 133 through 144

0 0 0 0 1 1 1 0 0 0 0 1

Columns 145 through 156

1 1 0 0 0 0 0 0 1 1 1 0

Columns 157 through 168

0 0 0 0 1 1 1 1 1 0 0 0

Columns 169 through 180

1 1 1 1 1 0 0 0 1 1 1 0

Columns 181 through 192

0 0 0 1 1 1 1 0 0 0 1 1

Columns 193 through 198

1 0 0 0 0 0

Output array:

>> b = out'

b =
Columns 1 through 7

0 0.0002 0.0012 0.0062 0.0309 0.1452 0.1813

Columns 8 through 14

0.1813 0.1640 0.1640 0.3156 0.3156 0.2584 0.2584

Columns 15 through 21

0.3928 0.4506 0.4506 0.4506 0.4077 0.4077 0.4077

Columns 22 through 28

0.4641 0.5151 0.5151 0.5151 0.5151 0.4660 0.4660

Columns 29 through 35

0.4660 0.5169 0.5169 0.4677 0.4232 0.4232 0.4232

Columns 36 through 42

0.5277 0.6133 0.6133 0.6133 0.6133 0.5550 0.5550

Columns 43 through 49

0.5550 0.5973 0.6356 0.7017 0.7301 0.7301 0.7301

Columns 50 through 56

0.6875 0.6606 0.6606 0.6606 0.6804 0.6929 0.6929

Columns 57 through 63

0.6929 0.6525 0.6270 0.6270 0.6270 0.6487 0.6625

Columns 64 through 70

0.6625 0.6625 0.6238 0.5874 0.5208 0.4441 0.4441

Columns 71 through 77

0.4441 0.5054 0.5448 0.5448 0.4930 0.4930 0.4930

Columns 78 through 84
0.5412 0.5412 0.4897 0.4431 0.4010 0.4010 0.4010

Columns 85 through 91

0.5095 0.5095 0.5095 0.5095 0.4172 0.3775 0.3775

Columns 92 through 98

0.3775 0.4903 0.5388 0.5388 0.5388 0.4876 0.4876

Columns 99 through 105

0.4876 0.5363 0.5804 0.5804 0.5804 0.5804 0.4752

Columns 106 through 112

0.4300 0.4300 0.4842 0.4842 0.4842 0.4382 0.4382

Columns 113 through 119

0.4916 0.4916 0.4448 0.4448 0.4448 0.4977 0.4977

Columns 120 through 126

0.4503 0.4503 0.5026 0.5500 0.6315 0.6666 0.6666

Columns 127 through 133

0.5458 0.4938 0.4938 0.4938 0.5234 0.5420 0.5420

Columns 134 through 140

0.5420 0.5104 0.4904 0.4904 0.5202 0.5389 0.5389

Columns 141 through 147

0.5389 0.5075 0.4876 0.4876 0.5175 0.5364 0.5364

Columns 148 through 154

0.5364 0.5051 0.4756 0.4217 0.3974 0.3974 0.3974

Columns 155 through 161

0.4547 0.4547 0.4547 0.4114 0.3723 0.3723 0.3723


Columns 162 through 168

0.3723 0.4861 0.5792 0.5792 0.5792 0.5792 0.5241

Columns 169 through 175

0.5241 0.5241 0.5694 0.6104 0.6104 0.6104 0.6104

Columns 176 through 182

0.5523 0.5523 0.5523 0.5949 0.5949 0.5383 0.4871

Columns 183 through 189

0.4871 0.4871 0.4871 0.5800 0.6200 0.6200 0.6200

Columns 190 through 196

0.5610 0.5610 0.5610 0.6028 0.6028 0.6028 0.5454

Columns 197 through 198

0.4935 0.4465
Closed loop response:

[Plot: closed-loop output vs. time (sec), 0 to 10 s]
Neural net response:

[Plot: y (neural net output) vs. time (sec), 0 to 10 s]

Network error plot:

[Plot: network error vs. time (sec), 0 to 120 s]
INFERENCE:
The neural net is trained over many epochs with the objective of minimizing the error (minimum mean squared error).

RESULT:

Thus a neural net is designed for a simple transfer function system and is trained to obtain minimum error.
