Puneet Kr Singh
M.Tech. (FT), 1st Year
P K Singh, F O E, D E I
http://pksingh.webstarts.com/student_community.html
What is a Neural Network?
ANN as a Brain-Like Computer
An artificial neural network (ANN) is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. This means that:

z = w0 + w1x1 + ... + wnxn

f(x1, ..., xn) = φ(w0 + w1x1 + ... + wnxn)

where f is the function to be learned, x1, ..., xn are the inputs, φ is the activation function, and z is the weighted sum.
Artificial Neuron:
Classical Activation Functions

Linear activation: φ(z) = z
Logistic activation: φ(z) = 1 / (1 + e^(-z))

(Figure: plots of the linear and logistic activation functions.)
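The neuron model above can be sketched in a few lines of Python; the function names below are illustrative, not from the original slides:

```python
import math

def neuron(x, w, phi):
    """Weighted sum z = w0 + w1*x1 + ... + wn*xn, passed through activation phi."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return phi(z)

linear = lambda z: z                                # linear activation
logistic = lambda z: 1.0 / (1.0 + math.exp(-z))     # logistic activation

# z = 0.5 + 1*1 + (-1)*2 = -0.5
out_lin = neuron([1.0, 2.0], [0.5, 1.0, -1.0], linear)     # -0.5
out_log = neuron([1.0, 2.0], [0.5, 1.0, -1.0], logistic)   # about 0.378
```

The same weighted sum z is reused with either activation; only φ changes.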
Neural Network
A neural network learns by adjusting its weights so that it can correctly classify the training data and hence, after the training phase, classify previously unseen data.
Learning
Learning is the procedure of estimating the parameters of the neurons (setting up the weights) so that the whole network can perform a specific task.

There are two types of learning:
• Supervised learning
• Unsupervised learning
Threshold Neuron (Perceptron)
• The output of a threshold neuron is binary, while its inputs may be either binary or continuous.
• If the inputs are binary, a threshold neuron implements a Boolean function.
• The Boolean alphabet {1, -1} is usually used in neural network theory instead of {0, 1}.
Threshold Boolean Functions: Geometrical Interpretation

"OR" (disjunction) is an example of a threshold (linearly separable) Boolean function: the "-1"s are separated from the "1"s by a line.

XOR is an example of a non-threshold (not linearly separable) Boolean function: it is impossible to separate the "1"s from the "-1"s by any single line.

x1  x2 | OR      x1  x2 | XOR
 1   1 |  1       1   1 |  1
 1  -1 | -1       1  -1 | -1
-1   1 | -1      -1   1 | -1
-1  -1 | -1      -1  -1 |  1

(Figure: the four points (±1, ±1) plotted in the plane for each function.)
Threshold Neuron: Learning
A main property of a neuron, and of a neural network, is the ability to learn from the environment and to improve performance through learning.
Threshold Neuron: Learning
Let T be the desired output of a neuron (or of a network) for a certain input vector, and let Y be the actual output of the neuron.
Error-Correction Learning
If T ≠ Y, then δ = T - Y is the error.

The goal of learning is to adjust the weights in such a way that for the new actual output we have:

Ỹ = Y + δ = T

That is, the updated actual output must coincide with the desired output.

The error-correction learning rule determines how the weights must be adjusted to ensure that the updated actual output coincides with the desired output. With W = (w0, w1, ..., wn) and X = (x1, ..., xn):

w̃0 = w0 + αδ
w̃i = wi + αδxi,  i = 1, ..., n

where α is the learning rate (it should be equal to 1 for the threshold neuron, when the function to be learned is Boolean).
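The rule above can be sketched for a single threshold neuron. The training set below is an illustrative linearly separable function in the {1, -1} alphabet (the left-hand table from the geometrical-interpretation slide):

```python
def sign(z):
    return 1 if z >= 0 else -1          # threshold activation

# linearly separable training data in the {1, -1} alphabet
data = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0.0, 0.0, 0.0]   # w0 (bias), w1, w2
alpha = 1             # learning rate: 1 for a Boolean function

for _ in range(10):                      # epochs
    errors = 0
    for (x1, x2), T in data:
        Y = sign(w[0] + w[1]*x1 + w[2]*x2)
        delta = T - Y                    # error: delta = T - Y
        if delta != 0:
            errors += 1
            w[0] += alpha * delta        # w0 <- w0 + alpha*delta
            w[1] += alpha * delta * x1   # wi <- wi + alpha*delta*xi
            w[2] += alpha * delta * x2
    if errors == 0:                      # converged: all samples correct
        break
```

For a linearly separable function this loop converges in a few epochs; for XOR it would cycle forever, which is why a multi-neuron network is needed.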
A Simplest Network
(Figure: inputs x1 and x2 feed Neuron 1 and Neuron 2, whose outputs feed Neuron 3.)
Solving XOR problem using the simplest network
XOR is not a threshold function, but it can be computed by the simplest network as a composition of threshold functions: x1 ⊕ x2 = f3( f1(x1, x2), f2(x1, x2) ), where f1 and f2 are implemented by neurons N1 and N2, and f3 by neuron N3.

(Figure: x1 and x2 feed threshold neurons N1 and N2, whose outputs feed N3. The weights (w0, w1, w2) shown are (1, -3, 3) for N1, (3, 3, -1) for N2, and (-1, 3, 3) for N3.)
P K Singh, F O E, D E I
Solving XOR problem using the simplest network
x1  x2 |  z1  f1 |  z2  f2 |  z3 | f3 = x1 ⊕ x2
 1   1 |   1   1 |   5   1 |   5 |  1
 1  -1 |  -5  -1 |   7   1 |  -1 | -1
-1   1 |   7   1 |  -1  -1 |  -1 | -1
-1  -1 |   1   1 |   1   1 |   5 |  1

(z1, z2, z3 are the weighted sums of neurons N1, N2, N3; f1, f2, f3 are their threshold outputs.)
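The table above can be reproduced with three threshold neurons. The weights below are reconstructed from the figure and computation table, so they should be checked against the original slide:

```python
def sign(z):
    return 1 if z >= 0 else -1          # threshold activation

# (w0, w1, w2) for N1, N2 and the output neuron N3
N1 = (1, -3, 3)
N2 = (3, 3, -1)
N3 = (-1, 3, 3)

def xor(x1, x2):
    f1 = sign(N1[0] + N1[1]*x1 + N1[2]*x2)   # hidden neuron N1
    f2 = sign(N2[0] + N2[1]*x1 + N2[2]*x2)   # hidden neuron N2
    return sign(N3[0] + N3[1]*f1 + N3[2]*f2) # output neuron N3

results = [xor(x1, x2) for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]]
# results == [1, -1, -1, 1], matching the table
```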
Neural Networks
Components – biological plausibility
• Neurone / node
• Synapse / weight

Recurrent networks
• Multidirectional flow of information
• Memory / sense of time
• Complex temporal dynamics (e.g. CPGs)
• Various training methods (Hebbian, evolution)
• Often better biological models than FFNs
BACK PROPAGATION
For each sample, the weights are modified to minimize the error between the network's classification and the actual classification.
Steps in the Back-propagation Algorithm

(Figure: the algorithm steps shown on the original slides.)
Back propagation Formula
Output nodes:  Errk = Ok (1 - Ok)(Tk - Ok)

Hidden nodes:  Errj = Oj (1 - Oj) Σk Errk wjk

Node output (sigmoid of the weighted input Ij):  Oj = 1 / (1 + e^(-Ij))

Input vector: xi; output vector: Ok.
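The two error formulas can be sketched as one training step on a small 3-2-1 network (the layer sizes match the example on the next slide; the learning rate and random initialization range are assumptions):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
n_in, n_hid = 3, 2
# weights and biases initialized with random numbers from -1.0 to 1.0
w_ih = [[random.uniform(-1, 1) for _ in range(n_hid)] for _ in range(n_in)]
w_ho = [random.uniform(-1, 1) for _ in range(n_hid)]
theta_h = [random.uniform(-1, 1) for _ in range(n_hid)]
theta_o = random.uniform(-1, 1)

def forward(x):
    O_h = [sigmoid(sum(x[i] * w_ih[i][j] for i in range(n_in)) + theta_h[j])
           for j in range(n_hid)]
    O_k = sigmoid(sum(O_h[j] * w_ho[j] for j in range(n_hid)) + theta_o)
    return O_h, O_k

def train_step(x, T, lr=0.9):
    global theta_o
    O_h, O_k = forward(x)
    Err_k = O_k * (1 - O_k) * (T - O_k)                        # output-node error
    Err_h = [O_h[j] * (1 - O_h[j]) * Err_k * w_ho[j]           # hidden-node error
             for j in range(n_hid)]
    for j in range(n_hid):                                     # gradient updates
        w_ho[j] += lr * Err_k * O_h[j]
        theta_h[j] += lr * Err_h[j]
        for i in range(n_in):
            w_ih[i][j] += lr * Err_h[j] * x[i]
    theta_o += lr * Err_k

# repeated presentations of one sample drive the error down
x, T = [1, 0, 1], 1
_, before = forward(x)
for _ in range(100):
    train_step(x, T)
_, after = forward(x)
```

Note that the hidden-node errors are computed with the old output weights before any update, as the formula requires.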
Example of Back propagation
Network: 3 inputs, 2 hidden neurons, 1 output.
Initialize the weights with random numbers from -1.0 to 1.0.
The biases θ4, θ5, θ6 are also initialized randomly.
Example: Voice Recognition

Data
• Sources: Steve Simpson, David Raubenheimer
• Format: frequency distribution (60 bins)
• Analogy: cochlea
Network architecture
• Feed-forward network
• 60 inputs (one for each frequency bin)
• 6 hidden
• 2 outputs (0-1 for "Steve", 1-0 for "David")
Presenting the data

(Figure: example input spectra for Steve and David.)
Presenting the data (untrained network)

Steve:  outputs 0.43, 0.26
David:  outputs 0.73, 0.55
Calculate error

Steve (target 0, 1):  |0.43 - 0| = 0.43,  |0.26 - 1| = 0.74
David (target 1, 0):  |0.73 - 1| = 0.27,  |0.55 - 0| = 0.55
Backprop error and adjust weights

Steve:  0.43 + 0.74 = 1.17 (total error)
David:  0.27 + 0.55 = 0.82 (total error)
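The per-speaker error totals can be checked directly by summing the absolute output errors for each sample:

```python
targets = {"Steve": (0, 1), "David": (1, 0)}
outputs = {"Steve": (0.43, 0.26), "David": (0.73, 0.55)}

# sum of absolute errors over the two output units, per speaker
total_error = {
    name: round(sum(abs(o - t) for o, t in zip(outputs[name], targets[name])), 2)
    for name in targets
}
# total_error == {"Steve": 1.17, "David": 0.82}
```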
Presenting the data (trained network)

Steve:  outputs 0.01, 0.99
David:  outputs 0.99, 0.01
Results – Voice Recognition

(Figure: recognition results shown on the original slide.)
Neural Network as Function Approximation
Stabilizing Controller
This scheme has been applied to the control of robot-arm trajectory, where a proportional controller was used as the stabilizing feedback controller.

The total input that enters the plant is the sum of the feedback control signal and the feed-forward control signal, which is calculated from the inverse dynamics model (the neural network).

That model uses the desired trajectory as its input and the feedback control signal as its error signal. As the NN training advances, the feedback signal converges to zero.

The neural network controller thus learns to take over from the feedback controller. The advantage of this architecture is that we can start with a stable system, even though the neural network has not yet been adequately trained.
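A toy sketch of this feedback-error-learning idea, with a scalar linear plant and a single adaptive weight standing in for the neural network (all numbers here are illustrative assumptions, not from the original slides):

```python
# Plant: y[k+1] = a*y[k] + u[k].  The feedforward term w*r stands in for
# the inverse-dynamics neural network; it is trained using the feedback
# control signal as its error signal (feedback-error learning).
a, Kp, lr, r = 0.5, 0.5, 0.1, 1.0   # plant pole, P gain, learning rate, set point
w, y = 0.0, 0.0                     # feedforward "model" weight, plant output

for _ in range(500):
    u_fb = Kp * (r - y)             # stabilizing proportional feedback
    u = w * r + u_fb                # total input = feedforward + feedback
    w += lr * u_fb                  # feedback signal acts as training error
    y = a * y + u                   # plant step

# as training advances, the feedback term shrinks toward zero and the
# feedforward term takes over tracking the set point
```

The loop is stable from the first step thanks to the feedback controller, which is the point of the architecture.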
Stabilizing Controller

(Figure: block diagram of the stabilizing controller scheme.)
Image Recognition:
Decision Rule and Classifier
Is it possible to formulate (and formalize!) a decision rule by which we can classify or recognize our objects based on the selected features?

Can you propose a rule by which we can definitely decide whether it is a tiger or a rabbit?
Image Recognition: Decision Rule and classifier
Once we know our decision rule, it is not difficult to develop a classifier that performs classification/recognition using the selected features and the decision rule.

However, if the decision rule cannot be formulated and formalized, we should use a classifier that can develop the rule through a learning process.

In most recognition/classification problems, formalizing the decision rule is very complicated or outright impossible.

A neural network is a tool that can accumulate knowledge through a learning process.

After the learning process, a neural network is able to approximate the function that is supposed to be our decision rule.
Why neural network?
f(x1, ..., xn) — an unknown multi-factor decision rule.

Mathematical model: "learning by examples". The network is trained on pairs of input patterns and desired responses:

(x1(1), ..., xn(1)) → y(1)
(x1(2), ..., xn(2)) → y(2)
...
Application of Artificial Neural Network in Fault
Detection Study of Batch Esterification Process
The complexity of most chemical processes tends to create problems for monitoring and supervision systems.

Prompt fault detection and diagnosis is the best way to handle this problem. Different methods tackle it from different angles; one popular method is the artificial neural network, a powerful tool for fault detection.

In this study, the production of ethyl acetate by the reaction of acetic acid and ethanol in a batch reactor is considered.

A neural network is trained on normal and faulty events using data collected from the experiment; the relationship between normal and faulty events is captured by training the network.

The ability of a neural network to detect process faults rests on its ability to learn from examples while requiring little knowledge about the system structure.
CONCLUSION: Fault diagnosis for a pilot-plant batch esterification process is investigated in this work with a feed-forward neural model implemented as a multilayer perceptron. The effects of catalyst concentration and catalyst volume are studied and classified successfully using the neural process model. The results show that the neural network is able to detect and isolate the two studied faults with good pattern classification.
Temperature control in fermenters: application of
neural nets and feedback control in breweries
The main objective of on-line quality control in fermentation is to perform the production processes as reproducibly as possible.
Since temperature is the main control parameter in the fermentation process of beer
breweries, it is of primary interest to keep it close to the predefined set point. Here, we
report on a model-supported temperature controller for large production-scale beer
fermenters.
The dynamic response of the temperature in the tank on temperature changes in the cooling
elements has been modeled by means of a difference equation.
The heat production within the tank is taken into account by means of a model for the substrate degradation.
Any optimization requires a model to predict the consequences of actions. Instead of using a
conventional mathematical model of the fermentation kinetics, an artificial neural network
approach has been used.
The set point profiles for the temperature control have been dynamically optimized in order
to minimize the production cost while meeting the constraints posed by the product quality
requirements.
Applications of Artificial Neural Networks
(Figure: application areas arranged around "Artificial Intellect with Neural Networks":)
• Advanced Robotics
• Intelligent Control
• Technical Diagnostics
• Machine Vision
• Intelligent Data Analysis and Signal Processing
• Image & Pattern Recognition
• Intelligent Expert Systems
• Intelligent Security Systems
• Intelligent Medicine Devices
Applications: Classification
Business
• Credit rating and risk assessment
• Insurance risk evaluation
• Fraud detection
• Insider dealing detection
• Marketing analysis
• Signature verification
• Inventory control

Security
• Face recognition
• Speaker verification
• Fingerprint analysis

Medicine
• General diagnosis
• Detection of heart defects

Engineering
• Machinery defect diagnosis
• Signal processing
• Character recognition
• Process supervision
• Process fault analysis
• Speech recognition
• Machine vision
• Radar signal classification

Science
• Recognising genes
• Botanical classification
• Bacteria identification
Applications: Modeling
Business
• Prediction of share and commodity prices
• Prediction of economic indicators

Engineering
• Transducer linearisation
• Colour discrimination
• Robot control and navigation
• Process control
• Aircraft landing control
• Car active suspension control
• Printed circuit auto-routing
• Integrated circuit layout
• Image compression

Science
• Prediction of the performance of drugs from the molecular structure
• Weather prediction
• Sunspot prediction

Medicine
• Medical imaging and image processing
Applications: Forecasting
•Future sales
•Production Requirements
•Market Performance
•Economic Indicators
•Energy Requirements
•Time Based Variables
Applications: Novelty Detection
•Fault Monitoring
•Performance Monitoring
•Fraud Detection
•Detecting Rare Features
•Different Cases
Thank you
For any suggestions:
http://pksingh.webstarts.com/student_community.html