
ARTIFICIAL NEURAL NETWORK

PRESENTED BY Saranya (1155-06)

INTRODUCTION

An artificial neural network (ANN) is an information processing paradigm inspired by biological nervous systems.

ANNs have the ability to learn from experience.

HISTORY

1943: McCulloch and Pitts proposed the McCulloch-Pitts neuron model.
1958: Rosenblatt introduced the simple single-layer networks now called perceptrons.
1982: Hopfield published a series of papers on Hopfield networks.
1982: Kohonen developed the Self-Organising Maps that now bear his name.
1986: The back-propagation learning algorithm for multi-layer perceptrons was rediscovered and the whole field took off again.
1990s: The sub-field of radial basis function networks was developed.
2000s: The power of ensembles of neural networks and support vector machines became apparent.

BIOLOGICAL NEURON

Basic element of the human brain
Many inputs and one output
Four main components:

Soma Axon Dendrite Synapses

PERCEPTRON

SIMPLE ARTIFICIAL NEURON
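The behaviour of a simple artificial neuron can be sketched in a few lines of Python. The weights, bias and sigmoid transfer function below are illustrative choices for this sketch, not values from the slides:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs through a sigmoid transfer."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid squashes the sum into (0, 1)

# Illustrative inputs and weights (chosen arbitrarily for the sketch)
out = neuron([1.0, 0.0], [0.6, -0.4], bias=-0.1)
```

With zero inputs and bias the sigmoid gives exactly 0.5, the midpoint of its output range.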

TRAINING
There are two approaches to training:
Supervised training: the network is provided with the desired output.
Unsupervised training: the network has to make sense of the inputs without outside help.

LEARNING LAWS

HEBB'S RULE
If a neuron receives an input from another neuron and both are highly active, the weight between the neurons should be strengthened.

HOPFIELD LAW
If the desired and actual outputs are both active or both inactive, increment the connection weight by the learning rate; otherwise decrement the weight by the learning rate.
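As a rough sketch, the two update rules above might look like this in Python. The function names, learning rate and the convention that "active" means a positive value are assumptions of this sketch:

```python
def hebb_update(w, pre, post, lr=0.1):
    """Hebb's rule: strengthen the weight when both neurons are active together."""
    return w + lr * pre * post

def hopfield_update(w, desired, actual, lr=0.1):
    """Hopfield law: increment the weight when desired and actual outputs agree
    (both active or both inactive), otherwise decrement it by the learning rate."""
    agree = (desired > 0) == (actual > 0)
    return w + lr if agree else w - lr
```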

LEARNING LAWS CONTINUED

THE DELTA RULE
The delta error is back-propagated into the previous layer one layer at a time.

THE GRADIENT DESCENT RULE

KOHONEN'S LEARNING LAW
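The delta rule listed above can be illustrated on a single linear unit; the learning rate and the example values below are arbitrary choices for this sketch:

```python
def delta_rule_step(weights, inputs, target, lr=0.5):
    """One delta-rule update on a single linear unit: w_i += lr * error * x_i."""
    output = sum(w * x for w, x in zip(weights, inputs))
    error = target - output                        # the delta error
    return [w + lr * error * x for w, x in zip(weights, inputs)]

# One step from zero weights toward the target; the error shrinks afterwards
w = delta_rule_step([0.0, 0.0], [1.0, 1.0], target=1.0)
```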

COMPONENTS

Weighting factors: adaptive coefficients that determine the intensity of the input signal.
Summation function: computes the weighted sum of the inputs.
Transfer function: transforms the weighted sum into a working output.
Scaling and limiting: scaling multiplies the transfer value by a scale factor and adds an offset; limiting maintains upper and lower bounds.
Output function: neurons compete with each other, inhibiting other processing elements.
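Putting the components above together, one processing element might be sketched as follows. The hyperbolic tangent is one possible transfer function; the scale, offset and bounds are illustrative defaults, not values from the slides:

```python
import math

def processing_element(inputs, weights, scale=1.0, offset=0.0,
                       lower=-1.0, upper=1.0):
    """One PE, following the component list: summation -> transfer -> scaling -> limiting."""
    s = sum(x * w for x, w in zip(inputs, weights))  # summation function (weighted sum)
    t = math.tanh(s)                                 # transfer function (hyperbolic tangent)
    y = scale * t + offset                           # scaling: scale factor plus offset
    return max(lower, min(upper, y))                 # limiting: clamp to the bounds
```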

SUMMATION FUNCTION

Sum, max, min, average, OR, etc.

TRANSFER FUNCTION

Hyperbolic tangent, sigmoid, linear, sine, etc.
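A few of the transfer functions named above, sketched in Python for concreteness (the dictionary layout is this sketch's choice):

```python
import math

# Common transfer functions: each maps the weighted sum to a working output
transfer_functions = {
    "tanh":    math.tanh,                             # hyperbolic tangent, output in (-1, 1)
    "sigmoid": lambda s: 1.0 / (1.0 + math.exp(-s)),  # output in (0, 1)
    "linear":  lambda s: s,                           # identity, unbounded output
    "sine":    math.sin,                              # periodic, output in [-1, 1]
}
```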

NEURAL NETWORKS VS. TRADITIONAL COMPUTING

NEURAL NETWORKS VS. EXPERT SYSTEMS

TYPES OF NEURAL NETWORKS

FEED-FORWARD, BACK-PROPAGATION NETWORK

FEED-FORWARD, BACK-PROPAGATION NETWORK CONTINUED

It consists of an input layer, an output layer and at least one hidden layer.
Three general rules are used to create a feed-forward, back-propagation network.
The delta rule is used for training this network.
Training these networks needs lots of supervised learning with lots of input.
Applications include speech synthesis from text, robot arms, evaluation of bank loans, image processing, knowledge representation, forecasting and prediction, and multi-target tracking.
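A minimal forward pass through such a network might look like this. The layer sizes and weight values are arbitrary for the sketch, and training by back-propagation is not shown:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def forward(x, hidden_w, output_w):
    """Forward pass: input layer -> one hidden layer -> output layer."""
    hidden = [sigmoid(sum(w * v for w, v in zip(row, x))) for row in hidden_w]
    return [sigmoid(sum(w * h for w, h in zip(row, hidden))) for row in output_w]

# Illustrative weights: 2 inputs, 2 hidden units, 1 output (values chosen arbitrarily)
y = forward([1.0, 0.0], [[0.5, -0.5], [-0.5, 0.5]], [[1.0, -1.0]])
```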

RECURRENT NEURAL NETWORK


In a recurrent neural network, the data flow is bi-directional.

HOPFIELD NETWORK

The network can be conceptualized in terms of its energy and the physics of dynamic systems.
It uses three layers: an input buffer, a Hopfield layer and an output layer, each with the same number of processing elements.
The learning rule used to train this network is the Hopfield law.
The storage capacity is limited, and the network becomes unstable when the stored patterns are too similar.
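A toy version of the network can be sketched from the description above, using Hebbian outer-product weights over +/-1 patterns. The synchronous update and fixed step count are simplifications of this sketch:

```python
def hopfield_train(patterns):
    """Hebbian outer-product weights over +/-1 patterns; no self-connections."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def hopfield_recall(w, state, steps=5):
    """Repeated threshold updates; the state settles into a stored energy minimum."""
    for _ in range(steps):
        state = [1 if sum(wij * s for wij, s in zip(row, state)) >= 0 else -1
                 for row in w]
    return state

stored = [1, -1, 1, -1]
weights = hopfield_train([stored])
recalled = hopfield_recall(weights, [1, 1, 1, -1])  # start from a one-bit corruption
```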

HAMMING NETWORK

An extension of the Hopfield network.
Implements a classifier based on least error for binary input vectors, where error is defined by the Hamming distance.
It has three layers: an input, a category and an output layer.
Learning is similar to the Hopfield methodology.
Advantages include the fact that fewer processing elements are required, and it is both faster and more accurate than the Hopfield network.
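The least-Hamming-distance classification at the heart of the network can be sketched as follows; the exemplar patterns are made up for illustration:

```python
def hamming_distance(a, b):
    """Number of bit positions where two binary vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def classify(pattern, exemplars):
    """Pick the stored exemplar with the least Hamming distance (least error)."""
    return min(range(len(exemplars)),
               key=lambda i: hamming_distance(pattern, exemplars[i]))

exemplars = [[1, 1, 0, 0], [0, 0, 1, 1]]  # illustrative stored categories
```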

SELF-ORGANIZING MAP

The processing elements represent a 2-dimensional map of the input data.
It learns without supervision.

It typically has two layers: the input layer is connected to a 2-D Kohonen layer.
The output trains using the delta rule.
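A single, simplified Kohonen update might look like the sketch below. Only the winner-take-all step is shown; the 2-D neighborhood cooperation of a full SOM is omitted, and the learning rate is an arbitrary choice:

```python
def som_step(nodes, x, lr=0.5):
    """One simplified Kohonen update: find the best-matching node (winner)
    and pull it a fraction lr of the way toward the input vector."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    winner = min(range(len(nodes)), key=lambda i: dist2(nodes[i], x))
    nodes[winner] = [n + lr * (xi - n) for n, xi in zip(nodes[winner], x)]
    return winner

nodes = [[0.0, 0.0], [1.0, 1.0]]      # two illustrative map nodes
win = som_step(nodes, [0.9, 0.9])     # input closest to the second node
```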

APPLICATION

Language processing
Character recognition
Image (data) compression
Pattern recognition
Signal processing
Financial applications

CONCLUSION

ANNs have the ability to perform tasks outside the scope of traditional processors.
ANNs learn; they are not programmed.
ANNs, along with fuzzy logic and expert systems, will be able to read handwriting, hear speech and formulate actions, and thus become the leading edge of intelligent machines.


QUERIES
