
Experiment No 1

Aim: To study the introduction to the Neural Network Toolbox.

Tool Used: MATLAB 7

Theory: The term neural network was traditionally used to refer to a network or circuit of biological neurons. The modern usage of the term often refers to artificial neural networks (ANNs), which are composed of artificial neurons or nodes. Thus the term has two distinct usages:

1. Biological neural networks are made up of real biological neurons that are connected or functionally related in a nervous system. In the field of neuroscience, they are often identified as groups of neurons that perform a specific physiological function in laboratory analysis.

2. ANNs are composed of interconnecting artificial neurons (programming constructs that mimic the properties of biological neurons). Artificial neural networks may be used either to gain an understanding of biological neural networks, or to solve artificial intelligence problems without necessarily creating a model of a real biological system.

The real, biological nervous system is highly complex: artificial neural network algorithms attempt to abstract this complexity and focus on what may hypothetically matter most from an information-processing point of view. Good performance (e.g. as measured by good predictive ability and low generalization error), or performance mimicking animal or human error patterns, can then be used as one source of evidence that the abstraction really captured something important about information processing in the brain. Another incentive for these abstractions is to reduce the amount of computation required to simulate artificial neural networks, so as to allow one to experiment with larger networks and train them on larger data sets.

Neural Network Applications: Neural networks have been applied in many fields. A list of some applications mentioned in the literature follows.

Aerospace - High-performance aircraft autopilot, flight path simulation, aircraft control systems, autopilot enhancements, aircraft component simulation, aircraft component fault detection
Automotive - Automobile automatic guidance systems, warranty activity analysis
Banking - Check and other document reading, credit application evaluation
Defense - Weapon steering, target tracking, object discrimination, facial recognition, new kinds of sensors, sonar, radar and image signal processing including data compression, feature extraction and noise suppression, signal/image identification
Electronics - Code sequence prediction, integrated circuit chip layout, process control, chip failure analysis, machine vision, voice synthesis, nonlinear modeling
Financial - Real estate appraisal, loan advising, mortgage screening, corporate bond rating, credit-line use analysis, portfolio trading programs, corporate financial analysis, currency price prediction
Industrial - Neural networks are being trained to predict the output gases of furnaces and other industrial processes. They then replace the complex and costly equipment used for this purpose in the past.
Medical - Breast cancer cell analysis, EEG and ECG analysis, prosthesis design, optimization of transplant times, hospital expense reduction, hospital quality improvement, emergency-room test advisement
Robotics - Trajectory control, forklift robots, manipulator controllers, vision systems
Telecommunications - Image and data compression, automated information services, real-time translation of spoken language, customer payment processing systems
Speech - Speech recognition, speech compression, vowel classification, text-to-speech synthesis
Securities - Market analysis, automatic bond rating, stock trading advisory systems

Transfer Functions: Three of the most commonly used transfer functions are described below.

1. Hard-Limit Transfer Function (hardlim)

The hard-limit transfer function limits the output of the neuron to either 0, if the net input argument n is less than 0, or 1, if n is greater than or equal to 0.

2. Linear Transfer Function (purelin)

The linear transfer function produces an output equal to its input. Neurons of this type are used as linear approximators in Linear Filters.

3. Log-Sigmoid Transfer Function (logsig)

The log-sigmoid transfer function takes an input, which may have any value between plus and minus infinity, and squashes the output into the range 0 to 1. This transfer function is commonly used in backpropagation networks, in part because it is differentiable.
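The behavior of these three transfer functions can be compared directly at the MATLAB command line. The following is a minimal sketch (the input range -5:0.1:5 is an arbitrary choice):

% Compare the three transfer functions over a range of net inputs
n = -5:0.1:5;        % net input values
a1 = hardlim(n);     % hard limit: 0 for n < 0, 1 for n >= 0
a2 = purelin(n);     % linear: output equals input
a3 = logsig(n);      % log-sigmoid: squashes output into (0,1)
plot(n, a1, n, a2, n, a3)
legend('hardlim', 'purelin', 'logsig')
xlabel('net input n'), ylabel('output a')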

Architecture: The basic architecture consists of three types of neuron layers: input, hidden, and output. In feedforward networks, the signal flow is from input to output units, strictly in a feed-forward direction. The data processing can extend over multiple layers of units, but no feedback connections are present. Recurrent networks contain feedback connections. Contrary to feedforward networks, the dynamical properties of such a network are important. In some cases, the activation values of the units undergo a relaxation process such that the network evolves to a stable state in which these activations no longer change. In other applications, the changes of the activation values of the output neurons are significant, such that the dynamical behavior constitutes the output of the network. Other neural network architectures include ART maps and competitive networks.

Conclusion: We have studied the introduction to the Neural Network Toolbox in MATLAB.

Experiment No 2

Aim: Write a program to demonstrate neural network transfer functions through MATLAB.

Tool Used: MATLAB 7

Theory:

1. hardlim - Hard-limit transfer function

Graph-

Syntax-
A = hardlim(N,FP)
dA_dN = hardlim('dn',N,A,FP)
info = hardlim('code')

Description-
hardlim is a neural transfer function. Transfer functions calculate a layer's output from its net input.

2. purelin - Linear transfer function

Graph-

Syntax-
A = purelin(N,FP)
dA_dN = purelin('dn',N,A,FP)
info = purelin('code')

Description-
purelin is a neural transfer function. Transfer functions calculate a layer's output from its net input.
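Both functions can be verified quickly at the command line before building a network. A minimal sketch (the test vector is an arbitrary choice):

n = [-2 -0.5 0 0.5 2];   % arbitrary net input values
hardlim(n)               % returns [0 0 1 1 1]
purelin(n)               % returns the input unchanged: [-2 -0.5 0 0.5 2]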

Program-
p = [8 7 6 5 4 3 2 1 0];
t = [0 0.80 0.95 0.15 -0.56 -0.96 -0.30 0.66 0.89];
plot(p,t,'o')
net = newff([0 8],[10 1],{'hardlims' 'purelin'},'trainlm');
y1 = sim(net,p)
plot(p,t,'o',p,y1,'x')
net.trainParam.epochs = 50;
net.trainParam.goal = 0.01;
net = train(net,p,t);
y2 = sim(net,p)
plot(p,t,'o',p,y1,'x',p,y2,'*')
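Once training completes, the trained network can also be evaluated at an input that was not in the training set. The following one-line check is a sketch (the input value 4.5 is an arbitrary choice):

a = sim(net, 4.5)   % network response at a new input point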

Output 1: Plot of the training data (o), the untrained network response (x), and the trained network response (*); the vertical axis spans -1 to 1.

Output 2: Training performance plot - "Best Training Performance is 0.42474 at epoch 2" (Mean Squared Error (mse) vs. 2 Epochs, with Train, Best, and Goal curves).

Output 3: Training state plot - Gradient = 5.2755e-011 at epoch 2; Mu = 1e-005 at epoch 2; Validation Checks = 0 at epoch 2 (2 Epochs).
Output 4: Regression plot - Training: R = -3.8519e-034; Output ~= 3.9e-018*Target + 0.18 (Data, Fit, and Y = T shown against Target).

Conclusion- The required output is obtained by using the hardlim and purelin functions.

Experiment No 3

Aim: Write a program to demonstrate the neural network functions tansig and newff through MATLAB.

Tool Used: MATLAB 7

Theory: Explanation and syntax-

1. CREATING A PERCEPTRON (newp)
A perceptron can be created with the newp function:
net = newp(P,T)
where the input arguments are as follows: P is an R-by-Q matrix of Q input vectors of R elements each; T is an S-by-Q matrix of Q target vectors of S elements each. Commonly, the hardlim function is used in perceptrons.

2. TANSIG - Hyperbolic tangent sigmoid transfer function
SYNTAX-
A = tansig(N,FP)
dA_dN = tansig('dn',N,A,FP)
DESCRIPTION- tansig is a neural transfer function. Transfer functions calculate a layer's output from its net input.
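As an illustration of the newp syntax above, the following minimal sketch creates and trains a two-input perceptron on the logical AND problem (the data and problem choice are assumptions for illustration; note that older toolbox versions expect newp(PR,S) with a matrix of input ranges instead of newp(P,T)):

% Perceptron learning the logical AND of two inputs
P = [0 1 0 1; 0 0 1 1];   % four 2-element input vectors
T = [0 0 0 1];            % target class for each input
net = newp(P,T);          % perceptron with hardlim neurons
net = train(net,P,T);     % train with the perceptron learning rule
a = sim(net,P)            % should reproduce T after training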

plot(p,t,p,y1) net.trainParam.epochs = 50; net.trainParam.goal = 0.01; net = train(net,p,t); y2 = sim(net,p) plot(p,t,,p,y1,p,y2) OUTPUT-

Graph-1: Plot of the training data and the network responses before and after training; the vertical axis spans -1.5 to 3.

G-2: Training performance plot - "Best Training Performance is 9.3375e-006 at epoch 2" (Mean Squared Error (mse) vs. 2 Epochs, with Train, Best, and Goal curves).

G-3: Training state plot - Gradient = 0.0070003 at epoch 2; Mu = 1e-005 at epoch 2; Validation Checks = 0 at epoch 2 (2 Epochs).

CONCLUSION- The required output is obtained by using the newff and tansig functions.

Experiment No 4
Aim: Write a program to demonstrate a neural network function through MATLAB.

Tool Used: MATLAB 7

Theory: CLASSIFICATION WITH A 2-INPUT PERCEPTRON

SIMUP - Simulates a perceptron layer.
TRAINP - Trains a perceptron layer with the perceptron rule.
Using the above functions, a 2-input hard-limit neuron is trained to classify 4 input vectors into two categories.

DEFINING A CLASSIFICATION PROBLEM

A matrix P defines four 2-element input vectors:
P = [-0.5 -0.5 +0.3 +0.0; -0.5 +0.5 -0.5 +1.0];
A row vector T defines each vector's target category:
T = [1 1 0 0];

PLOTTING THE VECTORS TO CLASSIFY

We can plot these vectors with PLOTPV:
plotpv(P,T);
The perceptron must properly classify the 4 input vectors in P into the two categories defined by T.

DEFINE THE PERCEPTRON
Perceptrons have HARDLIM neurons. These neurons are capable of separating an input space with a straight line into two categories (0 and 1). INITP generates initial weights and biases for our neuron:
[W,b] = initp(P,T)

INITIAL PERCEPTRON CLASSIFICATION
The input vectors can be replotted...
plotpv(P,T)
...with the neuron's initial attempt at classification.

INITP - Initializes a perceptron layer.
[W,B] = INITP(P,T)
P - RxQ matrix of input vectors.

T - SxQ matrix of target outputs.
Returns weights and biases.

PROGRAM-

P = [+0.1 +0.2 +0.3 +0.4; +0.5 +0.6 +0.3 +0.5];
T = [0.6 0.8 0.6 0.9];
plot(P,T);
[W,b] = initp(P,T)
figure; plot(P,T);
figure; plotpc(W,b);
[W,b,epochs,errors] = trainp(W,b,P,T,1);
figure; ploterr(errors)
display(' see below format to get output from trained network ')
display('p = [-0.5; 0.5];')
display('a = simup(p,W,b)')
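INITP, TRAINP, SIMUP, and PLOTERR belong to very early versions of the Neural Network Toolbox and were removed later. On a newer toolbox, roughly the same experiment can be expressed with newp/train/sim; the following is a sketch under that assumption, reusing the binary classification data defined in the theory above:

% Equivalent perceptron classification with the newer toolbox functions
P = [-0.5 -0.5 +0.3 +0.0; -0.5 +0.5 -0.5 +1.0];  % four 2-element inputs
T = [1 1 0 0];                                   % binary target categories
plotpv(P,T)                   % plot the vectors to classify
net = newp(P,T);              % perceptron with a hardlim neuron
net = train(net,P,T);         % train with the perceptron rule
plotpc(net.IW{1},net.b{1})    % draw the resulting decision boundary
a = sim(net,[-0.5; 0.5])      % classify a new input vector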

OUTPUT: Plot of the training data (horizontal axis 0.1 to 0.6, vertical axis 0.55 to 0.95).

Experiment No 5
Aim: Write a program to demonstrate a neural network function through MATLAB.

Tool Used: MATLAB 7

THEORY-
logsig - Log-sigmoid transfer function

Graph and Symbol

Syntax-
A = logsig(N,FP)
dA_dN = logsig('dn',N,A,FP)
info = logsig('code')

Description-
logsig is a neural transfer function. Transfer functions calculate a layer's output from its net input.
A = logsig(N,FP) takes N and optional function parameters:
N - S-by-Q matrix of net input (column) vectors
FP - Struct of function parameters (ignored)

and returns A, the S-by-Q matrix of N's elements squashed into [0, 1].
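A quick plot makes this squashing behavior visible. A minimal sketch (the input range is an arbitrary choice):

n = -10:0.1:10;   % net inputs from -10 to 10
a = logsig(n);    % outputs squashed into (0,1)
plot(n,a), grid on
xlabel('net input n'), ylabel('logsig(n)')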

Program-
p = [0 1 2 3 4 5 6 7 8];
t = [0 0.84 0.91 0.14 -0.77 -0.96 -0.28 0.66 0.99];
plot(p,t)
net = newff([0 8],[10 1],{'logsig' 'purelin'},'trainlm');
y1 = sim(net,p)
plot(p,t,p,y1)
net.trainParam.epochs = 50;
net.trainParam.goal = 0.01;
net = train(net,p,t);
y2 = sim(net,p)
plot(p,t,p,y1,p,y2)

output-

Graph-1: Plot of the training data and the network responses before and after training; the vertical axis spans -1 to 1.5.

G-2: Training performance plot - "Best Training Performance is 0.0017424 at epoch 1" (Mean Squared Error (mse) vs. 1 Epoch, with Train, Best, and Goal curves).

G-3: Training state plot - Gradient = 0.063164 at epoch 1; Mu = 0.0001 at epoch 1; Validation Checks = 0 at epoch 1 (1 Epoch).

CONCLUSION- The logsig and purelin functions were studied and the desired output was obtained.
