
Pattern Recognition

[LAB REPORT 02]

Sabit parvez
ID : 124440

Implementing the Perceptron algorithm for finding the weights of a Linear Discriminant function

Perceptron algorithm:
Task :
Given the following sample points in a 2-class problem: ω1 = {(1,1), (1,-1), (4,5)} and ω2 = {(2,2.5), (0,2), (2,3)}
1. Plot all sample points from both classes, using the same color and marker for samples of the same class. Observe whether
these two classes can be separated by a linear boundary.
2. Consider the case of a second-order polynomial discriminant function. Re-pose the problem of finding such a nonlinear
discriminant function as the problem of finding a linear discriminant function for a set of sample points of higher dimension.
Generate the high-dimensional sample points. [Hint: φ-machine.]

3. Use the Perceptron algorithm (one sample at a time) to find the weight coefficients of the discriminant-function boundary for the
linear classifier in question 2.

4. Draw the decision boundary between the two classes.

TASK 1.1 (PLOTTING THE TWO-CLASS SET OF PROTOTYPES) :


For the two given classes we need to plot all the sample points. We used green to represent class 1 and blue to represent
class 2. The figure below shows the corresponding result:


Fig: sample points from both classes plotted on the same graph

CODE:
% Class prototypes (one row per 2-D sample)
cls1 = [1 1; 1 -1; 4 5];
cls2 = [2 2.5; 0 2; 2 3];
% Same marker for every sample; the face color distinguishes the classes
plot(cls1(:,1), cls1(:,2), 's', 'LineWidth', 2, 'MarkerEdgeColor', 'k', 'MarkerFaceColor', 'g');
hold on;
plot(cls2(:,1), cls2(:,2), 's', 'LineWidth', 2, 'MarkerEdgeColor', 'm', 'MarkerFaceColor', 'b');
legend('class 1', 'class 2');
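From the plot one can observe that the two classes can in fact be separated by a linear boundary: for example, the line 4*x1 - 3*x2 = 0.75 leaves all three class-1 points on its positive side and all three class-2 points on its negative side.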

TASK 1.2 (GENERATING THE HIGH-DIMENSIONAL SAMPLE POINTS) :


To generate the high-dimensional sample points we map each 2-D sample x = (x1, x2) through the φ-machine:

y = [ x1^2  x2^2  x1*x2  x1  x2  1 ]
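For example, the class-1 sample x = (1, -1) maps to y = (1, 1, -1, 1, -1, 1). Samples from class 2 are additionally negated (the usual perceptron normalization), so x = (0, 2) becomes y = (0, -4, 0, 0, -2, -1); after this normalization a correctly classified sample of either class satisfies y*w' > 0.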


CODE:
% Map each 2-D sample through the phi-machine: y = [x1^2 x2^2 x1*x2 x1 x2 1]
phi = @(x) [x(:,1).^2, x(:,2).^2, x(:,1).*x(:,2), x(:,1), x(:,2), ones(size(x,1),1)];
% Negate the class-2 rows (perceptron normalization) so that every
% correctly classified sample satisfies y*w' > 0
y = [phi(cls1); -phi(cls2)];
TASK 1.3 (USING PERCEPTRON ALGORITHM (ONE AT A TIME) FOR FINDING THE WEIGHT-COEFFICIENTS OF
THE DISCRIMINANT FUNCTION) :
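The one-at-a-time (fixed-increment) rule cycles through the normalized samples and, whenever a sample is misclassified (y(i,:)*w' <= 0), adds alpha*y(i,:) to the weight vector; training stops after a complete pass with no updates, i.e., when all six samples are classified correctly.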

CODE:
w = ones(1,6);            % initial weight vector
alpha = 1/6;              % fixed learning-rate (increment) factor
counter = 0;
while counter < 200       % safety cap on the number of passes
    counter = counter + 1;
    flag = 0;             % samples classified correctly in this pass
    for i = 1:6
        g = y(i,:) * w';              % discriminant value for sample i
        if g <= 0                     % misclassified (or on the boundary)
            w = w + alpha * y(i,:);   % one-at-a-time weight update
        else
            flag = flag + 1;
        end
    end
    if flag == 6          % a full pass with no updates: converged
        break;
    end
end
fprintf('W: %f %f %f %f %f %f\n', w);
fprintf('One-at-a-time iterations:\t%d\n', counter);
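As a quick sanity check (a sketch, not part of the original task), the learned weights can be verified by confirming that every normalized sample now yields a positive discriminant value:

% Sanity check: after convergence every normalized sample satisfies y*w' > 0
margins = y * w';
fprintf('Margins: %s\n', sprintf('%.3f ', margins));
if all(margins > 0)
    disp('All six samples are classified correctly.');
end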


TASK 1.4 (DRAWING THE DECISION BOUNDARY BETWEEN THE TWO CLASSES) :

CODE:

syms x1 x2;
% Decision boundary: g(x1,x2) = 0 with the learned weights
g = w(1)*x1^2 + w(2)*x2^2 + w(3)*x1*x2 + w(4)*x1 + w(5)*x2 + w(6);
s2 = solve(g, x2);                  % quadratic in x2: up to two branches
xvals1 = -10:0.01:10;
xvals2 = double(subs(s2(1), x1, xvals1));   % evaluate one branch numerically
hold on; grid on;
plot(xvals1, xvals2, 'k');
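An alternative sketch for this step (assuming MATLAB R2016b or newer, which provides fimplicit) draws the implicit curve g(x1, x2) = 0 directly, so both branches of the second-order boundary appear without solving for x2:

% Plot the full implicit decision boundary g(x1,x2) = 0, both branches
fimplicit(@(a,b) w(1)*a.^2 + w(2)*b.^2 + w(3)*a.*b + ...
                 w(4)*a + w(5)*b + w(6), [-10 10 -10 10], 'k');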

Discussion :
We mapped the data into a higher-dimensional space so that a second-order polynomial discriminant in 2-D becomes a linear
discriminant in 6-D, and then used the Perceptron algorithm to find the weight coefficients of the discriminant-function
boundary for this linear classifier. The perceptron algorithm is a gradient-descent procedure, but it does not get stuck in
local minima: if the mapped samples are linearly separable it converges to a separating weight vector, and otherwise it never
converges.

