
STUDENT ATTENDANCE SYSTEM USING

FACE RECOGNITION
A PROJECT REPORT
Submitted to

Jawaharlal Nehru Technological University Kakinada, Kakinada


in partial fulfilment for the award of the degree of
Bachelor of Technology
in

COMPUTER SCIENCE AND ENGINEERING


Submitted by

N. Gopi (15KN1A05B7) M. Tarun (15KN1A05A1)


N. Sowmya (15KN1A0576) M. Sundar Kishore (15KN1A0594)
M. Jayanth (15KN1A0588)

Under the esteemed guidance of

Dr. D RATNA KISHORE


PROFESSOR, CSE Department

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING


NRI INSTITUTE OF TECHNOLOGY
(Approved by AICTE, Permanently Affiliated to JNTUK, Kakinada)
(Accredited by NAAC with ‘A’ Grade, ISO 9001 : 2015 Certified)
Pothavarappadu (V), Agiripalli (M), Krishna Dist, PIN: 521212, A.P, India.
2015-2019

Certificate
This is to certify that the Project entitled “STUDENT ATTENDANCE
SYSTEM USING FACE RECOGNITION” is a bonafide work carried out by
N. Gopi (15KN1A05B7), M. Tarun (15KN1A05A1), N. Sowmya (15KN1A0576),
M. Sundar Kishore (15KN1A0594) and M. Jayanth (15KN1A0588) in partial fulfilment for the
award of the degree of Bachelor of Technology in Computer Science &
Engineering of Jawaharlal Nehru Technological University Kakinada,
Kakinada during the year 2018-2019.

(Dr. D. Ratna Kishore) (Dr. K.V. Sambasiva Rao)

Project Guide Head of the Department

EXTERNAL EXAMINER
ACKNOWLEDGEMENT

I take this opportunity to thank all who have rendered their full support to my work.
The pleasure, the achievement, the glory, the satisfaction, the reward and the appreciation
that came with the completion of this project cannot be expressed in a few words.

I am grateful to my Project Guide Dr. D. Ratna Kishore, Professor, for rendering
valuable suggestions and extending his support to complete the project successfully.

I am expressing my heartfelt thanks to Dr. K.V. Sambasiva Rao garu, Head of the
Department, for his continuous guidance for completion of my Project work.

I am thankful to Dr. C. Naga Bhaskar garu, Principal, for his encouragement to
complete the Project work.

I am extending my sincere and honest thanks to the Chairman, Dr. R. Venkata Rao garu,
and the Secretary, Sri K. Sridhar garu, for their continuous support in completing the
Project work.

Finally, I thank the Administrative Officer, Staff Members and Faculty of the Department of
CSE, NRI Institute of Technology, and my friends, who directly or indirectly helped in the
completion of this project.

15KN1A05B7(N. GOPI)
15KN1A05A1(M. TARUN)
15KN1A0576(N. SOWMYA)
15KN1A0594(M. SUNDAR KISHORE)
15KN1A0588(M. JAYANTH)
INDEX

I. List of Figures
II. List of Abbreviations
1. Introduction
1.1. Introduction to project
1.2. Problem Definition
1.3. Objectives
1.4. Scope of the project
1.5. Process Diagram
2. Literature Review
2.1. Digital Image Processing
2.2. Applications
3. Methodology
4. System Analysis
4.1. Existing System
4.2. Proposed System
4.3. Modules
5. Introduction to MATLAB
5.1. What is MATLAB
6. Conclusion
7. System Requirement Specification
7.1. Functional Requirements
7.2. Non-Functional Requirements
7.3. System Requirements
7.3.1. Software Requirements
7.3.2. Hardware Requirements
8. System Design
8.1. UML Modeling
8.1.1. Importance of UML in Modeling
8.2. Class Diagram
8.3. Use-Case Diagram
8.4. Sequence Diagram
8.5. Component Diagram
9. Coding
9.1. Sample Code
10. Testing
11. Screenshots
12. Future Enhancement
13. Bibliography
Abstract:

Human face detection plays an important role in applications such as human-computer interaction, face recognition, video surveillance and face image database management. In such applications, faces most frequently form only a small part of the image. Consequently, preliminary segmentation of images into regions that contain "non-face" objects and regions that may contain "face" candidates can greatly accelerate the process of human face detection. Most existing face detection approaches rest on assumptions that make them applicable only under specific conditions. Existing techniques for face detection in colour images are plagued by poor performance in the presence of scale variation, variation in illumination, variation in skin colours, complex backgrounds, etc. In this work we propose an algorithm for face detection in colour images under varying lighting conditions and for varied skin colours, as well as with complex backgrounds. Based on a novel skin-component extraction procedure and detection of valid face candidates, our method detects skin regions over the entire image and generates face candidates based on the signatures of the detected skin patches. The algorithm then constructs the boundary for each face candidate.


List of Figures
Names Page No

1. Process Diagram
2. Class Diagram
3. Use Case Diagram
4. Activity Diagram
5. Sequence Diagram
6. Screenshots
List of Abbreviations
Format name Description

TIFF -- Tagged Image File Format

JPEG -- Joint Photographic Experts Group

GIF -- Graphics Interchange Format

BMP -- Windows Bitmap

PNG -- Portable Network Graphics

XWD -- X Window Dump


CHAPTER – 1

INTRODUCTION

1.1 Introduction to project

Face recognition is an important application of Image processing owing to its use in

many fields. Identification of individuals in an organization for the purpose of attendance is

one such application of face recognition. Maintenance and monitoring of attendance records

plays a vital role in the analysis of performance of any organization. The purpose of developing

attendance management system is to computerize the traditional way of taking attendance.

Automated Attendance Management System performs the daily activities of attendance

marking and analysis with reduced human intervention. The prevalent techniques and

methodologies for detecting and recognizing faces fail to overcome issues such as scaling, pose,

illumination variations, rotation, and occlusions. The proposed system aims to overcome the

pitfalls of the existing systems and provides features such as detection of faces, extraction of

the features, detection of extracted features, and analysis of students' attendance. The system

integrates techniques such as image contrasts, integral images, color features and cascading

classifier for feature detection. The system provides an increased accuracy due to use of a large

number of features (Shape, Colour, LBP, wavelet, Auto-Correlation) of the face. Faces are

recognized using Euclidean distance and k-nearest neighbor algorithms. Better accuracy is

attained in results as the system takes into account the changes that occur in the face over the

period of time and employs suitable learning algorithms. The system is tested for various use

cases. We consider a specific area such as classroom attendance for the purpose of testing the

accuracy of the system. The metric considered is the percentage of the recognized faces per

total number of tested faces of the same person. The system is tested under varying lighting

conditions, various facial expressions, presence of partial faces (in densely populated

classrooms) and presence or absence of beard and spectacles. An increased accuracy (nearly

100%) is obtained in most of the cases considered.
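The recognition step described above, Euclidean distance with a k-nearest-neighbour vote over extracted feature vectors, can be sketched as follows. This is an illustrative Python sketch rather than the project's MATLAB implementation; the feature vectors and student labels are invented for the example.

```python
import math
from collections import Counter

def euclidean(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_recognize(query, training_db, k=3):
    # training_db: list of (feature_vector, student_label) pairs.
    # Sort enrolled faces by distance to the query face ...
    neighbours = sorted(training_db, key=lambda item: euclidean(query, item[0]))[:k]
    # ... and take a majority vote over the k nearest labels.
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical 3-dimensional feature vectors for two enrolled students.
db = [([0.10, 0.20, 0.10], "gopi"), ([0.12, 0.18, 0.09], "gopi"),
      ([0.90, 0.80, 0.70], "tarun"), ([0.85, 0.82, 0.75], "tarun")]
print(knn_recognize([0.11, 0.19, 0.10], db))  # query near the first cluster
```

In the real system the feature vectors would come from the Shape, Colour, LBP, wavelet and auto-correlation extractors mentioned above, but the matching logic is the same.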


1.2 Problem Definition

The traditional manual methods of monitoring student attendance in lectures are tedious as the

signed attendance sheets have to be manually logged in to a computer system for analysis. This

is tedious, time consuming and prone to inaccuracies as some students in the department often

sign for their absent colleagues, rendering this method ineffective in tracking the students’ class

attendance. Use of the face detection and recognition system in lieu of the traditional methods

will provide a fast and effective method of capturing student attendance accurately while

offering secure, stable and robust storage of the system records, which, upon authorization,

can be accessed for administrative purposes, by parents or even by the students themselves.

1.3 OBJECTIVES

The objectives of the project are given below:

1. Detection of unique face image amidst the other natural components such as walls,

backgrounds etc.

2. Extraction of unique characteristic features of a face useful for face recognition.

3. Detection of faces amongst other face characters such as beard, spectacles etc.

4. Effective recognition of unique faces in a crowd (individual recognition in crowd).

5. Automated update in the database without human intervention.

1.4 Scope of the project

This module is a desktop application that does face recognition of the captured images (faces)

in the file, marks the students register and then stores the results in a database for future

analysis.
1.5 Process Diagram
CHAPTER – 2
LITERATURE REVIEW
2 DIGITAL IMAGE PROCESSING

2.1 BACKGROUND:

Digital image processing is an area characterized by the need for extensive experimental

work to establish the viability of proposed solutions to a given problem. An important

characteristic underlying the design of image processing systems is the significant level of

testing & experimentation that normally is required before arriving at an acceptable solution.

This characteristic implies that the ability to formulate approaches &quickly prototype

candidate solutions generally plays a major role in reducing the cost & time required to arrive

at a viable system implementation.

2.2 What is DIP?

An image may be defined as a two-dimensional function f(x, y), where x & y are spatial

coordinates, & the amplitude of f at any pair of coordinates (x, y) is called the intensity or

gray level of the image at that point. When x, y & the amplitude values of f are all finite

discrete quantities, we call the image a digital image. The field of DIP refers to processing

digital image

by means of digital computer. Digital image is composed of a finite number of elements, each

of which has a particular location & value. The elements are called pixels.

Vision is the most advanced of our sensor, so it is not surprising that image play the

single most important role in human perception. However, unlike humans, who are limited to

the visual band of the EM spectrum imaging machines cover almost the entire EM spectrum,

ranging from gamma to radio waves. They can operate also on images generated by sources

that humans are not accustomed to associating with image.


There is no general agreement among authors regarding where image processing stops

& other related areas such as image analysis& computer vision start. Sometimes a distinction

is made by defining image processing as a discipline in which both the input & output at a

process are images. This is limiting & somewhat artificial boundary. The area of image analysis

(image understanding) is in between image processing & computer vision.

There are no clear-cut boundaries in the continuum from image processing at one end to

complete vision at the other. However, one useful paradigm is to consider three types of

computerized processes in this continuum: low-, mid-, & high-level processes. Low-level

process involves primitive operations such as image processing to reduce noise, contrast

enhancement & image sharpening. A low- level process is characterized by the fact that both

its inputs & outputs are images. Mid-level process on images involves tasks such as

segmentation, description of that object to reduce them to a form suitable for computer

processing & classification of individual objects. A mid-level process is characterized by the

fact that its inputs generally are images but its outputs are attributes extracted from those

images. Finally higher- level processing involves “Making sense” of an ensemble of

recognized objects, as in image analysis & at the far end of the continuum performing the

cognitive functions normally associated with human vision.

Digital image processing, as already defined is used successfully in a broad range of areas

of exceptional social & economic value.

2.3 What is an image?

An image is represented as a two dimensional function f(x, y) where x and y are spatial

co-ordinates and the amplitude of ‘f’ at any pair of coordinates (x, y) is called the intensity of

the image at that point.


2.4 Gray scale image:

A grayscale image is a function I(x, y) of the two spatial coordinates of the image plane.

I(x, y) is the intensity of the image at the point (x, y) on the image plane.

I(x, y) takes non-negative values; we assume the image is bounded by a rectangle [0, a] × [0, b], so that

I: [0, a] × [0, b] → [0, ∞).

2.5 Colour image:

It can be represented by three functions: R(x, y) for red, G(x, y) for green and B(x, y) for blue.

An image may be continuous with respect to the x and y coordinates and also

in amplitude. Converting such an image to digital form requires that the coordinates as well as

the amplitude be digitized. Digitizing the coordinate values is called sampling. Digitizing

the amplitude values is called quantization.
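The two digitization steps just described can be illustrated with a small Python sketch: sampling evaluates a continuous function at discrete coordinates, and quantization rounds each sampled amplitude to one of a fixed number of levels. The intensity profile and the level count here are invented for illustration.

```python
import math

def sample(f, n_samples, length):
    # Sampling: evaluate the continuous function f at n discrete coordinates.
    return [f(i * length / n_samples) for i in range(n_samples)]

def quantize(values, levels):
    # Quantization: map each amplitude in [0, 1] to the nearest of `levels` gray levels.
    return [min(levels - 1, int(v * levels)) for v in values]

# A hypothetical continuous intensity profile along one image row.
profile = sample(lambda x: 0.5 + 0.5 * math.sin(x), n_samples=8, length=2 * math.pi)
digital_row = quantize(profile, levels=4)   # 4 gray levels -> values in {0, 1, 2, 3}
print(digital_row)
```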

2.6 Coordinate convention:

The result of sampling and quantization is a matrix of real numbers. We use two principal

ways to represent digital images. Assume that an image f(x, y) is sampled so that the resulting

image has M rows and N columns. We say that the image is of size M × N. The values of the

coordinates (x, y) are discrete quantities. For notational clarity and convenience, we use

integer values for these discrete coordinates. In many image processing books, the image origin

is defined to be at (x, y) = (0, 0). The next coordinate value along the first row of the image

is (x, y) = (0, 1). It is important to keep in mind that the notation (0, 1) is used to signify the
second sample along the first row. It does not mean that these are the actual values of physical

coordinates when the image was sampled. Following figure shows the coordinate convention.

Note that x ranges from 0 to M-1 and y from 0 to N-1 in integer increments.

The coordinate convention used in the toolbox to denote arrays is different from the

preceding paragraph in two minor ways. First, instead of using (x, y), the toolbox uses the

notation (r, c) to indicate rows and columns. Note, however, that the order of coordinates is

the same as the order discussed in the previous paragraph, in the sense that the first element of

a coordinate tuple, (a, b), refers to a row and the second to a column. The other difference is

that the origin of the coordinate system is at (r, c) = (1, 1); thus, r ranges from 1 to M and c

from 1 to N in integer increments. IPT documentation refers to these as pixel coordinates. Less frequently

the toolbox also employs another coordinate convention, called spatial coordinates, which uses

x to refer to columns and y to refer to rows. This is the opposite of our use of the variables x and

y.

2.7 Image as Matrices:

The preceding discussion leads to the following representation for a digitized image function:

            f(0,0)     f(0,1)     ...   f(0,N-1)
            f(1,0)     f(1,1)     ...   f(1,N-1)
f(x, y) =     .          .                 .
              .          .                 .
            f(M-1,0)   f(M-1,1)   ...   f(M-1,N-1)


The right side of this equation is a digital image by definition. Each element of this array

is called an image element, picture element, pixel or pel. The terms image and pixel are used

throughout the rest of our discussions to denote a digital image and its elements.

A digital image can be represented naturally as a MATLAB matrix:

        f(1,1)   f(1,2)   ...   f(1,N)
        f(2,1)   f(2,2)   ...   f(2,N)
f =       .        .               .
        f(M,1)   f(M,2)   ...   f(M,N)

where f(1,1) = f(0,0) (note the use of a monospace font to denote

MATLAB quantities). Clearly the two representations are identical, except for the shift in

origin. The notation f(p ,q) denotes the element located in row p and the column q. For example

f(6,2) is the element in the sixth row and second column of the matrix f. Typically we use the

letters M and N respectively to denote the number of rows and columns in a matrix. A 1xN

matrix is called a row vector whereas an Mx1 matrix is called a column vector. A 1x1 matrix

is a scalar.

Matrices in MATLAB are stored in variables with names such as A, a, RGB, real array

and so on. Variables must begin with a letter and contain only letters, numerals and

underscores. As noted in the previous paragraph, all MATLAB quantities are written using

monospace characters. We use conventional Roman italic notation, such as f(x, y), for

mathematical expressions.
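The shift between the 0-based mathematical convention f(0,0) and MATLAB's 1-based indexing f(1,1) can be mimicked in Python, where nested lists are 0-based. The tiny 2×3 image below is invented for the example; `matlab_at` is a hypothetical helper, not part of any library.

```python
# A hypothetical 2x3 digital image as a matrix (nested list), 0-based as in f(x, y).
f = [[10, 20, 30],
     [40, 50, 60]]

def matlab_at(img, p, q):
    # Emulate MATLAB's 1-based f(p, q): element in row p, column q.
    return img[p - 1][q - 1]

# MATLAB's f(1,1) corresponds to the mathematical origin f(0,0):
print(matlab_at(f, 1, 1), f[0][0])   # both give the same pixel
print(matlab_at(f, 2, 3))            # element in row 2, column 3
```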

2.8 Reading Images:

Images are read into the MATLAB environment using function imread whose syntax is
imread(‘filename’)

Format name Description recognized extension

TIFF Tagged Image File Format .tif, .tiff

JPEG Joint Photographic Experts Group .jpg, .jpeg

GIF Graphics Interchange Format .gif

BMP Windows Bitmap .bmp

PNG Portable Network Graphics .png

XWD X Window Dump .xwd

Here filename is a string containing the complete name of the image file (including any applicable

extension). For example, the command line

>> f = imread('chestxray.jpg');

reads the JPEG image (see the table above) chestxray into image array f. Note the use of single quotes

(‘) to delimit the string filename. The semicolon at the end of a command line is used by

MATLAB for suppressing output. If a semicolon is not included, MATLAB displays the

results of the operation(s) specified in that line. The prompt symbol (>>) designates the

beginning of a command line, as it appears in the MATLAB command window.

When, as in the preceding command line, no path is included in filename, imread reads the

file from the current directory and, if that fails, tries to find the file on the MATLAB search

path. The simplest way to read an image from a specified directory is to include a full or relative

path to that directory in filename.


For example,

>> f = imread ( ‘D:\myimages\chestxray.jpg’);

reads the image from a folder called myimages on the D: drive, whereas

>> f = imread('.\myimages\chestxray.jpg');

reads the image from the myimages subdirectory of the current working

directory. The Current Directory window on the MATLAB desktop toolbar displays

MATLAB’s current working directory and provides a simple, manual way to change it.

The table above lists some of the most popular image/graphics formats supported by imread

and imwrite.

2.9 Data Classes

Although we work with integer coordinates, the values of the pixels themselves are not

restricted to be integers in MATLAB. The table above lists the various data classes supported by

MATLAB and IPT for representing pixel values. The first eight entries in the table are referred

to as numeric data classes. The ninth entry is the char class and, as shown, the last entry is

referred to as the logical data class.

All numeric computations in MATLAB are done with double quantities, so double is a

frequently encountered data class in image processing applications. Class uint8 is also encountered

frequently, especially when reading data from storage devices, as 8-bit images are the most

common representation found in practice. These two data classes, class logical and, to a

lesser degree, class uint16 constitute the primary data classes on which we focus. Many IPT

functions, however, support all the data classes listed in the table. Data class double requires 8 bytes

to represent a number, uint8 and int8 require one byte each, uint16 and int16 require 2 bytes

each, and uint32, int32 and single require 4 bytes each.

2.10 Image Types

The toolbox supports four types of images:

1. Intensity images

2. Binary images

3. Indexed images

4. RGB images

Most monochrome image processing operations are carried out using binary or intensity

images, so our initial focus is on these two image types. Indexed and RGB colour images are

discussed in the sections that follow.

2.10.1 Intensity Images:

An intensity image is a data matrix whose values have been scaled to represent intensities.

When the elements of an intensity image are of class uint8 or class uint16, they have integer

values in the range [0, 255] or [0, 65535], respectively. If the image is of class double, the

values are floating-point numbers. Values of scaled, double intensity images are in the range

[0, 1] by convention.
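The relation between uint8 intensities in [0, 255] and scaled double intensities in [0, 1] can be sketched in Python; in MATLAB the analogous conversion is performed by im2double. The one-row image below is invented for the example.

```python
def uint8_to_double(img):
    # Scale 8-bit integer intensities [0, 255] to floating-point values in [0, 1].
    return [[v / 255.0 for v in row] for row in img]

gray = [[0, 128, 255]]        # hypothetical one-row uint8 intensity image
print(uint8_to_double(gray))  # 0 maps to 0.0, 255 maps to 1.0
```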

2.10.2 Binary Images

Binary images have a very specific meaning in MATLAB. A binary image is a logical

array of 0s and 1s. Thus, an array of 0s and 1s whose values are of a numeric data class, say uint8,

is not considered a binary image in MATLAB. A numeric array is converted to binary using the

function logical. Thus, if A is a numeric array consisting of 0s and 1s, we create a logical array B

using the statement

B = logical(A)

If A contains elements other than 0s and 1s, use of the logical function converts all

nonzero quantities to logical 1s and all entries with value 0 to logical 0s.

Using relational and logical operators also creates logical arrays.

To test whether an array is logical, we use the function islogical:

islogical(c)

If c is a logical array, this function returns a 1; otherwise it returns a 0. Logical arrays can be

converted to numeric arrays using the data class conversion functions.
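The behaviour of logical and islogical described above can be mimicked in Python with booleans: every nonzero entry becomes True (logical 1) and every zero entry becomes False (logical 0). The helper names and array contents below are invented for the example.

```python
def to_logical(A):
    # Like MATLAB's logical(A): nonzero -> True (1), zero -> False (0).
    return [[bool(v) for v in row] for row in A]

def is_logical(B):
    # Like islogical: true only if every element is a boolean.
    return all(isinstance(v, bool) for row in B for v in row)

A = [[0, 2, 0],
     [5, 0, 1]]          # numeric array with entries other than 0s and 1s
B = to_logical(A)
print(B)                 # [[False, True, False], [True, False, True]]
print(is_logical(A), is_logical(B))
```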

2.10.3 Indexed Images:

An indexed image has two components:

A data matrix of integers, x.

A colormap matrix, map.

Matrix map is an m×3 array of class double containing floating-point values in the range

[0, 1]. The length m of the map is equal to the number of colours it defines. Each row of map

specifies the red, green and blue components of a single colour. An indexed image uses “direct

mapping” of pixel intensity values to colormap values. The colour of each pixel is determined by

using the corresponding value of the integer matrix x as a pointer into map. If x is of class double,

then all of its components with values less than or equal to 1 point to the first row in map, all

components with value 2 point to the second row, and so on. If x is of class uint8 or uint16,

then all components with value 0 point to the first row in map, all components with value 1 point to

the second, and so on.
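Direct mapping in an indexed image can be sketched in Python: each integer in the data matrix is used as a pointer into a row of the colormap (0-based here, matching the uint8 convention above, where value 0 points to the first row). The colormap and index matrix below are invented for the example.

```python
# Hypothetical colormap: m x 3 rows of [R, G, B] values in [0, 1].
cmap = [[0.0, 0.0, 0.0],   # row 0: black
        [1.0, 0.0, 0.0],   # row 1: red
        [1.0, 1.0, 1.0]]   # row 2: white

x = [[0, 1],
     [2, 1]]               # hypothetical uint8-style index matrix

def to_rgb(index_matrix, colormap):
    # Each index is a pointer into the colormap (value 0 -> first row).
    return [[colormap[i] for i in row] for row in index_matrix]

rgb = to_rgb(x, cmap)
print(rgb[0][1])   # the pixel indexed 1 maps to the red row
```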

2.10.4 RGB Image

An RGB colour image is an M×N×3 array of colour pixels, where each colour pixel is a triplet

corresponding to the red, green and blue components of an RGB image at a specific spatial

location. An RGB image may be viewed as a “stack” of three grayscale images that, when fed into

the red, green and blue inputs of a colour monitor, produce a colour image on the screen. By

convention, the three images forming an RGB colour image are referred to as the red, green and

blue component images. The data class of the component images determines their range of

values. If an RGB image is of class double, the range of values is [0, 1].

Similarly, the range of values is [0, 255] or [0, 65535] for RGB images of class uint8 or

uint16, respectively. The number of bits used to represent the pixel values of the component

images determines the bit depth of an RGB image. For example, if each component image is

an 8-bit image, the corresponding RGB image is said to be 24 bits deep.

Generally, the number of bits in all component images is the same. In this case the number

of possible colours in an RGB image is (2^b)^3, where b is the number of bits in each component

image. For the 8-bit case the number is 16,777,216 colours.
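The colour-count formula above, (2^b)^3, can be checked directly for the 8-bit case with a one-line Python calculation.

```python
def rgb_colours(bits_per_component):
    # (2^b)^3 possible colours when each of R, G, B uses b bits.
    return (2 ** bits_per_component) ** 3

print(rgb_colours(8))   # 16,777,216 colours for a 24-bit-deep RGB image
```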


2.11 Applications

1. The system can be used in places that require security, like banks, the military etc.

2. It can also be used in houses and housing societies to recognize outsiders and record their identity.

3. The software can be used to mark attendance based on face recognition in organizations.
CHAPTER – 3
METHODOLOGY

In the proposed system, the process is initiated from a mobile device. Once triggered, the system starts processing the image for which we want to mark attendance. The image capturing phase is the one in which we capture the image; this is the basic phase with which the system is initialized. We capture an image from a camera, which is checked for certain constraints like lighting, spacing, density and facial expressions. The captured image is checked against our requirements; once accepted, we make sure it is in PNG or JPEG format, else it is converted. We capture each individual's different frontal postures so that accuracy can be attained to the maximum extent. This forms the training database, in which every individual has been classified based on labels. In the captured image, we detect frontal faces using the Viola-Jones algorithm, which detects only the frontal face posture of every individual in the image. This detects only faces and discards every other part, since we are exploring the features of faces alone. The detected faces are stored in the test database for further enquiry. Features are extracted in the extraction phase: the detected bounding boxes are queried for feature extraction, and the extracted features are stored in a matrix. This feature extraction is done for every detected face. The features we look at here are Shape, Edge, Colour, Wavelet, Auto-Correlation and LBP. A face is recognized once feature extraction is complete: the features already trained for every individual are compared with the detected face's features, and if the features match, the face is recognized. Once recognized, the result is updated in the student attendance database. When the process is completed, the testing images are deleted, since we are designing the system for both accuracy and efficiency.
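The phases described above (capture, detect, extract, match, update) can be outlined as a skeleton. This is a hedged Python sketch, not the project's MATLAB code: detection and feature extraction are assumed to have already produced feature vectors (in the report they come from the Viola-Jones detector and the feature extractors listed above), and the threshold, IDs and vectors are invented.

```python
import math

def euclidean(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mark_attendance(detected_features, training_db, attendance, threshold=0.5):
    # detected_features: feature vectors extracted from faces in the captured image.
    # training_db: {student_id: enrolled feature vector} built during training.
    # A detected face is matched to the closest enrolled face within the threshold.
    for feats in detected_features:
        best_id = min(training_db, key=lambda sid: euclidean(feats, training_db[sid]))
        if euclidean(feats, training_db[best_id]) <= threshold:
            attendance[best_id] = "present"   # automated database update
    return attendance

# Hypothetical 2-dimensional feature vectors for two enrolled students.
db = {"15KN1A05B7": [0.1, 0.2], "15KN1A05A1": [0.9, 0.8]}
register = {sid: "absent" for sid in db}
print(mark_attendance([[0.12, 0.21]], db, register))
```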


CHAPTER – 4
SYSTEM ANALYSIS
4.1 EXISTING SYSTEM
A human does all of this automatically and instantaneously. Computers are incapable
of this kind of high-level generalization, so we need to teach or program each step of face
recognition separately. Face recognition systems fall into two categories: verification and
identification. Face verification is a 1:1 match that compares a face image against a template
face image whose identity is being claimed. In contrast, face identification is a 1:N
problem that compares a query face image against all the template images in the database.

4.2 PROPOSED SYSTEM


1. The method proposed in this project is to mark attendance using a face recognition
technique. The attendance is recorded using a camera that streams video of the
students, detects the faces in the image, compares the detected faces with the student
database and marks the attendance.

2. The project has two main parts:

1. Development of Face Recognition System.


2. Development of Attendance System.

3. Face recognition is achieved using machine learning and the basic pipeline used for it
is as follows:

1. Find a face in an image.

2. Analyse its facial features.

3. Compare against known faces and make a prediction.

4.3 MODULES

User
CHAPTER – 5
INTRODUCTION TO MATLAB
MATLAB is a high-performance language for technical computing. It integrates computation,

visualization, and programming in an easy-to-use environment where problems and solutions

are expressed in familiar mathematical notation. Typical uses include

Math and computation

Algorithm development

Data acquisition

Modelling, simulation, and prototyping

Data analysis, exploration, and visualization

Scientific and engineering graphics

Application development, including graphical user interface building.

MATLAB is an interactive system whose basic data element is an array that does not

require dimensioning. This allows you to solve many technical computing problems, especially

those with matrix and vector formulations, in a fraction of the time it would take to write a

program in a scalar non interactive language such as C or FORTRAN.

The name MATLAB stands for matrix laboratory. MATLAB was originally written to

provide easy access to matrix software developed by the LINPACK and EISPACK projects.

Today, MATLAB engines incorporate the LAPACK and BLAS libraries, embedding the state

of the art in software for matrix computation.

MATLAB has evolved over a period of years with input from many users. In university

environments, it is the standard instructional tool for introductory and advanced courses in
mathematics, engineering, and science. In industry, MATLAB is the tool of choice for high-

productivity research, development, and analysis.

MATLAB features a family of add-on application-specific solutions called toolboxes.

Very important to most users of MATLAB, toolboxes allow you to learn and apply specialized

technology. Toolboxes are comprehensive collections of MATLAB functions (M-files) that

extend the MATLAB environment to solve particular classes of problems. Areas in which

toolboxes are available include signal processing, control systems, neural networks, fuzzy

logic, wavelets, simulation, and many others.

The MATLAB System:

The MATLAB system consists of five main parts:

Development Environment:

This is the set of tools and facilities that help you use MATLAB functions and files.

Many of these tools are graphical user interfaces. It includes the MATLAB desktop and

Command Window, a command history, an editor and debugger, and browsers for viewing

help, the workspace, files, and the search path.

The MATLAB Mathematical Function:

This is a vast collection of computational algorithms ranging from elementary functions like

sum, sine, cosine, and complex arithmetic, to more sophisticated functions like matrix inverse,

matrix eigenvalues, Bessel functions, and fast Fourier transforms.

The MATLAB Language:

This is a high-level matrix/array language with control flow statements, functions, data

structures, input/output, and object-oriented programming features. It allows both

"programming in the small" to rapidly create quick and dirty throw-away programs, and

"programming in the large" to create complete large and complex application programs.
Graphics:

MATLAB has extensive facilities for displaying vectors and matrices as graphs, as well

as annotating and printing these graphs. It includes high-level functions for two-dimensional

and three-dimensional data visualization, image processing, animation, and presentation

graphics. It also includes low-level functions that allow you to fully customize the appearance

of graphics as well as to build complete graphical user interfaces on your MATLAB

applications.

The MATLAB Application Program Interface (API):

This is a library that allows you to write C and Fortran programs that interact with MATLAB.

It includes facilities for calling routines from MATLAB (dynamic linking), calling MATLAB

as a computational engine, and for reading and writing MAT-files.

MATLAB working environment:

MATLAB desktop:

The MATLAB desktop is the main MATLAB application window. The desktop contains five sub-

windows: the Command Window, the Workspace Browser, the Current Directory window, the

Command History window, and one or more figure windows, which are shown only when the

user displays a graphic.

The command window is where the user types MATLAB commands and expressions at

the prompt (>>) and where the output of those commands is displayed. MATLAB defines the

workspace as the set of variables that the user creates in a work session. The workspace browser

shows these variables and some information about them. Double-clicking on a variable in the

Workspace Browser launches the Array Editor, which can be used to obtain information about,

and in some instances edit, certain properties of the variable.

The Current Directory tab above the Workspace tab shows the contents of the current

directory, whose path is shown in the Current Directory window. For example, in the Windows

operating system the path might be as follows: C:\MATLAB\Work, indicating that directory

“Work” is a subdirectory of the main directory “MATLAB”, which is installed in

drive C. Clicking on the arrow in the Current Directory window shows a list of recently used

paths. Clicking on the button to the right of the window allows the user to change the current

directory.

MATLAB uses a search path to find M-files and other MATLAB-related files, which
are organized in directories in the computer file system. Any file run in MATLAB must reside
in the current directory or in a directory that is on the search path. By default, the files supplied
with MATLAB and MathWorks toolboxes are included in the search path. The easiest way to
see which directories are on the search path, or to add or modify a search path, is to select
Set Path from the File menu of the desktop, and then use the Set Path dialog box. It is good
practice to add any commonly used directories to the search path to avoid repeatedly having
to change the current directory.
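The search path can also be inspected and extended from the Command Window; a small sketch follows (the directory name below is hypothetical):

```matlab
% Display the current search path
path
% Add a commonly used directory to the search path for this session
addpath('C:\MATLAB\Work\myfunctions')   % hypothetical directory
% Verify that an M-file can now be located on the path
which pixelup   % prints the full path of pixelup.m if it is found
```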

The Command History Window contains a record of the commands a user has entered in
the Command Window, including both current and previous MATLAB sessions. Previously
entered MATLAB commands can be selected and re-executed from the Command History
Window by right-clicking on a command or sequence of commands. This action launches a
menu from which to select various options in addition to executing the commands. This is a
useful feature when experimenting with various commands in a work session.

Using the MATLAB Editor to create M-Files:

The MATLAB editor is both a text editor specialized for creating M-files and a graphical
MATLAB debugger. The editor can appear in a window by itself, or it can be a sub-window in
the desktop. M-files are denoted by the extension .m, as in pixelup.m. The MATLAB editor
window has numerous pull-down menus for tasks such as saving, viewing, and debugging files.
Because it performs some simple checks and also uses colour to differentiate between various
elements of code, this text editor is recommended as the tool of choice for writing and editing
M-functions. Typing edit filename at the prompt opens the M-file filename.m in an
editor window, ready for editing. As noted earlier, the file must be in the current directory, or
in a directory in the search path.

Image Representation

Image Format:
An image is a rectangular array of values (pixels). Each pixel represents the measurement of
some property of a scene measured over a finite area. The property could be many things, but
we usually measure either the average brightness (one value) or the brightness of the image
filtered through red, green and blue filters (three values). The values are normally represented
by an eight-bit integer, giving a range of 256 levels of brightness. We talk about the resolution
of an image: this is defined by the number of pixels and the number of brightness values. A raw
image will take up a lot of storage space. Methods have been defined to compress the image by
coding redundant data in a more efficient fashion, or by discarding the perceptually less
significant information. MATLAB supports reading all of the common image formats. Image
coding is not addressed here.
Image Loading, Displaying and Saving:
An image is loaded into working memory using the command

>> f = imread(‘image file name’);

The semicolon at the end of the command suppresses MATLAB output. Without it, MATLAB

will execute the command and echo the results to the screen. We assign the image to the array

f. If no path is specified, MATLAB will look for the image file in the current directory. The

image can be displayed using >> imshow(f, G)

f is the image to be displayed, and G defines the range of intensity levels used to display it. If it
is omitted, the default value 256 is used. If the syntax [low, high] is used instead of G, values
less than low are displayed as black, and ones greater than high are displayed as white. Finally,
if low and high are left out, i.e. [ ] is used, low is set to the minimum value in the image and
high to the maximum one, which is useful for automatically fixing the range of an image whose
values span a very small or very large range. Images are usually displayed in a figure window.
If a second image is

displayed it will overwrite the first, unless the figure function is used:

>> figure, imshow(f)

will generate a new figure window and display the image in it. Note that multiple functions

may be called in a single line, provided they are separated by commas. An image array may be

written to file using:

>> imwrite (array name, ‘file name’)

The format of the file can be inferred from the file extension, or can be specified by a third
argument. Certain file formats have additional arguments.

Image Information:
Information about an image file may be found by


>> imfinfo filename

Quantisation

Grey Level Ranges:
Images are normally captured with pixels in each channel being represented by eight-bit
integers. (This is partly for historical reasons: it has the convenience of being a basic memory
unit, it allows for a suitable range of values to be represented, and many cameras could not
capture data to any greater accuracy. Further, most displays are limited to eight bits per red,
green and blue channel.) But there is no reason why pixels should be so limited; indeed, there
are devices and applications that deliver and require higher resolution data.

MATLAB provides functions for changing images from one data class to another. The syntax is

>> B = data_class_name(A)

where data_class_name is one of MATLAB's numeric data classes (e.g. uint8, uint16, double), as in

>> B = uint8(A)

Number of Pixels:
Images come in all sizes, but are (almost) always rectangular. MATLAB gives several methods
of accessing the elements of an array, i.e. the pixels of an image. An element can be accessed
directly: typing the array name at the prompt will return all the array elements (which could
take a while), while typing the array name followed by element indices in round brackets will
return that value. Ranges of array elements can be accessed using colons.

>> A(first: last)

will return the first to last elements inclusive of the one-dimensional array A. Note that the
indices start at one.

>> A(first : step : last)

will return every step-th element starting from first and finishing when last is reached or
exceeded. step could be negative, in which case you'd have to ensure that first was greater than
last. Naturally, this notation can be extended to access portions of an image. An image, f, could
be flipped using

>> fp = f(end : -1 : 1, :);

The keyword end is used to signify the last index. Using the colon alone implies that all index
values are traversed. This also indicates how multi-dimensional arrays are accessed. A section
of an image could be extracted using

>> fc = f(top : bottom, left : right);

or the image could be subsampled using

>> fs = f(1 : 2 : end, 1 : 2 : end);

A note on colour images: if the input image is colour, these operations will return greyscale
results. A colour image has three values per pixel, which are accessed using a third index.

>> A(x, y, 1:3)

would return all three colour values of the pixel at (x, y). A colour plane could be extracted
using

>> R = A(x, y, 1);

and similarly for G (last index = 2) and B (last index = 3).

Point Processing:
Point processing operations manipulate individual pixel values, without regard to any
neighbouring values. Two types of transform can be identified, manipulating the two
properties of a pixel: its value and its position.

Value Manipulation:
The fundamental value of a pixel is its brightness (in a monochrome image) or colour (in a
multichannel image).

Pixel Scaling:
Scaling of pixel values is achieved by multiplying by a constant. MATLAB provides a single
function that achieves several effects:

>> R = imadjust(A, [low_in, high_in], [low_out, high_out], gamma);

This takes the specified input range of values and maps them to the specified output range.
Values outside of the input range are clamped to the extremes of the output range (values
below low_in are all mapped to low_out). The range values are expected to be in the interval
[0, 1]. The function scales them to values appropriate to the image type before applying the
scaling operation. Whilst low_in is expected to be less than high_in, the same is not true for
low_out and high_out; the image can therefore be inverted. The value of gamma specifies the
shape of the mapping curve. Gamma = 1 gives a linear scaling, a smaller gamma gives a
mapping that expands the scale at lower values, and a larger gamma expands the upper range
of the scale. This can make the contrast between darker or brighter tones more visible,
respectively. Omitting any of the parameters results in default values being assumed: the
extremes of the ranges are used (0 and 1), and gamma = 1.
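As a short sketch of these effects (assuming f is a uint8 greyscale image already in the workspace), the calls below perform a linear contrast stretch, an inversion, and a gamma correction:

```matlab
% Linear stretch: map input range [0.2, 0.8] onto the full output range
g1 = imadjust(f, [0.2 0.8], [0 1]);
% Inversion: low_out > high_out reverses the mapping
g2 = imadjust(f, [0 1], [1 0]);
% Gamma < 1 expands the darker end of the intensity scale
g3 = imadjust(f, [], [], 0.5);
figure, imshow(g1), figure, imshow(g2), figure, imshow(g3)
```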


CHAPTER – 6
CONCLUSION

There may be various types of lighting conditions, seating arrangements and environments in
different classrooms. Most of these conditions have been tested on the system, and the system
has shown 100% accuracy in most of the cases. Students may also portray various facial
expressions, varying hair styles, beards, spectacles, etc. All of these cases were considered
and tested to obtain a high level of accuracy and efficiency. Thus, it can be concluded from
the above discussion that a reliable, secure, fast and efficient system has been developed,
replacing a manual and unreliable system. This system can be implemented for better results
regarding the management of attendance and leaves. The system will save time, reduce the
amount of work the administration has to do, replace stationery material with electronic
apparatus, and reduce the amount of human resources required for the purpose. Hence, a
system with the expected results has been developed, but there is still some room for
improvement.
CHAPTER - 7
System Requirement Specification
7.1 Functional Requirements
The functional requirements describe the activities and services that the system must provide.

1. Taking and tracking student attendance by facial recognition at a specific time.
2. Sending the names of the absent students directly to the lecturer.
3. Permitting the lecturer to modify a student's absent or late status.
4. Showing the names of students who are absent or late on the screen to avoid errors.

7.2 Non-Functional Requirements


Non-functional requirements are characteristics or attributes of the system by which its
operation can be judged. The following points clarify them:

1. Accuracy and Precision: the system should perform its process with accuracy and
precision to avoid problems.
2. Modifiability: the system should be easy to modify, and any error should be easy to correct.
3. Security: the system should be secure and preserve students' privacy.
4. Usability: the system should be easy to deal with and simple to understand.
5. Maintainability: the maintenance group should be able to fix any problem that occurs
suddenly.
6. Speed and Responsiveness: execution of operations should be fast.

7.3 Software Requirements

7.3.1. Software Requirements

1. Windows XP, Windows 7(ultimate, enterprise)

2. MATLAB

7.3.2. Hardware Components

1. Processor – i3

2. Hard Disk – 5 GB

3. Memory – 1GB RAM


CHAPTER – 8
SYSTEM DESIGN
The most creative and challenging phase of system development is system design.
It provides the understanding and procedural details necessary for the logical and physical
stages of development. In designing a new system, the system analyst must have a clear
understanding of the objectives, which the design is aiming to fulfill. The first step is to
determine how the output is to be produced and in what format. Second, input data and master
files have to be designed to meet the requirements of the proposed output. The operational
phases are handled through program construction and testing.
Design of the system can be defined as a process of applying various techniques and
principles for the purpose of defining a device, a process or a system in sufficient detail to
permit its physical realization. Thus, system design is a solution: a “how to” approach to the
creation of a new system. This important phase provides the understanding and the procedural
details necessary for implementing the system recommended in the feasibility study. The
design step provides a data design, architectural design, and a procedural design.

System design transforms the analysis model by: -


1. Defining the design goals of the project
2. Decomposing the system into smaller subsystems
3. Selection of off-the-shelf and legacy components
4. Mapping subsystems to hardware
5. Selection of persistent data management infrastructure
6. Selection of access control policy
7. Selection of global control flow mechanism
8. Handling of boundary conditions

8.1 UML Modeling


UML stands for Unified Modelling Language. UML is a standardized general-purpose
modelling language in the field of object-oriented software engineering. The standard is
managed, and was created by, the Object Management Group.
The goal is for UML to become a common language for creating models of object-
oriented computer software. In its current form, UML comprises two major components: a
meta-model and a notation. In the future, some form of method or process may also be added
to, or associated with, UML.
The Unified Modelling Language is a standard language for specifying, visualizing,
constructing and documenting the artefacts of a software system, as well as for business
modelling and other non-software systems.
The UML represents a collection of best engineering practices that have proven
successful in the modelling of large and complex systems.
The UML is a very important part of developing object-oriented software and the
software development process. The UML uses mostly graphical notations to express the design
of software projects.

8.1.1 Importance of UML in Modelling


The Primary goals in the design of the UML are as follows:
1. Provide users a ready-to-use, expressive visual modelling Language so that they can
develop and exchange meaningful models.
2. Provide extendibility and specialization mechanisms to extend the core concepts.
3. Be independent of particular programming languages and development processes.
4. Provide a formal basis for understanding the modelling language.
5. Encourage the growth of OO tools market.
6. Support higher level development concepts such as collaborations, frameworks, patterns
and components.
7. Integrate best practices.

8.2 Class Diagram

8.3 Use case Diagram


8.4 Sequence Diagram

8.5 Component diagram


CHAPTER – 9
CODING
9.1 Sample Code
function varargout = main(varargin)
% MAIN MATLAB code for main.fig
% MAIN, by itself, creates a new MAIN or raises the existing
% singleton*.
%
% H = MAIN returns the handle to a new MAIN or the handle to
% the existing singleton*.
%
% MAIN('CALLBACK',hObject,eventData,handles,...) calls the local
% function named CALLBACK in MAIN.M with the given input arguments.
%
% MAIN('Property','Value',...) creates a new MAIN or raises the
% existing singleton*. Starting from the left, property value pairs are
% applied to the GUI before main_OpeningFcn gets called. An
% unrecognized property name or invalid value makes property application
% stop. All inputs are passed to main_OpeningFcn via varargin.
%
% *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
% instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help main

% Last Modified by GUIDE v2.5 11-May-2015 19:47:52

% Begin initialization code - DO NOT EDIT


gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
'gui_Singleton', gui_Singleton, ...
'gui_OpeningFcn', @main_OpeningFcn, ...
'gui_OutputFcn', @main_OutputFcn, ...
'gui_LayoutFcn', [] , ...
'gui_Callback', []);
if nargin && ischar(varargin{1})
gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before main is made visible.


function main_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% varargin command line arguments to main (see VARARGIN)

% Choose default command line output for main


handles.output = hObject;

% Update handles structure


guidata(hObject, handles);

% UIWAIT makes main wait for user response (see UIRESUME)


% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = main_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% standard size of image is 300 *300


global co
clc
warning off
st = version;
if str2double(st(1)) < 8
beep
hx = msgbox('PLEASE RUN IT ON MATLAB 2013 or Higher','INFO...!!!','warn','modal');
pause(3)
delete(hx)
close(gcf)
return
end
co = get(hObject,'color');
addpath(pwd,'database','codes')
if size(ls('database'),2) == 2
delete('features.mat');
delete('info.mat');
end
% Get default command line output from handles structure
varargout{1} = handles.output;

function edit1_Callback(hObject, eventdata, handles)


% hObject handle to edit1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit1 as text


% str2double(get(hObject,'String')) returns contents of edit1 as a
double

% --- Executes during object creation, after setting all properties.


function edit1_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'),
get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton1.


function pushbutton1_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
p = get(handles.edit1,'UserData');
if strcmp(p,'123') == 1
delete(hObject);
delete(handles.pushbutton2)
delete(handles.edit1);
delete(handles.text2);
delete(handles.text3);
% delete(handles.text1);
%delete(handles.text4);
% msgbox('WHY DONT U READ HELP BEFORE STARTING','HELP....!!!','help','modal')
set(handles.AD_NW_IMAGE,'enable','on')
set(handles.DE_LETE,'enable','on')
set(handles.TRAIN_ING,'enable','on')
set(handles.STA_RT,'enable','on')
set(handles.RESET_ALL,'enable','on')
set(handles.EXI_T,'enable','on')
set(handles.HE_LP,'enable','on')
set(handles.DATA_BASE,'enable','on')
set(handles.text5,'visible','on')
else
msgbox('INVALID PASSWORD FRIEND... XX','WARNING....!!!','warn','modal')
end

% --- Executes on button press in pushbutton2.


function pushbutton2_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
close gcf

% --------------------------------------------------------------------
function AD_NW_IMAGE_Callback(hObject, eventdata, handles)
% hObject handle to AD_NW_IMAGE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function DE_LETE_Callback(hObject, eventdata, handles)
% hObject handle to DE_LETE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function TRAIN_ING_Callback(hObject, eventdata, handles)
% hObject handle to TRAIN_ING (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function STA_RT_Callback(hObject, eventdata, handles)
% hObject handle to STA_RT (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function DATA_BASE_Callback(hObject, eventdata, handles)
% hObject handle to DATA_BASE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function RESET_ALL_Callback(hObject, eventdata, handles)
% hObject handle to RESET_ALL (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function EXI_T_Callback(hObject, eventdata, handles)
% hObject handle to EXI_T (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function HE_LP_Callback(hObject, eventdata, handles)
% hObject handle to HE_LP (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function READ_ME_Callback(hObject, eventdata, handles)
% hObject handle to READ_ME (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
%winopen('help.pdf')

% --------------------------------------------------------------------
function PRE_CAP_Callback(hObject, eventdata, handles)
% hObject handle to PRE_CAP (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
if exist('features.mat','file') == 0
msgbox('FIRST TRAIN YOUR DATABASE','INFO...!!!','MODAL')
return
end
ff = dir('database');
if length(ff) == 2
h = waitbar(0,'Plz wait Matlab is scanning ur database...','name','SCANNING IS IN PROGRESS');
for k = 1:100
waitbar(k/100)
pause(0.03)
end
close(h)
msgbox({'NO IMAGE FOUND IN DATABASE';'FIRST LOAD YOUR DATABASE';'USE ''ADD NEW IMAGE'' MENU'},'WARNING....!!!','WARN','MODAL')
return
end
fd = vision.CascadeObjectDetector();
[f,p] = uigetfile('*.jpg','PLEASE SELECT AN FACIAL IMAGE');
if f == 0
return
end
p1 = fullfile(p,f);
im = imread(p1);
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
r = size(bbox,1);
if isempty(bbox)
axes(handles.axes1)
imshow(vo);
msgbox({'NO FACE IN THIS PIC';'PLEASE SELECT SINGLE FACE IMAGE'},'WARNING...!!!','warn','modal')
uiwait
cla(handles.axes1); reset(handles.axes1);
set(handles.axes1,'box','on','xtick',[],'ytick',[])
return
elseif r > 1
axes(handles.axes1)
imshow(vo);
msgbox({'TOO MANY FACES IN THIS PIC';'PLEASE SELECT SINGLE FACE IMAGE'},'WARNING...!!!','warn','modal')
uiwait
cla(handles.axes1); reset(handles.axes1);
set(handles.axes1,'box','on','xtick',[],'ytick',[])
return
end
axes(handles.axes1)
image(vo);
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
bx = questdlg({'CORRECT IMAGE IS SELECTED';'SELECT OPTION FOR FACE EXTRACTION'},'SELECT AN OPTION','MANUALLY','AUTO','CC');
if strcmp(bx,'MANUALLY') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVENT CROPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
close gcf
break
end
close gcf
end
imc = imresize(imc,[300 300]);
image(imc)
text(20,20,'\bfUr Precaptured image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
end
if strcmp(bx,'AUTO') == 1
imc = imcrop(im,[bbox(1)-50 bbox(2)-250 bbox(3)+100 bbox(4)+400]);
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(imc)
qx = questdlg({'ARE YOU SATISFIED WITH THE RESULTS?';' ';'IF YES THEN PROCEED';' ';'IF NOT BETTER DO MANUAL CROPING'},'SELECT','PROCEED','MANUAL','CC');
if strcmpi(qx,'proceed') == 1
close gcf
imc = imresize(imc,[300 300]);
axes(handles.axes1)
image(imc)
text(20,20,'\bfUr Precaptured image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
elseif strcmpi(qx,'manual') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVENT CROPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
break
end
close gcf
end
close gcf
imc = imresize(imc,[300 300]);
axes(handles.axes1)
image(imc)
text(20,20,'\bfUr Precaptured image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
else
end
end
immxx = getimage(handles.axes1);
zz = findsimilar(immxx);
zz = strtrim(zz);
fxz = imread(['database/' zz]);
q1= ehd(immxx,0.1);
q2 = ehd(fxz,0.1);
q3 = pdist([q1 ; q2]);
disp(q3)
if q3 < 0.5
axes(handles.axes2)
image(fxz)
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
text(20,20,'\bfUr Database Entered Image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes2,'xtick',[],'ytick',[],'box','on')
xs = load('info.mat');
xs1 = xs.z2;
for k = 1:length(xs1)
st = xs1{k};
stx = st{1};
if strcmp(stx,zz) == 1
str = st{2};
break
end
end
fid = fopen('attendence_sheet.txt','a');
fprintf(fid,'%s %s %s %s\r\n\n', 'Name','Date','Time', 'Attendence');
c = clock;
if c(4) > 12
s = [num2str(c(4)-12) ,':',num2str(c(5)), ':', num2str(round(c(6)))];
else
s = [num2str(c(4)) ,':',num2str(c(5)), ':', num2str(round(c(6))) ];
end
fprintf(fid,'%s %s %s %s\r\n\n', str, date, s, 'Present');
fclose(fid);
set(handles.text5,'string',['Hello ' str ' ,Your attendence has been Marked.'])
try
s = serial('com22');
fopen(s);
fwrite(s,'A');
pause(1)
fclose(s);
clear s
catch
% msgbox({'PLZ CONNECT CABLE OR';'INVALID COM PORT SELECTED'},'WARNING','WARN','MODAL')
uiwait
delete(s)
clear s
end
else
msgbox('YOU ARE NOT A VALID PERSON', 'WARNING','WARN','MODAL')
cla(handles.axes1)
reset(handles.axes1)
cla(handles.axes2)
reset(handles.axes2)

set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5);
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end

% --------------------------------------------------------------------
function LIVE_CAM_Callback(hObject, eventdata, handles)
% hObject handle to LIVE_CAM (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global co
if exist('features.mat','file') == 0
msgbox('FIRST TRAIN YOUR DATABASE','INFO...!!!','MODAL')
return
end
ff = dir('database');
if length(ff) == 2
h = waitbar(0,'Plz wait Matlab is scanning ur database...','name','SCANNING IS IN PROGRESS');
for k = 1:100
waitbar(k/100)
pause(0.03)
end
close(h)
msgbox({'NO IMAGE FOUND IN DATABASE';'FIRST LOAD YOUR DATABASE';'USE ''ADD NEW IMAGE'' MENU'},'WARNING....!!!','WARN','MODAL')
return
end

if isfield(handles,'vdx')
vid = handles.vdx;
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)

set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end
info = imaqhwinfo('winvideo');
did = info.DeviceIDs;
if isempty(did)
msgbox({'YOUR SYSTEM DO NOT HAVE A WEBCAM';' ';'CONNECT A ONE'},'WARNING....!!!!','warn','modal')
return
end
fd = vision.CascadeObjectDetector();
did = cell2mat(did);
for k = 1:length(did)
devinfo = imaqhwinfo('winvideo',k);
na(1,k) = {devinfo.DeviceName};
sr(1,k) = {devinfo.SupportedFormats};
end
[a,b] = listdlg('promptstring','SELECT A WEB CAM DEVICE','liststring',na,'ListSize', [125, 75],'SelectionMode','single');
if b == 0
return
end
if b ~= 0
frmt = sr{1,a};
[a1,b1] = listdlg('promptstring','SELECT RESOLUTION','liststring',frmt,'ListSize', [150, 100],'SelectionMode','single');
if b1 == 0
return
end
end
frmt = frmt{a1};
l = find(frmt == '_');
res = frmt(l+1 : end);
l = find(res == 'x');
res1 = str2double(res(1: l-1));
res2 = str2double(res(l+1 : end));
axes(handles.axes1)
vid = videoinput('winvideo', a);
vr = [res1 res2];
nbands = get(vid,'NumberofBands');
h2im = image(zeros([vr(2) vr(1) nbands] , 'uint8'));
preview(vid,h2im);
handles.vdx = vid;
guidata(hObject,handles)
tx = msgbox('PLZ STAND IN FRONT OF CAMERA STILL','INFO......!!!');
pause(1)
delete(tx)
kx = 0;
while 1
im = getframe(handles.axes1);
im = im.cdata;
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
axes(handles.axes2)
imshow(vo)
if size(bbox,1) > 1
msgbox({'TOO MANY FACES IN FRAME';' ';'ONLY ONE FACE IS ACCEPTED'},'WARNING.....!!!','warn','modal')
uiwait
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
return
end
kx = kx + 1;
if kx > 10 && ~isempty(bbox)
break
end
end
imc = imcrop(im,[bbox(1)+3 bbox(2)-35 bbox(3)-10 bbox(4)+70]);
imx = imresize(imc,[300 300]);
axes(handles.axes1)
image(imx)
text(20,20,'\bfUr Current image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
immxx = imx;
zz = findsimilar(immxx);
zz = strtrim(zz);
fxz = imread(['database/' zz]);
q1= ehd(immxx,0.1);
q2 = ehd(fxz,0.1);
q3 = pdist([q1 ; q2]);
disp(q3)
if q3 < 0.5
axes(handles.axes2)
image(fxz)
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
text(20,20,'\bfUr Database Entered Image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes2,'xtick',[],'ytick',[],'box','on')
xs = load('info.mat');
xs1 = xs.z2;
for k = 1:length(xs1)
st = xs1{k};
stx = st{1};
if strcmp(stx,zz) == 1
str = st{2};
break
end
end
fid = fopen('attendence_sheet.txt','a');
fprintf(fid,'%s %s %s %s\r\n\n', 'Name','Date','Time', 'Attendence');
c = clock;
if c(4) > 12
s = [num2str(c(4)-12) ,':',num2str(c(5)), ':', num2str(round(c(6)))];
else
s = [num2str(c(4)) ,':',num2str(c(5)), ':', num2str(round(c(6))) ];
end
fprintf(fid,'%s %s %s %s\r\n\n', str, date, s, 'Present');
fclose(fid);
set(handles.text5,'string',['Hello ' str ' ,Your attendence has been Marked.'])
try
s = serial('com22');
fopen(s);
fwrite(s,'A');
pause(1)
fclose(s);
clear s
catch
msgbox({'PLEASE CONNECT THE CABLE OR';'INVALID COM PORT SELECTED'},'WARNING','WARN','MODAL')
uiwait
delete(s)
clear s
end
else
msgbox('YOU ARE NOT A VALID PERSON', 'WARNING','WARN','MODAL')
cla(handles.axes1)
reset(handles.axes1)
cla(handles.axes2)
reset(handles.axes2)

set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5);

set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end
% --------------------------------------------------------------------
function SINGL_PIC_Callback(hObject, eventdata, handles)
% hObject handle to SINGL_PIC (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
flist = dir('database');
if length(flist) == 2
msgbox('NOTHING TO DELETE','INFO','modal');
return
end
cd('database')
[f,p] = uigetfile('*.jpg','SELECT A PIC TO DELETE IT');
if f == 0
cd ..
return
end
p1 = fullfile(p,f);
delete(p1)
flist = dir(pwd);
if length(flist) == 2
cd ..
return
end
for k = 3:length(flist)
z = flist(k).name;
z(strfind(z,'.') : end) = [];
nlist(k-2) = str2double(z);
end
nlist = sort(nlist);
h = waitbar(0,'PLEASE WAIT WHILE MATLAB IS RENAMING','name','PROGRESS...');
for k = 1:length(nlist)
if k ~= nlist(k)
p = nlist(k);
movefile([num2str(p) '.jpg'] , [num2str(k) '.jpg'])
waitbar(k/length(nlist),h,sprintf('RENAMED %s to %s',[num2str(p) '.jpg'],[num2str(k) '.jpg']))
end
pause(.5)
end
close(h)
cd ..

% --------------------------------------------------------------------
function MULTI_PIC_Callback(hObject, eventdata, handles)
% hObject handle to MULTI_PIC (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
flist = dir('database');
if length(flist) == 2
msgbox('NOTHING TO DELETE','INFO','modal');
return
end
for k = 3:length(flist)
na1(k-2,1) = {flist(k).name};
end
[a,b] = listdlg('promptstring','SELECT FILE/FILES TO DELETE','liststring',na1,'listsize',[125 100]);
if b == 0
return
end
cd ('database')
for k = 1:length(a)
str = na1{a(k)};
delete(str)
end
cd ..
flist = dir('database');
if length(flist) == 2
msgbox({'NOTHING TO RENAME';'ALL DELETED'},'INFO','modal');
return
end
cd('database')
flist = dir(pwd);
for k = 3:length(flist)
z = flist(k).name;
z(strfind(z,'.') : end) = [];
nlist(k-2) = str2double(z);
end
nlist = sort(nlist);
h = waitbar(0,'PLEASE WAIT WHILE MATLAB IS RENAMING','name','PROGRESS...');
for k = 1:length(nlist)
if k ~= nlist(k)
p = nlist(k);
movefile([num2str(p) '.jpg'] , [num2str(k) '.jpg'])
waitbar(k/length(nlist),h,sprintf('RENAMED %s to %s',[num2str(p) '.jpg'],[num2str(k) '.jpg']))
end
pause(.5)
end
close(h)
cd ..

% --------------------------------------------------------------------
function BR_OWSE_Callback(hObject, eventdata, handles)
% hObject handle to BR_OWSE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
[f,p] = uigetfile('*.jpg','PLEASE SELECT A FACIAL IMAGE');
if f == 0
return
end
p1 = fullfile(p,f);
im = imread(p1);
fd = vision.CascadeObjectDetector();
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
r = size(bbox,1);
if isempty(bbox)
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(vo);
msgbox({'WHAT HAVE YOU CHOSEN?';'NO FACE FOUND IN THIS PIC,';'SELECT A SINGLE FACE IMAGE.'},'WARNING...!!!','warn','modal')
uiwait
delete(fhx)
return
elseif r > 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(vo);
msgbox({'TOO MANY FACES IN THIS PIC';'PLEASE SELECT A SINGLE FACE IMAGE'},'WARNING...!!!','warn','modal')
uiwait
delete(fhx)
return
end
bx = questdlg({'CORRECT IMAGE IS SELECTED';'SELECT OPTION FOR FACE EXTRACTION'},'SELECT AN OPTION','MANUALLY','AUTO','CC');
if strcmp(bx,'MANUALLY') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVE NOT CROPPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
break
end
close gcf
end
close gcf
imc = imresize(imc,[300 300]);
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imc,n);
cd ..
while 1
qq = inputdlg('WHAT IS YOUR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANNOT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
end
if strcmp(bx,'AUTO') == 1
imc = imcrop(im,[bbox(1)-50 bbox(2)-250 bbox(3)+100 bbox(4)+400]);
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(imc)
qx = questdlg({'ARE YOU SATISFIED WITH THE RESULTS?';' ';'IF YES THEN PROCEED';' ';'IF NOT, BETTER DO MANUAL CROPPING'},'SELECT','PROCEED','MANUAL','CC');
if strcmpi(qx,'proceed') == 1
imc = imresize(imc,[300 300]);
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imc,n);
cd ..
while 1
qq = inputdlg('WHAT IS YOUR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANNOT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
close gcf
elseif strcmpi(qx,'manual') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVE NOT CROPPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
break
end
close gcf
end
close gcf
imc = imresize(imc,[300 300]);
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imc,n);
cd ..
while 1
qq = inputdlg('WHAT IS YOUR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANNOT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
else
return
end
end

% --------------------------------------------------------------------
function FRM_CAM_Callback(hObject, eventdata, handles)
% hObject handle to FRM_CAM (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global co
if isfield(handles,'vdx')
vid = handles.vdx;
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)

set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)

set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end
fd = vision.CascadeObjectDetector();
info = imaqhwinfo('winvideo');
did = info.DeviceIDs;
if isempty(did)
msgbox({'YOUR SYSTEM DOES NOT HAVE A WEBCAM';' ';'PLEASE CONNECT ONE'},'WARNING....!!!!','warn','modal')
return
end
did = cell2mat(did);
for k = 1:length(did)
devinfo = imaqhwinfo('winvideo',k);
na(1,k) = {devinfo.DeviceName};
sr(1,k) = {devinfo.SupportedFormats};
end
[a,b] = listdlg('promptstring','SELECT A WEB CAM DEVICE','liststring',na,'ListSize',[125, 75],'SelectionMode','single');
if b == 0
return
end
if b ~= 0
frmt = sr{1,a};
[a1,b1] = listdlg('promptstring','SELECT RESOLUTION','liststring',frmt,'ListSize',[150, 100],'SelectionMode','single');
if b1 == 0
return
end
end
frmt = frmt{a1};
l = find(frmt == '_');
res = frmt(l+1 : end);
l = find(res == 'x');
res1 = str2double(res(1: l-1));
res2 = str2double(res(l+1 : end));
axes(handles.axes1)
vid = videoinput('winvideo', a);
vr = [res1 res2];
nbands = get(vid,'NumberofBands');
h2im = image(zeros([vr(2) vr(1) nbands] , 'uint8'));
preview(vid,h2im);
handles.vdx = vid;
guidata(hObject,handles)
tx = msgbox('PLEASE STAND STILL IN FRONT OF THE CAMERA','INFO......!!!');
pause(1)
delete(tx)
kx = 0;
while 1
im = getframe(handles.axes1);
im = im.cdata;
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
axes(handles.axes2)
imshow(vo)
if size(bbox,1) > 1
msgbox({'TOO MANY FACES IN FRAME';' ';'ONLY ONE FACE IS ACCEPTED'},'WARNING.....!!!','warn','modal')
uiwait
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
return
end
kx = kx + 1;
if kx > 10 && ~isempty(bbox)
break
end
end
imc = imcrop(im,[bbox(1)+3 bbox(2)-35 bbox(3)-10 bbox(4)+70]);
imx = imresize(imc,[300 300]);
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(imx)
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imx,n);
cd ..
while 1
qq = inputdlg('WHAT IS YOUR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANNOT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
close gcf
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)

% --- Executes on key press with focus on edit1 and none of its controls.
function edit1_KeyPressFcn(hObject, eventdata, handles)
% hObject handle to edit1 (see GCBO)
% eventdata structure with the following fields (see UICONTROL)
% Key: name of the key that was pressed, in lower case
% Character: character interpretation of the key(s) that was pressed
% Modifier: name(s) of the modifier key(s) (i.e., control, shift) pressed
% handles structure with handles and user data (see GUIDATA)
pass = get(handles.edit1,'UserData');
v = double(get(handles.figure1,'CurrentCharacter'));
if v == 8
pass = pass(1:end-1);
set(handles.edit1,'string',pass)
elseif any(v == 65:90) || any(v == 97:122) || any(v == 48:57)
pass = [pass char(v)];
elseif v == 13
p = get(handles.edit1,'UserData');
if strcmp(p,'123') == true
delete(hObject);
delete(handles.pushbutton2)
delete(handles.pushbutton1);
delete(handles.text2);
delete(handles.text3);
% delete(handles.text1);
delete(handles.text4);
msgbox('WHY DON''T YOU READ THE HELP BEFORE STARTING','HELP....!!!','help','modal')
set(handles.AD_NW_IMAGE,'enable','on')
set(handles.DE_LETE,'enable','on')
set(handles.TRAIN_ING,'enable','on')
set(handles.STA_RT,'enable','on')
set(handles.RESET_ALL,'enable','on')
set(handles.EXI_T,'enable','on')
set(handles.HE_LP,'enable','on')
set(handles.DATA_BASE,'enable','on')
set(handles.text5,'visible','on')
return
else
beep
msgbox('INVALID PASSWORD','WARNING....!!!','warn','modal')
uiwait;
set(handles.edit1,'string','')
return
end
else
msgbox({'Invalid Password Character';'Can''t use Special Characters'},'WARNING','warn','modal')
uiwait;
set(handles.edit1,'string','')
return
end
set(handles.edit1,'UserData',pass)
set(handles.edit1,'String',char('*'*sign(pass)))

% --------------------------------------------------------------------
function VI_EW_Callback(hObject, eventdata, handles)
% hObject handle to VI_EW (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

f = dir('database');
if length(f) == 2
msgbox('YOUR DATA BASE HAS NO IMAGE TO DISPLAY','SORRY','modal')
return
end
l = length(f)-2;
while 1
a = factor(l);
if length(a) >= 4
break
end
l = l+1;
end
d = a(1: ceil(length(a)/2));
d = prod(d);
d1 = a(ceil(length(a)/2)+1 : end);
d1 = prod(d1);
zx = sort([d d1]);
figure('menubar','none','numbertitle','off','name','Images of Database','color',[0.0431 0.5176 0.7804],'position',[300 200 600 500])
for k = 3:length(f)
im = imread(fullfile('database',f(k).name));
subplot(zx(1),zx(2),k-2)
imshow(im)
title(f(k).name,'fontsize',10,'color','w')
end

% --------------------------------------------------------------------
function Start_Training_Callback(hObject, eventdata, handles)
% hObject handle to Start_Training (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
ff = dir('database');
if length(ff) == 2
h = waitbar(0,'Please wait while MATLAB scans your database...','name','SCANNING IS IN PROGRESS');
for k = 1:100
waitbar(k/100)
pause(0.03)
end
close(h)
msgbox({'NO IMAGE FOUND IN DATABASE';'FIRST LOAD YOUR DATABASE';'USE ''ADD NEW IMAGE'' MENU'},'WARNING....!!!','WARN','MODAL')
return
end
if exist('features.mat','file') == 2
bx = questdlg({'TRAINING HAS ALREADY BEEN DONE';' ';'WANT TO TRAIN DATABASE AGAIN?'},'SELECT','YES','NO','CC');
if strcmpi(bx,'yes') == 1
builddatabase
msgbox('TRAINING DONE....PRESS OK TO CONTINUE','OK','modal')
return
else
return
end
else
builddatabase
msgbox('TRAINING DONE....PRESS OK TO CONTINUE','OK','modal')
return
end

% --------------------------------------------------------------------
function BYE_Callback(hObject, eventdata, handles)
% hObject handle to BYE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
close gcf

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% end %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

% --------------------------------------------------------------------
function ATTENDENCE_Callback(hObject, eventdata, handles)
% hObject handle to ATTENDENCE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
if exist('attendence_sheet.txt','file') == 2
winopen('attendence_sheet.txt')
else
msgbox('NO ATTENDENCE SHEET TO DISPLAY','INFO...!!!','HELP','MODAL')
end

% --------------------------------------------------------------------
function DEL_ATTENDENCE_Callback(hObject, eventdata, handles)
% hObject handle to DEL_ATTENDENCE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
if exist('attendence_sheet.txt','file') == 2
delete('attendence_sheet.txt')
msgbox('ATTENDENCE DELETED','INFO...!!!','MODAL')
else
msgbox('NO ATTENDENCE SHEET TO DELETE','INFO...!!!','HELP','MODAL')
end

% --------------------------------------------------------------------
function Untitled_1_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
x = questdlg({'Resetting will clear the following:';'1. Attendence_sheet';'2. Database';'3. features.mat';'4. Info.mat';'Do you want to continue?'},'Please select...!!');
if strcmpi(x,'yes') == 1
delete('attendence_sheet.txt')
delete('features.mat')
delete('info.mat')
cd ([pwd, '\database'])
f = dir(pwd);
for k = 3:length(f)   % skip the '.' and '..' directory entries
delete(f(k).name)
end
cd ..
cla(handles.axes1);
reset(handles.axes1);

set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2);
reset(handles.axes2);

set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
set(handles.text5,'string','')
beep
msgbox('All Reset','Info','modal')
end

% --------------------------------------------------------------------
function Untitled_2_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
cla(handles.axes1);
reset(handles.axes1);
set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2);
reset(handles.axes2);
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
set(handles.text5,'string','')

% --------------------------------------------------------------------
function Untitled_3_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function Untitled_4_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function Untitled_5_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_5 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
CHAPTER – 10
TESTING
In general, software engineers distinguish software faults from software failures. In the case of a failure, the software does not do what the user expects. A fault is a programming error that may or may not actually manifest as a failure; it can also be described as an error in the correctness of the semantics of a computer program. A fault becomes a failure when the exact computation conditions are met, one of them being that the faulty portion of the software executes on the CPU. A fault can also turn into a failure when the software is ported to a different hardware platform or a different compiler, or when the software gets extended. Software testing is the technical investigation of the product under test to provide stakeholders with quality-related information.
Software testing may be viewed as a sub-field of Software Quality Assurance (SQA), but it typically exists independently (and some companies have no SQA function at all). In SQA, software process specialists and auditors take a broader view of software and its development. They examine and change the software engineering process itself, to reduce the number of faults that end up in the code or to deliver it faster.
Regardless of the methods used or the level of formality involved, the desired result of testing is a level of confidence in the software, so that the organization is confident that the software has an acceptable defect rate. What constitutes an acceptable defect rate depends on the nature of the software: an arcade video game designed to simulate flying an airplane would presumably have a much higher tolerance for defects than software used to control an actual airliner.
A problem with software testing is that the number of defects in a software product can be very large, and the number of configurations of the product larger still. Bugs that occur infrequently are difficult to find in testing. A rule of thumb is that a system expected to function without faults for a certain length of time must have already been tested for at least that length of time. This has severe consequences for projects that aim to write long-lived, reliable software.
A common practice of software testing is that it is performed by an independent group of testers after the functionality is developed but before it is shipped to the customer. This practice often results in the testing phase being used as a project buffer to compensate for project delays. Another practice is to start software testing at the same moment the project starts and continue it as an ongoing process until the project finishes. Another common practice is for test suites to be developed during technical support escalation procedures; such tests are then maintained in regression-testing suites to ensure that future updates to the software do not repeat any of the known mistakes.
Software Testing is the process used to help identify the correctness, completeness,
security, and quality of developed computer software. Testing is a process of technical
investigation, performed on behalf of stakeholders, that is intended to reveal quality-related
information about the product with respect to the context in which it is intended to operate.
This includes, but is not limited to, the process of executing a program or application with the
intent of finding errors. Quality is not an absolute; it is value to some person. With that in mind,
testing can never completely establish the correctness of arbitrary computer software; testing
furnishes a criticism or comparison that compares the state and behavior of the product against
a specification. An important point is that software testing should be distinguished from the
separate discipline of Software Quality Assurance (SQA), which encompasses all business
process areas, not just testing.
There are many approaches to software testing, but effective testing of complex products is essentially a process of investigation, not merely a matter of creating and following routine procedure. One definition of testing is "the process of questioning a product in order to evaluate it", where the "questions" are operations the tester attempts to execute with the product, and the product answers with its behavior in reaction to the tester's probing. Although most of the intellectual processes of testing are nearly identical to those of review or inspection, the word testing also connotes the dynamic analysis of the product: putting the product through its paces. Some of the common quality attributes include capability, reliability, efficiency, portability, maintainability, compatibility and usability. A good test is sometimes described as one that reveals an error; however, more recent thinking suggests that a good test is one that reveals information of interest to someone who matters within the project community.
Testing Methodologies:
Black Box Testing
It is the testing process in which the tester tests an application without any knowledge of its internal structure. Usually test engineers are involved in black box testing.
White Box Testing
It is the testing process in which the tester tests an application with knowledge of its internal structure. Usually the developers are involved in white box testing.
Gray Box Testing
It is the process in which a combination of black box and white box techniques is used.
Types of Testing
1. Regression Testing.
2. Re-Testing.
3. Static Testing.
4. Dynamic Testing.
5. Alpha Testing.
6. Beta Testing.
7. Monkey Testing.
8. Compatibility Testing.
9. Installation Testing.

1. Regression Testing: One of the most important types of testing. Regression testing is the process in which functionality that has already been tested is tested again whenever a new change is added, in order to check that the existing functionality remains the same.
2. Re-Testing: The process in which testing is performed again on functionality that has already been tested, to make sure that reported defects are reproducible and to rule out environment issues.
3. Static Testing: Testing performed on an application while it is not being executed.
Ex: GUI, document testing.
4. Dynamic Testing: Testing performed on an application while it is being executed.
Ex: functional testing.
5. Alpha Testing: A type of user acceptance testing conducted on an application just before it is released to the customer.
6. Beta Testing: A type of UAT conducted on an application after it is released to the customer, when it is deployed into the real-time environment and accessed by real users.
7. Monkey Testing: Random inputs are supplied to the application, without predefined test cases, to check its stability.
8. Compatibility Testing: The process in which the product is tested on environments with different combinations of databases, application servers, browsers, etc., in order to check how far the product is compatible with all these platform combinations.
9. Installation Testing: The process in which the tester tries to install or deploy the module into the corresponding environment by following the guidelines given in the deployment document, and checks whether the installation is successful.

Test cases

Title Execution steps Expected output


CHAPTER – 11
SCREENSHOTS
USER LOGIN

TAKING PICTURE FROM CAMERA OR BROWSER


NAMING THE PICTURE

TRAINING THE SYSTEM WITH THE IMAGE


COMPLETION OF TRAINING

MATCHING

IF IMAGE IS MATCHED
IF IMAGE IS NOT MATCHED

IMAGES IN THE DATABASE

ATTENDANCE IN THE DATABASE


CHAPTER – 12
FUTURE ENHANCEMENT
SCOPE FOR FUTURE WORK:

1. Currently, the system has reached an accuracy level of up to 80% for partial and dense images. It can be improved further to obtain higher accuracy levels.
2. Further, two or more IP cameras can be employed and each image processed separately. The results of these can be merged to obtain better results and accuracy in denser classrooms.
CHAPTER – 13
BIBLIOGRAPHY
 http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=4153394&queryText%3Dface+detection
 http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=5274642&queryText%3Dface+detection
Advantages:

 The software can be used for security purposes in organizations and in secured zones.

 The software stores the faces that are detected and automatically marks attendance.

 The system is convenient and secure for the users.

 It saves users' time and effort.

Disadvantages:

 The system does not recognize faces properly in poor light and so may give false results.

 It can only detect faces within a limited distance.
