

I. COMPANY PROFILE

Qubis Infoway is a premier software solution provider in India. In a brief stint it has grown and
developed many customized solutions across various domains and verticals. The team works
cooperatively to produce success for clients, gaining their trust and building long-lasting
affiliations. These ties help the company give its best, generate good results and accomplish
goals within a well-planned structure. Qubis Infoway's solutions are not only productive but
also affordable for all clients, ranging from start-ups and small businesses to leading
corporations. The team of experts provides a variety of services built on sustainability, focus,
dedication and cost effectiveness, and explores all the major facets of corporate
information technology.
Qubis Infoway is a custom software development and solutions company based in
Palakkad, India. Qubis Infoway has experience in providing complex and diverse
enterprise software development solutions to a wide range of clients.
Qubis Infoway has exclusive experience in software development and takes pride in
its customer retention rate. It applies principled technology to its services, promising
outstanding business exposure for its clients, and focuses on providing services that help
clients evolve efficient business models.
Its highly skilled team continually refines its development and execution procedures
for the benefit of its clients. This enables the time-bound delivery of challenging solutions
with confidence.








II. INTRODUCTION
About the project
This project is a boon for deaf and hearing-impaired people who use Indian Sign
Language (ISL) for communication. It translates spoken English into an animated graphical
representation by displaying ISL signs on a computer screen, and it is most effective for
short questions and brief instructions. The software is also useful for blind people by
providing correct pronunciations of the text.
The goal of the software is to translate English for short, predictable communication
between an English speaker and a listener. The application enables the user to search for a
phrase and displays the animated, graphical image of the related sign; at the same time it
also displays the textual form on the screen. User testing shows that the software, which uses
a human-looking model, is a workable solution for ISL users. The application works best if
the user is working in a restricted domain.

System process
There are mainly two modules in this project:
Administrative module
User module

Module 1: Administrative options
Design
The design module includes the frame design and the data and information collection for
the entire system. It also includes the application design through which the user provides
input.

Learning the Sign Language
The learning module includes the training phase for deaf-mute users as well as the persons
involved in their real-life activities. By showing the images and videos related to a phrase,
a person with a hearing impairment can understand the signs. The signs shown by the
person are also matched against the stored reference images.
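The matching step described above can be sketched as a naive similarity comparison. This is a minimal illustration only: the function name, threshold and flat pixel lists below are assumptions, and the real system would work on decoded video frames.

```python
def match_sign(candidate, reference_signs, threshold=0.9):
    """Sketch of matching a captured sign against stored reference images.
    Images are represented as flat pixel sequences (an assumption for
    illustration). Returns the best-matching phrase if its similarity
    meets the threshold, else None."""
    best_phrase, best_score = None, 0.0
    for phrase, pixels in reference_signs.items():
        if len(pixels) != len(candidate):
            continue  # skip references of a different size
        same = sum(1 for a, b in zip(candidate, pixels) if a == b)
        score = same / len(pixels)
        if score > best_score:
            best_phrase, best_score = phrase, score
    return best_phrase if best_score >= threshold else None
```

A production matcher would use a proper image-similarity measure rather than exact pixel equality; the structure of the loop, however, stays the same.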


Dictionary Formation
Dictionary formation is the design of the specific set of words for which the
training and language generation are performed. It contains the phrases and keywords
used to communicate with the system.
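The dictionary can be pictured as a simple mapping from phrases to sign assets. This is a hedged sketch: the structure, file names and lookup function below are illustrative assumptions, not the project's actual design.

```python
# Hypothetical phrase dictionary: each phrase maps to the media assets
# used to render its ISL sign (paths are made up for illustration).
SIGN_DICTIONARY = {
    "hello":     {"image": "signs/hello.png",     "animation": "signs/hello.avi"},
    "thank you": {"image": "signs/thank_you.png", "animation": "signs/thank_you.avi"},
    "help":      {"image": "signs/help.png",      "animation": "signs/help.avi"},
}

def lookup_phrase(phrase):
    """Return the sign assets for a phrase, or None if the phrase is
    outside the restricted domain the dictionary covers."""
    return SIGN_DICTIONARY.get(phrase.strip().lower())
```

Because the system works in a restricted domain, a lookup miss simply means the phrase was never trained, which the caller can report to the user.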

Lip Synchronization
The lip-synchronization module includes the lip component that speaks the entered text. The
input is fed in through a textbox or another input medium and is then output via the
speaker.
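The text-to-speech flow can be sketched as a small pipeline. The `speak` callback below is a placeholder for whatever speech engine the real system uses; the function name and defaults are illustrative assumptions.

```python
def synthesize(text, speak=None):
    """Sketch of the lip-synchronization pipeline: the text typed into the
    input box is split into words, and each word is handed to a speech
    back end. `speak` stands in for the real TTS engine; by default the
    words are simply collected so the flow can be demonstrated."""
    spoken = []
    speak = speak or spoken.append
    for word in text.split():
        speak(word)  # in the real system: drive the speaker / lip model
    return spoken
```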

Integration
The integration of the dictionary with the sign language is done in this module. Every
module discussed above is integrated and tested in this phase. Input and output
are tested here, and the final verification with the speech companion also takes place
here.

Module 2: User options
In this module, the user can search for a phrase, search for how to pronounce the phrase,
or search for still images, and the appropriate result is displayed on the screen. The user
can also choose the presentation option to display a PowerPoint presentation. There is also
an option for capturing still images.










III. SYSTEM ANALYSIS

System analysis is a detailed study of the various operations performed by the system and
their relationships within and outside the system. The success of the system depends largely
on how clearly the project is defined, thoroughly investigated and properly carried out
through the choice of the solution. A good analysis model should provide not only a
mechanism for understanding the problem but also a framework for its solution. Thus it
should enable a smooth transition to the design model.
System analysis involves the identification of problems, objectives and
requirements, the evaluation of alternative solutions and the recommendation of the most
feasible one. In other words, system analysis is a step-by-step process of gathering,
recording and interpreting facts. It also includes studying the problems encountered in the
present system, if any, when introducing a new computing system into an organization, and
using that information to recommend improvements that suit the organization and its functions.
System analysis begins when a user or manager requests a study of a program, either in
an existing system or a projected one. It involves studying an organization's current
operations and the way it processes data to produce information, with the goal of
determining how to make it better. System analysis itself breaks into two stages: preliminary
and detailed. During preliminary analysis, the analyst determines the objectives of the
system and arrives at a preliminary report for management. System analysis then proceeds to
the second stage, the detailed analysis.
During detailed analysis, the analyst gathers facts about the old system, outlines
objectives for a new one, estimates costs, lists possible alternatives and makes
recommendations. After gathering the facts and figures, the analyst arranges them into an
organized document called the final report. Management evaluates the analyst's final report
and decides whether to halt the project or proceed with it.







System Requirements

To be used efficiently, all computer software needs certain hardware components or
other software resources to be present on a computer. These prerequisites are known as
(computer) system requirements and are often used as a guideline as opposed to an absolute
rule. Most software defines two sets of system requirements: minimum and recommended.
With increasing demand for higher processing power and resources in newer versions of
software, system requirements tend to increase over time. Industry analysts suggest that this
trend plays a bigger part in driving upgrades to existing computer systems than technological
advancements.
An operating system is the part of a computer that manages the way different programs
use its hardware and regulates the ways a user controls the computer. Operating systems
are found on almost any device that contains a computer with multiple programs, from
cellular phones and video game consoles to supercomputers, web servers and even
automobiles. Some popular modern operating systems for personal computers include
Microsoft Windows, Mac OS X, and Linux.
The project, Audio Visual Process for Deaf Mutes, has the advantage that it can operate
under any operating system, and it can monitor a system irrespective of the operating system
of the client machine. The operating system acts as an interface between an application and
the hardware: the user interacts with the hardware from "the other side". This includes
everything from simple communication to using networked file systems or even sharing
another computer's graphics or sound hardware. Some network services allow the resources
of a computer to be accessed transparently.
The functional requirement of this software is that it be usable in organizations
whose offices and departments are spread across a huge building, or across different
buildings connected by a LAN. It requires the following functions.

Server program
The server program runs on a single machine. Every client system must be connected
to the server through a port; a port is a numbered socket. The server program calls an
instance of the chat handler whenever a chat request is received. The server sends the active
user names to all connected systems, and removes a username from the active list whenever
that user closes the application.
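The active-user bookkeeping described above can be sketched as follows. Only the list and broadcast logic is modelled; the class name and methods are assumptions, and the real server would push these updates over client sockets rather than collect them in a list.

```python
class ChatServer:
    """Sketch of the server-side bookkeeping: the real server listens on
    a TCP port and spawns a chat handler per request; here only the
    active-user-list maintenance is modelled."""

    def __init__(self):
        self.active_users = []
        self.broadcasts = []  # stands in for updates pushed to clients

    def on_connect(self, username):
        if username not in self.active_users:
            self.active_users.append(username)
        self._broadcast()

    def on_disconnect(self, username):
        if username in self.active_users:
            self.active_users.remove(username)
        self._broadcast()

    def _broadcast(self):
        # In the real system this would be written to every client socket.
        self.broadcasts.append(list(self.active_users))
```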


User interface
It displays a frame with a list containing the clients' names; based on these names, the
server can perform activities such as data transfer.

External Interface Requirements:
Network
A network is a collection of computers and other devices that can move information
from one device to another. Networking extends the power of a single computer:
networks let one computer communicate with thousands of other computers.
Network programming means different things to different people. To some, network
programming means low-level programming: reading and writing network sockets,
translating network protocols, encrypting and decrypting data, and so on. To others, the
meaning is much broader and includes the design and programming of distributed
applications. Some programmers interpret network programming as internet/intranet
programming.
The defining characteristic of network programs is that they use a network in some
way to do their work. A network program does any, some or all of the following: send data,
provide services, receive data, and invoke services. In the simplest case, network programs
either send data or they receive data.
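The simplest case, one program sending data and another receiving it, can be sketched with standard TCP sockets over the loopback interface. This is an illustrative echo exchange, not the project's actual protocol; the function names are made up for the example.

```python
import socket
import threading

def run_echo_server(port_holder, ready):
    """Minimal receiver: accept one connection on the loopback interface
    and echo whatever bytes arrive back to the sender."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
    srv.listen(1)
    port_holder.append(srv.getsockname()[1])
    ready.set()                        # tell the sender which port to use
    conn, _ = srv.accept()
    data = conn.recv(1024)
    conn.sendall(data)                 # echo the data back
    conn.close()
    srv.close()

def send_and_receive(message):
    """Minimal sender: open a socket, write the message, read the reply --
    the two basic operations every network program is built from."""
    port_holder, ready = [], threading.Event()
    threading.Thread(target=run_echo_server,
                     args=(port_holder, ready), daemon=True).start()
    ready.wait()
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect(("127.0.0.1", port_holder[0]))
    cli.sendall(message.encode())
    reply = cli.recv(1024).decode()
    cli.close()
    return reply
```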

Processor
A Processor is a type of player. A Processor has control over what processing is
performed on the input media stream. In addition to rendering a data source, a processor can
also output media data through a data source so it can be presented by another player or
processor, further processed by another processor, or converted to some other format.

Existing method:
Every user uses the Internet to get the required information, but each piece of
information has to be searched for individually. Various systems are available that facilitate
real-time communication over the intranet; Google, for example, is already used to display
content in detail. The existing setup does not have facilities for chat, real-time recording of
client-system images, or file sharing among the systems. The drawbacks of the existing
system are inefficiency, search complexity and lack of information.

Proposed method:
The system will not, for the foreseeable future, replace human signers. The application
works best if the user is working in a restricted domain; the program is envisioned as working
best in limited, predictable interactions. The system uses a custom-written user interface,
commercially available 3D graphics software for the human-looking model, and connections
to a speech-recognition application programming interface and a speech engine that stores the
translation information. In addition to facilitating the sharing of files and desktops among the
participants, the proposed system allows voice chat over the network and can record images
of the client screen on the administrator's hard disk. A user can share files and desktops with
the other participants in the lab once the others give security permission. The system provides
a user-friendly environment: the software is easy to use, easy to install and implement, and
provides pictorial representations and views.

Software and Hardware Requirements

Software Requirements
The software requirement specification (SRS) forms the basis of software
development. The purpose of the software requirement specification is to improve the
understanding of the needs the software must meet. It provides a basis for validation of the
final product and its future development.


Operating System : Windows, Linux
Front End : .NET
Back End : SQL Server

Hardware Requirements

To implement a new system the choice of a processor with a maximum possible speed
is made. There should be sufficient memory to store data and software tools for efficient
processing. A keyboard as well as a Hub or switch is very much necessary for entering data
and establishing network connection between machines respectively.

Processor : Pentium III or above
Speed : Minimum 1 GHz or above
Hard disk capacity : 40 GB
RAM capacity : 256 MB RAM
CD-ROM drive : 52x speed
Keyboard : 104 keys
Mouse : Logitech
Monitor : 15-inch monitor


Background Study
The background study involves a detailed study of the system, which makes it possible
to know about the functions and operations performed within the system as well as their
relationships outside it. It mainly emphasizes the drawbacks of the existing system and the
need for the proposed system. It is the process of gathering and interpreting facts, diagnosing
problems and using the information to recommend improvements to the system. Systems
analysis is a problem-solving activity that requires intensive communication between the
system users and the system developers.
It also involves studying the way an organization currently carries out its processes,
with a view to making them work together. Defining and understanding the processes is an
important part of system analysis. A detailed study of these processes must be made using
various techniques, and the data collected through them must be scrutinized to arrive at a
conclusion. The conclusion is an understanding of how the system functions; this system is
called the existing system. The techniques used for this purpose are known as fact-finding
techniques.

The various techniques used in background study are:
Observation
Documentation
Discussions

The objective of this system study is to determine whether there is any need for the
new system. All levels of the feasibility measures have to be performed, thereby
establishing the performance the new system has to achieve. The background study is fully
based on the system development strategies.

Problem Identification
Problem identification involves studying the ways in which the organization currently
processes data and produces information with respect to its goals. Various systems are
available that facilitate real-time communication over the intranet, but no open-source
audio-visual process for deaf mutes is presently available. The existing setup does not have
facilities for chat, real-time client-screen image recording, or file sharing among the
systems. At present the users also suffer from the following problems.

Time Consuming
The transfer of the client screen is synchronous and depends on the modem speed,
which varies from city to town to village depending on the infrastructure available
there.

Inefficient
The system is inefficient as it depends on the infrastructure, so a huge amount of
money has to be spent on it.

Solution to the problem
This software is open source and facilitates real-time communication on a LAN. The
proposed system facilitates the sharing of files and desktops among the participants of the
lab. Various systems are available that facilitate real-time communication over the intranet,
but there is no system for real-time monitoring across platforms, that is, no system to
monitor the machines in a lab running different operating systems. The proposed system also
allows voice chat over the network, and it can record images of the client screen on the
administrator's hard disk. A user can share files and desktops with the other participants in
the lab once the others give security permission.

Features of Proposed System

Greater Security
More Functionalities for Admin
Lower Infrastructure Costs
Faster Communication Between Machines
Multi-System Access
Reverse Connection to a User When the Others Allow Permission
A Well-Simplified Admin Console
Voice Chat over the Network
Client Screen Image Recording on the Admin Machine

Identification of Need
From the analysis, design requirements are formulated. The requirements for the new
system are those features that must be incorporated to produce the improvements. These
requirements are determined by comparing current performance with the objectives for
acceptable system performance. The new system should have the following properties.
Optimized screen transfer
A reverse connection
Less resource occupation (CPU, memory and bandwidth)
May act as a Windows service or a UNIX daemon
Embedded HTTP server
A new Graphical user interface
Password generation
Proxy support
To achieve these features, several alternatives must be studied and evaluated. One alternative
may not satisfy all the features. The analyst then selects those that are feasible
economically, technically and operationally.
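The password-generation requirement listed above could be implemented, for instance, with Python's standard `secrets` module. The policy below (length and character classes) is an assumption for illustration, not the project's specification.

```python
import secrets
import string

def generate_password(length=12):
    """Sketch of the password-generation requirement using the
    cryptographically secure `secrets` module. The alphabet and default
    length are illustrative choices."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

Using `secrets` rather than `random` matters here: passwords drawn from a non-cryptographic generator can be predictable.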

Feasibility Study
A feasibility study is an evaluation of a proposal designed to determine the difficulty
of carrying out a designated task. Generally, a feasibility study precedes technical
development and project implementation. A feasibility study is performed to choose the
system that meets the performance requirements at the least cost. The most difficult part of a
feasibility study is the identification of the candidate systems and the evaluation of their
performances and costs. The new system involves no additional expense to implement. It has
advantages such as easy access to files from any client, accurate output for accurate input,
and a more user-friendly application. This application can be used not only in this
organization but also in other firms. So the problem is worth solving.
The major considerations involved in the feasibility study are the following:
Economic Feasibility
Operational Feasibility
Technical Feasibility
Behavioral Feasibility
Economic Feasibility
Economic feasibility analysis is commonly known as cost/benefit analysis. It answers the
question of whether the costs and timescales are right for the application and whether the
potential returns will justify the initial outlay. It is the most frequently used method for
evaluating the effectiveness of the candidate system.
Economic Feasibility includes an assessment of the one-time cost of hardware and
software and it also addresses the impact of the final system on the overall performance of the
business. The justification for the new system is it will increase the profit of the enterprise,
improve the quality of services or products, reduce expenditure or otherwise contribute
towards attaining goals of the enterprise.
The proposed system is a very cost-effective one. The candidate system can be
developed at a reasonable cost with the available hardware and software; no extra hardware,
software or other resources are needed to implement the current project. It saves a lot of
money compared to the existing system. As it is developed as an academic project, the cost
to the company is nil compared to professional development.
Technical Feasibility
Technical feasibility centers on the hardware and software of the candidate system
and the extent to which they can support the proposed system. The assessment of technical
feasibility is based on system design ideas relating to what can be accomplished with
existing technology. This involves financial considerations to accommodate technical
enhancements; if the budget is a serious constraint, the project is judged not feasible.
This feasibility check asks whether the technology is available to develop the system.
It is a study of the functions, performance and constraints that may affect the ability to
achieve an acceptable system. This project is technically feasible, though we should be
extremely careful in the selection of the software platform and the tools for development.
Technical Feasibility study is performed to check whether the proposed system is
technically feasible or not. Technical feasibility centers on the existing computer system
(hardware, software, etc) and to what extent it can support the proposed addition. This
involves financial consideration to accommodate technical enhancement. This system is
technically feasible. All the data are stored in files. The input can be done through dialog
boxes which are both interactive and user friendly. Hard copies can be obtained for future
use, by diverting the documents to a printer. Windows serves as the platform for the new
system.
Behavioral Feasibility
Behavioral feasibility deals with how people accept the new system. People are often
resistant to change, and computers are known to bring change. An estimate must be made
of how strong a reaction the users are likely to have towards the development of a
computerized system. The assessment of behavioral and social feasibility is assuming greater
importance nowadays, because working in an unacceptable environment lowers production
and potential.
The system has a graphical user interface, which makes it user friendly. The project
is also developed with requirements geared to the users' convenience, reducing their
paperwork and other time-consuming processes. User training can be done easily and
effectively.
Behavioral feasibility is an evaluation of the probability that the company is
sufficiently motivated to support the development and implementation of the application
with the necessary participation, resources, learning, etc.
The interest and support shown by the user organization during system study do not
seem to reflect any possible resistance in this regard. So from behavioral aspects the new
system is supposed to have efficient support from the company.
Operational feasibility
An operational feasibility study is performed to check whether the system is
operationally feasible or not. It is a measure of how well a proposed system solves the
problems, takes advantage of the opportunities identified during scope definition, and
satisfies the requirements identified in the requirement-analysis phase of system development.
Using command buttons throughout the application programs enhances operational
feasibility, so maintenance and modification are easier. People are inherently resistant to
change and need a sufficient amount of training, which would result in a lot of additional
expenditure for the organization. Here the proposed system is beneficial because it can be
turned into an information system that meets the organization's operating requirements.
Today there is hardly anyone who is not trained to use a computer and the Internet. The
proposed system is very user friendly and does not impose much need for training. So the
system can be judged operationally feasible.

Resource Feasibility and Time Feasibility
This involves questions such as how much time is available to build the new system,
when it can be built, whether it interferes with normal business operations, type and amount
of resources required, dependencies, etc. Contingency and mitigation plans should also be
stated here. Time Feasibility involves determining whether a proposed project can be
implemented fully within stipulated time frame. If a project takes too much time it is likely to
be rejected.
This system takes less time to display the output. So this project is said to be
technically feasible and time feasible.


IV. SYSTEM DESIGN AND DEVELOPMENT
The most creative and challenging phase of the life cycle is system design. The term
design describes a final system and the process by which it is developed. It refers to the
technical specifications that will be applied in implementations of the candidate system. The
design may be defined as the process of applying various techniques and principles for the
purpose of defining a device, a process or a system with sufficient details to permit its
physical realization.
The designer's goal is to determine how the output is to be produced and in what format;
samples of the output and input are also presented. Second, the input data and database files
have to be designed to meet the requirements of the proposed output. The processing phases
are handled through program construction and testing. Finally, details related to the
justification of the system and an estimate of the impact of the candidate system on the user
and the organization are documented and evaluated by management as a step toward
implementation.
The importance of software design can be stated in a single word: quality. Design
provides us with representations of software that can be assessed for quality. Design is the
only way we can accurately translate a customer's requirements into a complete
software product or system. Without design we risk building an unstable system that might
fail if small changes are made, that may be difficult to test, or whose quality cannot be
assessed. So it is an essential phase in the development of a software product.
Design process
The design phase focuses on the detailed implementation of the system recommended in
the feasibility study. It is a transition from a user-oriented document to a document
oriented to the programmers or database personnel. System design goes through two
phases of development:
Logical Design
Physical Design

The dataflow diagram shows the logical flow of the system and defines the
boundaries of the system. For a candidate system, it describes the inputs (sources),
outputs (destinations), databases (files) and procedures (data flows), all in a format that
meets the user's requirements. In logical design we specify the user's needs at a level of
detail that virtually determines the information flow into and out of the system and the
required data resources.
Following logical design is physical design. This produces the working system by
defining specifications that tell programmers exactly what the candidate system must do; in
turn, the necessary programs are written, or the software package modified, to accept input
from the user and perform the required operations. Logical system design is thus one
important phase of system design. When analysts prepare the logical design, they specify the
user's needs at a level of detail that virtually determines the information flow into and out of
the system and the required data resources. The logical system design covers a review of the
current physical system: its data flows, file contents, volumes, frequency and so on.
It also covers preparing output specifications that determine the format, content and
frequency of reports, including terminal specifications and locations; preparing input
specifications covering the format, content and most of the input functions, including
determining the flow of documents from the input data source to the final output location;
and preparing edit, security and control specifications. The last of these includes specifying
the rules for edit correction and backup procedures, the controls that ensure processing and
file integrity, and the implementation plan.
Finally, the analyst prepares a logical-design walkthrough of the information flow,
outputs, inputs, controls and implementation plan, and reviews the benefits, costs, target
dates and system constraints against the existing files and procedure reports.

About the tools
.NET is the framework for which we develop applications. It sits in between our
application programs and operating system. Applications developed for .NET run inside
.NET and are controlled by .NET. It supports both Windows and web applications.
.NET provides an object-oriented environment. It ensures safe execution of the code
by performing the required runtime validations. For example, it is never possible to access an
element of an array outside its boundary. Similarly, it is not possible for a program to write
into another program's area. The runtime validations performed by .NET make the entire
environment robust.
Components of .NET
.NET framework has two main components. They are:
Common Language Runtime.
.NET class library.

Common Language Runtime
The Common Language Runtime (CLR) is the environment in which all programs in
.NET run. It provides various services, like memory management and thread
management. Programs that run in the CLR need not manage memory, as it is completely
taken care of by the CLR. For example, when a program needs a block of memory, the CLR
provides the block and releases it when the program is done with it.
All programs targeted at .NET are converted to MSIL (Microsoft Intermediate Language).
MSIL is the output of the language compilers in .NET. MSIL is then converted to native code
by the JIT (just-in-time) compiler of the CLR, and the native code is then run by the CLR.
As every program is ultimately converted to MSIL in .NET, the choice of language is
purely personal. A program written in VB.NET and a program written in C# are both
converted to MSIL, which is then converted to native code and run. So whether you write a
program in C# or VB.NET, at the end it is MSIL that you get.
It is believed that VB 6.0 programmers will migrate to VB.NET, while C++ and Java
programmers switching to .NET will prefer C#, as it more closely resembles those languages.
For Java programmers, MSIL in .NET is the same concept as bytecode, and the CLR
is the counterpart of the JVM (Java Virtual Machine). So the inevitable question is: is .NET
platform independent like Java? The answer is technically yes. A program written for .NET
can run on any platform as long as .NET is made available on that platform. As of now, .NET
runs only on Windows, so .NET is technically platform independent but not yet in practice.
Efforts are underway to make .NET run on Linux; the project is called Mono and is currently
being developed. Though some people doubt Microsoft's seriousness, it would not be
surprising if Microsoft brought .NET to Linux in the future. If that happens, all your
VB.NET and C# programs could run as they are on both Windows and Linux.
The code that runs under the CLR is called managed code.


.NET Class Library
.NET comes with thousands of classes to perform all important and not-so-important
operations. Its library is completely object oriented, providing around 5000 classes to
perform just about everything.
The following are the main areas that are covered by Class library.
Data Structures
IO management
Windows and Web Controls
Database access
Multithreading
Remoting
Reflections
The above list is not exhaustive; it is only meant to give you an instant idea of how
comprehensive the library is. The most fascinating part of .NET is the class library: it is
common to all languages of .NET. That means the way you access files in VB.NET will be
exactly the same in C#, and in fact in all other languages of .NET. You learn the library only
once, but use it in every language.
Also the library is common for all types of applications. The following are different types of
applications that can make use of .NET class library.
Console applications.
Windows GUI applications.
ASP.NET applications.
XML Web services.
Windows services.
So you can leverage your knowledge of the library irrespective of the language and the
type of application you are developing. It is arguably the best thing that can happen to
programmers. Imagine moving from COBOL to C and then from C to VB: you learned how
to perform common operations three times, because those three languages didn't have any
functions in common.



Features of .NET
The following are major features of .NET. We will use these features throughout our
journey. Here is just a brief introduction to all key features of .NET.
Assemblies
An assembly is either a .DLL or an .EXE file that forms part of an application. It contains MSIL
code that is executed by the CLR. The following are other important points related to an
assembly:
It is the unit on which permissions are granted.
Every assembly carries a version number.
Assemblies contain interfaces and classes. They may also contain other resources
such as bitmaps, files, etc.
Every assembly contains assembly metadata, which holds information about the
assembly. The CLR uses this information at the time of executing the assembly.
Assemblies may be either private, used only by the application to which
they belong, or global, which can be used by any application in the system.
Two assemblies of the same name but with different versions can run side by side,
allowing applications that depend on a specific version to use the assembly of that
version.
The four parts of an assembly are:
MSIL code - The intermediate language code that the CLR compiles and executes.
Assembly manifest - Contains the name, version, culture, and information about referenced
assemblies.
Type metadata - Contains information about the types defined in the assembly.
Resources - Files such as BMP or JPG files, or any other files required by the application.
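The assembly metadata mentioned above can be inspected at run time through reflection. Here is a minimal sketch (the helper method name is invented for illustration):

```csharp
using System;
using System.Reflection;

class AssemblyInfoDemo {
    // Returns the manifest identity (name, version, culture) of the
    // assembly that defines the given type.
    public static AssemblyName InfoOf(Type t) {
        return t.Assembly.GetName();
    }

    static void Main() {
        AssemblyName name = InfoOf(typeof(string));
        Console.WriteLine(name.Name);     // name of the assembly defining System.String
        Console.WriteLine(name.Version);  // its four-part version number
    }
}
```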

Visual Studio.NET
VS.NET is the application development tool used to develop applications for .NET. It
supports development of all types of applications that .NET supports, and it provides support
for the VB.NET, C#, Visual C++ .NET and Visual J# languages. VS.NET is a single environment
that provides all the tools required to develop and debug applications.
The following are key features of Visual Studio.NET.


Languages supported
VS.NET supports application development using the language of your choice. It also
allows mixed language solutions.
IntelliSense
IntelliSense provides options that make programming in VS.NET easier than ever
before.

The following are important functions of IntelliSense:
It shows the syntax of the method that you are calling.
It completes the variable, command, or function name once you have entered enough
characters to disambiguate the term.
It displays the list of valid members for the class, structure, or namespace you type, so
that you can select one from the list. It places the selected member in your code.
Automatic brace matching indicates whether a closing brace has been entered for each
opening brace.

C#.NET
C# (pronounced "C sharp") is a simple, modern, object-oriented, and type-safe
programming language. It will immediately be familiar to C and C++ programmers. C#
combines the high productivity of Rapid Application Development (RAD) languages and the
raw power of C++.
Visual C# .NET is Microsoft's C# development tool. It includes an interactive
development environment, visual designers for building Windows and Web applications, a
compiler, and a debugger. Visual C# .NET is part of a suite of products, called Visual
Studio .NET, that also includes Visual Basic .NET, Visual C++ .NET, and the JScript
scripting language. All of these languages provide access to the Microsoft .NET Framework,
which includes a common execution engine and a rich class library. The .NET Framework
defines a "Common Language Specification" (CLS), a sort of lingua franca that ensures
seamless interoperability between CLS-compliant languages and class libraries. For C#
developers, this means that even though C# is a new language, it has complete access to the
same rich class libraries that are used by seasoned tools such as Visual Basic .NET and
Visual C++ .NET. C# itself does not include a class library.
As an object-oriented language, C# supports the concepts of encapsulation,
inheritance, and polymorphism. All variables and methods, including the Main method, the
application's entry point, are encapsulated within class definitions. A class may inherit
directly from one parent class, but it may implement any number of interfaces. Methods that
override virtual methods in a parent class require the override keyword as a way to avoid
accidental redefinition. In C#, a struct is like a lightweight class; it is a stack-allocated type
that can implement interfaces but does not support inheritance.
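These points can be sketched in a few lines; the type names here are invented for illustration:

```csharp
using System;

interface IShape { double Area(); }

class Shape : IShape {
    public virtual double Area() { return 0.0; }
}

// Single inheritance from Shape; the override keyword is required
// to redefine the virtual method, avoiding accidental redefinition.
class Square : Shape {
    private double side;               // encapsulated state
    public Square(double side) { this.side = side; }
    public override double Area() { return side * side; }
}

// A struct is a stack-allocated, lightweight type that may implement
// interfaces but cannot inherit from another struct or class.
struct Point : IShape {
    public double X, Y;
    public Point(double x, double y) { X = x; Y = y; }
    public double Area() { return 0.0; }   // a point has no area
}

class ShapeProgram {
    static void Main() {
        Shape s = new Square(3.0);     // polymorphism through the base class
        Console.WriteLine(s.Area());   // area of the square
    }
}
```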
In addition to these basic object-oriented principles, C# makes it easy to develop software
components through several innovative language constructs, including the following:

Encapsulated method signatures called delegates, which enable type-safe event
notifications.
Properties, which serve as accessors for private member variables.
Attributes, which provide declarative metadata about types at run time.
Inline XML documentation comments.
Language-Integrated Query (LINQ) which provides built-in query capabilities across
a variety of data sources.
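A brief sketch of the first three constructs above (the Counter class and its members are invented for illustration):

```csharp
using System;
using System.Linq;

class Counter {
    private int count;                     // private member variable

    // A property serving as the accessor for the private field.
    public int Count { get { return count; } }

    // A delegate-typed event: an encapsulated method signature
    // enabling type-safe notification of subscribers.
    public event Action<int> Changed = delegate { };

    public void Increment() {
        count++;
        Changed(count);                    // notify subscribers
    }
}

class CounterProgram {
    static void Main() {
        Counter c = new Counter();
        c.Changed += n => Console.WriteLine("now " + n);
        c.Increment();

        // LINQ: built-in query capability over in-memory data.
        int[] evens = Enumerable.Range(1, 10).Where(n => n % 2 == 0).ToArray();
        Console.WriteLine(evens.Length);
    }
}
```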

.NET Framework Platform Architecture
C# programs run on the .NET Framework, an integral component of Windows that
includes a virtual execution system called the common language runtime (CLR) and a unified
set of class libraries. The CLR is the commercial implementation by Microsoft of the
common language infrastructure (CLI), an international standard that is the basis for creating
execution and development environments in which languages and libraries work together
seamlessly.
Source code written in C# is compiled into an intermediate language (IL) that
conforms to the CLI specification. The IL code and resources, such as bitmaps and strings,
are stored on disk in an executable file called an assembly, typically with an extension of .exe
or .dll. An assembly contains a manifest that provides information about the assembly's types,
version, culture, and security requirements.
Language interoperability is a key feature of the .NET Framework. Because the IL
code produced by the C# compiler conforms to the Common Type System (CTS), IL
code generated from C# can interact with code that was generated from the .NET versions of
Visual Basic, Visual C++, or any of more than 20 other CTS-compliant languages. A single
assembly may contain multiple modules written in different .NET languages, and the types
can reference each other just as if they were written in the same language.
In addition to the runtime services, the .NET Framework also includes an extensive
library of over 4,000 classes organized into namespaces that provide a wide variety of useful
functionality, from file input and output to string manipulation, XML parsing, and Windows
Forms controls. The typical C# application uses the .NET Framework class library
extensively to handle common "plumbing" chores.

ADO.NET Data Architecture
Data Access in ADO.NET relies on two components: Dataset and Data Provider.

Dataset
The DataSet is a disconnected, in-memory representation of data. It can be considered
a local copy of the relevant portions of the database. The DataSet is held in memory,
and the data in it can be manipulated and updated independently of the database. When work
on the DataSet is finished, the changes can be written back to the central database.
The data in a DataSet can be loaded from any valid data source, such as a Microsoft SQL Server
database, an Oracle database, or a Microsoft Access database.
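The disconnected nature of the DataSet can be shown without any database at all; it is simply an in-memory container. The table and column names below are invented for illustration:

```csharp
using System;
using System.Data;

class DataSetDemo {
    // Builds a disconnected, in-memory DataSet; no database connection
    // is involved at any point.
    public static DataSet BuildSigns() {
        DataSet ds = new DataSet("Signs");
        DataTable t = ds.Tables.Add("Word");        // invented table name
        t.Columns.Add("Text", typeof(string));
        t.Columns.Add("ImagePath", typeof(string));
        t.Rows.Add("hello", @"signs\hello.gif");    // invented sample rows
        t.Rows.Add("thanks", @"signs\thanks.gif");
        return ds;
    }

    static void Main() {
        DataSet ds = BuildSigns();
        Console.WriteLine(ds.Tables["Word"].Rows.Count);  // number of rows
    }
}
```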

Data Provider
The Data Provider is responsible for providing and maintaining the connection to the
database. A Data Provider is a set of related components that work together to provide data in
an efficient, performance-driven manner. The .NET Framework currently comes with two
Data Providers: the SQL Data Provider, which is designed to work only with Microsoft's SQL
Server 7.0 or later, and the OleDb Data Provider, which allows us to connect to other types of
databases, such as Access and Oracle. Each Data Provider consists of the following
component classes:
The Connection object, which provides a connection to the database.
The Command object, which is used to execute a command.
The Data Reader object, which provides a forward-only, read-only, connected record
set.
The Data Adapter object, which populates a disconnected DataSet with data and
performs updates.

Data access with ADO.NET can be summarized as follows:
A connection object establishes the connection for the application with the database.
The command object provides direct execution of the command to the database. If the
command returns more than a single value, the command object returns a Data Reader to
provide the data. Alternatively, the Data Adapter can be used to fill the Dataset object. The
database can be updated using the command object or the Data Adapter.
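Putting the pieces together, the flow just described might look like the following sketch. The connection string, table, and column names are placeholders, and the code assumes the System.Data.SqlClient provider and a reachable SQL Server; it is illustrative rather than a definitive implementation.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class DataAccessSketch {
    static void Main() {
        // Placeholder connection string; adjust for a real server.
        string connStr = "Data Source=.;Initial Catalog=SignsDb;Integrated Security=True";
        using (SqlConnection conn = new SqlConnection(connStr)) {
            conn.Open();

            // Connected access: Command + DataReader (forward-only, read-only).
            SqlCommand cmd = new SqlCommand("SELECT Word FROM Signs", conn);
            using (SqlDataReader reader = cmd.ExecuteReader()) {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }

            // Disconnected access: DataAdapter fills a DataSet.
            SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Signs", conn);
            DataSet ds = new DataSet();
            adapter.Fill(ds, "Signs");
            // ...modify ds in memory, then write the changes back with
            // adapter.Update(ds, "Signs");  (requires insert/update/delete commands)
        }
    }
}
```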

Component classes that make up the Data Providers

The Connection Object
The Connection object creates the connection to the database. Microsoft Visual Studio
.NET provides two types of Connection classes: the SQL Connection object, which is
designed specifically to connect to Microsoft SQL Server 7.0 or later, and the OleDb
Connection object, which can provide connections to a wide range of database types like
Microsoft Access and Oracle. The Connection object contains all of the information required
to open a connection to the database.

The Command Object
The Command object is represented by two corresponding classes: SQL Command and
OleDb Command. Command objects are used to execute commands to a database across a
data connection. The Command objects can be used to execute stored procedures on the
database, SQL commands, or return complete tables directly. Command objects provide three
methods that are used to execute commands on the database:
ExecuteNonQuery: Executes commands that have no return values.
ExecuteScalar: Returns a single value from a database query.
ExecuteReader: Returns a result set by way of a Data Reader object.






The Data Reader Object
The Data Reader object provides a forward-only, read-only, connected stream of records
from a database. Unlike other components of the Data Provider, Data Reader objects
cannot be directly instantiated. Rather, a Data Reader is returned as the result of the
Command object's ExecuteReader method. The SqlCommand.ExecuteReader method
returns a SqlDataReader object, and the OleDbCommand.ExecuteReader method returns
an OleDbDataReader object. The Data Reader can provide rows of data directly to
application logic when you do not need to keep the data cached in memory. Because only one
row is in memory at a time, the Data Reader provides the lowest overhead in terms of system
performance but requires the exclusive use of an open Connection object for the lifetime of
the Data Reader.

The Data Adapter Object
The Data Adapter is the class at the core of ADO.NET's disconnected data access. It
is essentially the middleman facilitating all communication between the database and a
DataSet. The Data Adapter is used to fill a DataTable or DataSet with data from the
database through its Fill method. After the memory-resident data has been manipulated, the Data
Adapter can commit the changes to the database by calling the Update method.
The Data Adapter provides four properties that represent database commands:
SelectCommand
InsertCommand
DeleteCommand
UpdateCommand
When the Update method is called, changes in the DataSet are copied back to the database
and the appropriate InsertCommand, DeleteCommand, or UpdateCommand is executed.







About SQL Server

SQL Server Full Text Search: Language Features
SQL Full-text Search is an optional component of SQL Server 7 and later, which
allows fast and efficient querying when you have large amounts of unstructured data. This is
the first of a two-part article that explores the full-text language features that ship with SQL
Server versions 7, 2000, and 2005, with particular focus on the new language features in SQL
2005.
Here, in part 1, we examine the index time language options: how words or tokens are broken
out and stored in a full text index. In part 2, we will switch focus to the query time language
options.

SQL FTS architecture
SQL FTS builds a full-text index through a process called population, which fills the
index with words and the locations in which they occur in your tables and rows. The full text
indexes are stored in catalogs. You can define multiple catalogs per database, but a catalog
cannot span databases; it is database specific. Similarly, a table can be full-text indexed in only a
single catalog; to put it another way, a table's full-text index cannot span catalogs.
However, in SQL 2005 you can full-text index views which may reside in different catalogs
than the underlying base tables. This provides performance benefits and allows partitioning
of tables.
There is a useful depiction of the SQL FTS architecture in BOL. I won't repeat it here,
but I do encourage you to familiarize yourself with it, and how the various FTS components
interact. Very briefly, during the population process the indexing engine (MS Search in SQL
7 & 2000 and MSFTESQL in SQL 2005) connects to SQL Server using an OLE-DB provider
(PKM-Publishing and Knowledge Management) and extracts the textual content from your
table on a row by row basis. MS Search uses the services of COM components called iFilters
to extract a text stream from the columns you are indexing.




SQL FTS Overview
In essence, in order to perform full text searching on a table you need to:
1. Ensure that the table has a unique, not null column (e.g. primary key)
2. Create a full text catalog in which to store full text indexes for a given table
3. Create a full text index on the text column of interest.
It is possible to build full-text indexes on textual data stored in CHAR, NCHAR,
VARCHAR, NVARCHAR, TEXT, and NTEXT columns, as well as on IMAGE (SQL 200x),
VARBINARY(MAX) (SQL 2005), and XML (SQL 2005) data type columns that contain
formatted binary data.
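The three setup steps listed above might look as follows in T-SQL. The table, catalog, and constraint names are invented for illustration, and the syntax shown is the SQL 2005 form:

```sql
-- 1. The table needs a unique, non-null key column (here, a named primary key).
CREATE TABLE Documents (
    DocId   INT NOT NULL,
    Content NVARCHAR(MAX),
    CONSTRAINT PK_Documents PRIMARY KEY (DocId)
);

-- 2. Create a full-text catalog in which to store the full-text index.
CREATE FULLTEXT CATALOG DocCatalog;

-- 3. Create the full-text index on the text column of interest.
CREATE FULLTEXT INDEX ON Documents (Content)
    KEY INDEX PK_Documents     -- the unique key index from step 1
    ON DocCatalog;
```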

iFilters
The iFilter used depends on the data type of the column in which the data is stored and on
how you have configured your full-text index:
For columns of the CHAR, NCHAR, VARCHAR, NVARCHAR, TEXT, and
NTEXT data types the indexing engine applies the text iFilter. You can't override this
iFilter.
For the columns of the XML data type the indexing engine applies the XML iFilter.
You can't override the use of this iFilter.
For columns of the IMAGE, and VARBINARY data type, the indexing engine applies the
iFilter that corresponds to the document extension this document would have if stored in the
file system (i.e. for a Word document, this extension would be doc, for an Excel Spreadsheet
this would be xls).
Indexing Text
The sophisticated language features of full-text search then allow you to perform a
range of advanced searches on your textual data, using the CONTAINS and FREETEXT T-
SQL predicates (as well as the CONTAINSTABLE, and FREETEXTTABLE functions),
returning a list of rows that contain a word or a phrase in the columns that you are searching
on. You can perform:
Simple searches for specific words or phrases.
Thesaurus searches for synonymous forms of a word: a search on IE might return hits
to Internet Explorer and IE (thesaurus-based expansion search); a search on Bombay
might also return hits to Mumbai (thesaurus-based replacement search).
Searches that will return all the different linguistic forms of a word (called generations):
a search on bank would return hits to banking, banked, banks, banks', etc.
(all declensions and/or conjugations of the search term bank).
Accent-insensitive searches: a search on café would return hits to cafe and café.
In short, SQL FTS provides a very powerful text-searching mechanism. It is many
orders of magnitude faster than a LIKE operator, especially for larger tables. This is because
the LIKE operator does a byte-by-byte search through the contents of a row looking for a
match to the search phrase, whereas SQL FTS references the full-text index to instantly
return a list of matches. SQL FTS also supports a large number of different languages and
language characters. As well as accents, it seamlessly handles compound words (in
German and Chinese) and the compound characters which occur in Chinese.
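The search types described above translate into T-SQL as sketched below; the table and column names are invented, and the queries assume a full-text index already exists on the Content column:

```sql
-- Simple phrase search with CONTAINS:
SELECT DocId FROM Documents
WHERE CONTAINS(Content, '"sign language"');

-- Natural-language search with FREETEXT:
SELECT DocId FROM Documents
WHERE FREETEXT(Content, 'translate speech to signs');

-- Inflectional (generation) search: matches bank, banks, banking, banked...
SELECT DocId FROM Documents
WHERE CONTAINS(Content, 'FORMSOF(INFLECTIONAL, bank)');

-- Ranked results via the CONTAINSTABLE function:
SELECT d.DocId, k.RANK
FROM CONTAINSTABLE(Documents, Content, 'bank') AS k
JOIN Documents AS d ON d.DocId = k.[KEY]
ORDER BY k.RANK DESC;
```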

Dataflow Diagram
Data flow diagramming is one of the best ways of documenting the entire functionality of a
system. A system that takes some data flows in, performs some processing, and produces
some data flows out can be documented and represented effectively by means of data flow
diagrams. Data flow diagrams are a diagrammatic representation of the system's inputs,
processes, and outputs. Once any system is represented using a data flow diagram, we can
identify the following things easily:
Various entities interacting with the system are identified
Flow of data from one entity to another is identified
The various processes involved in between the interaction of two or more entities in
the system are clearly pointed out.
The various data stores, which hold the data in between the processes, are clearly
identified.
Data flow diagrams are an excellent mechanism for communicating with customers
during requirements analysis. They are also a widely used representation of external and top-
level internal design specifications. In the latter situation, they use naming conventions and the
names of system components such as subsystems, files, and data links. Some data flow diagram
charting forms:



External source or receiver

A source or sink is a person or part of an organization which enters or receives
information from the system, but is considered to be outside the context of the data flow model.

Transform process

A process represents transformation where incoming data flows are changed into
outgoing data flows.

Data store

A data store is a repository of data that is to be stored for use by one or more processes;
it may be as simple as a buffer or a queue, or as sophisticated as a relational database. Data
stores should have clear names. If a process merely uses the contents of a store and does not
alter it, the arrowhead goes only from the store to the process. If a process alters the details in
the store, then a double-headed arrow is used.

Data flow

A data flow is a route which enables packets of data to travel from one point to
another. Data flows are drawn as arrows, with the arrowhead pointing in the direction of the flow.

A level 0 DFD, also called a context-level DFD, represents the entire software element as a
single bubble, with input and output indicated by incoming and outgoing arrows respectively.
Additional processes and information flow paths are represented in the next level, i.e. the Level 1
DFD. Any process which is complex in Level 1 will be further decomposed into sub-functions
in the next level, i.e. Level 2. A DFD is a means of representing a system at any level
of detail with a graphic network of symbols showing data flows, data stores, data processes,
and sources or destinations.
The DFD is designed to aid communication. DFD shows the minimum contents of
data stores. In order to show what happens within a given process, then the detailed explosion
of that process is shown. The DFD methodology is quite effective, especially when the
required design is unclear and the user and the analyst need a notational language for
communication.

Context Diagram:

The top-level diagram is often called a context diagram. It contains a single process,
but it plays a very important role in studying the current system. The context diagram defines
the system that will be studied in the sense that it determines the boundaries. Anything that is
not inside the process identified in the context diagram will not be part of the system study.
It represents the entire software element as a single bubble with input and output data
indicated by incoming and outgoing arrows respectively.

Types of Dataflow Diagrams

Physical DFD
Structured analysis states that the current system should first be understood correctly.
The physical DFD is the model of the current system and is used to ensure that the current
system has been clearly understood. Physical DFDs show the actual devices, departments,
people, etc., involved in the current system.
Logical DFD
Logical DFDs are the model of the proposed system. They should clearly show the
requirements on which the new system is to be built. Later, during the design activity, this is
taken as the basis for drawing the system's structure charts.

Data flow Diagrams

LEVEL 0 DFD: Audio visual process for deaf mutes

[Diagram: the ADMINISTRATOR and USER external entities exchange data with a single AUDIO VISUAL PROCESS bubble.]
LEVEL 1 DFD: User

[Diagram: the USER logs in with username and password, and the system verifies the login. If valid, the user can search image details, speech details, and profile details against the IMAGES, SPEECH, and PROFILE stores; if not valid, access is refused.]
LEVEL 1 DFD: Administrator

[Diagram: the ADMIN logs in and is validated; the administrator can then check and validate user details, search words and visuals, check speech, and view user profile details.]
Input Design
Input design is the process of converting user-oriented input into a computer-based
format. The data is fed into the system through simple interactive forms. The forms have been
supplied with messages so that the user can enter data without facing any difficulty. The data is
validated wherever required in the project. This ensures that only correct data is
incorporated into the system.
Input design is one of the most expensive phases of the operation of a computerized
system and is often a major problem area of a system. A large number of problems with a
system can usually be traced back to faulty input design and methods. The following are the
considerations given to end-users for input design:
The screens should be user-friendly and easy to operate.
Proper validation of inputs to be provided.
The screens should be clear and enough information should be provided to guide the
user to enter correct data.
List of valid values for the field should be provided wherever possible.
The design decisions for handling input specify how data is accepted for computer
processing. The design of inputs also includes specifying the means by which end-users and
system operators direct the system in the actions to take. The goal of input design is
to make data entry easier, logical and error free. Errors in the input data are controlled by
the input design. Complex names, figures, etc. are avoided to make it user friendly. Security is
provided in the necessary areas.
In input design, the user-oriented inputs are converted into a computer-recognizable
format. The collection of input data is the most expensive part of the system in terms of the
equipment used, the time taken, and the number of clients involved. In the input system, data is
accepted and can be readily used for processing, or it can be stored in a database for further use.
The objectives of input design are as follows: -
Controlling the input
Avoiding errors in the data
Keeping the process simple
Avoiding extra steps


The inputs in the system are of 3 types. They are,
External: - which are the prime inputs for the system and which come from various
users.
Internal: - which are basically the user's communications with the system.
Interactive: - which are inputs entered during a dialog with the computer.
In this project, the external inputs are the username and password. The internal inputs are the
data obtained from the tables. The interactive input is the result of the login process and
entering the voice chat/conference process.

Output Design
Output design generally refers to the results and information that are generated by the
system. For many end-users, output is the main reason for developing the system and the
basis on which they evaluate the usefulness of application.
The objective of a system finds its shape in terms of the output. The analysis of the
objectives of a system leads to the determination of the outputs. Outputs of a system can take
various forms; the most common are reports, screen displays, printed forms, graphical drawings,
etc. The outputs also vary in terms of their contents, frequency, timing and format. The users of
the output, its purpose, and the sequence of details to be printed are all considered. The output
from a system is the justification for its existence. If the outputs are inadequate in any way, the
system itself is inadequate.
The output design is an ongoing activity almost from the beginning of the project, and
follows the principle of form design. Computer output is the most important and direct source
of information to the user. Efficient and well defined output design improves the relationship
of the system and the user, thus facilitating decision making.
The types of outputs used in the system are,
Internal: - whose destination is within the organization and which form the user's main
interface with the computer.
Interactive: - which involve the user in communicating directly with the computer.
External: - whose destination is outside the organization and which require special
attention because they project the image of the organization.
In this project the internal output is the login result. Interactive output is the scheduled
report. The external output is the conference reply.



File Design
Data formats and record formats are the important concepts of file design. File design
includes the selection of the data format for each data field, the selection of the record format,
the selection of the access method, and the selection of the file organization.
A file is a collection of logically related records. The main objective is to improve the
effective use of auxiliary storage and to contribute to the overall efficiency of the computer
program component of the system. The auxiliary storage medium must provide efficient access
to data and minimize the need for computer programs to change data formats.

Code Design
The goal of the coding or programming phase is to translate the design of the system
produced during the design phase into code in a given programming language, which can be
executed by a computer and which performs the computation specified by the design. For a
given design, the aim is to implement the design in the best possible manner.
The coding phase affects both testing and maintenance profoundly. There are many
different criteria for judging a program, including readability, size of the program, execution
time, and required memory. The main objectives of the coding activity are: minimize the
effort required to complete the program, minimize the number of statements, minimize the
memory required, maximize the program clarity, and maximize the output clarity. Coding
should be done in such a way that it is simple, easy to test, and easy to understand and
modify.

Coding style And Names
Some styles that were followed in coding, and which resulted in simple,
readable code, are as follows: variable names closely relate to the entities they represent, and
module names reflect their activity.

Control Structures
Single-entry, single-exit constructs were used in the system.

Information hiding
Information hiding is used wherever possible. Only the access functions for the data
structures are made visible, while the data structures themselves are hidden behind these functions.

Module Interface
Modules with complex interfaces are broken into multiple modules with simpler
interfaces.
Program Layout
Proper indentation, blank spaces and parentheses are used to enhance the readability
of the program.
Robustness
A program is robust if it behaves in a planned way even under exceptional conditions. A
program should check the validity of its inputs, where possible, and should check for possible
overflow of its data structures. The proposed system handles such situations: the program
won't crash or core dump; it produces a meaningful message and exits gracefully.

Database Design
A database is an organized mechanism that has the capability of storing information
through which a user can retrieve stored information in an effective and efficient manner. The
data is the purpose of any database and must be protected.
The database design is a two-level process. In the first step, user requirements are
gathered together and a database is designed which will meet these requirements as cleanly as
possible. This step is called Information Level Design, and it is carried out independently of any
individual DBMS. In the second step, this information-level design is transformed into a
design for the specific DBMS that will be used to implement the system in question. This
step is called Physical Level Design, and it is concerned with the characteristics of the specific
DBMS that will be used. Database design runs parallel with system design. The organization of
the data in the database aims to achieve the following two major objectives:
Data Integrity
Data Independence
The databases are implemented using a DBMS package. Each particular DBMS has
unique characteristics and general techniques for database design. When we store data in
SQL Server we store data in tables. Tables in turn are stored in databases.




Normalization
In real life, data exists as a collection of related items. Data structuring is refined through
a process called normalization. The basic objective of normalization is to reduce data
redundancy, which means that each piece of information is stored only once. There are several
normal forms, and they are as follows:

First normal form
A relation is in first normal form if and only if all the attribute values are atomic.
In the first normal form:
All the key attributes are defined.
There are no repeating groups in the table. In other words, each row/column
intersection can contain one and only one value.
All the attributes are dependent on the primary key.

Second normal form
To be in second normal form, a table must be in first normal form and no attribute of
the table should be functionally dependent on any part of a candidate key.
A table is in second normal form if:
It is in 1NF.
It includes no partial dependencies; that is, no attribute is dependent on only a
portion of the primary key.
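As an illustration (table and column names are invented), suppose an order-line table has the composite key (OrderId, ProductId) but also stores ProductName, which depends on ProductId alone; that is a partial dependency. Second normal form removes it by splitting the table:

```sql
-- Not in 2NF: ProductName depends on only part of the key (ProductId).
-- CREATE TABLE OrderLine (
--     OrderId INT, ProductId INT, ProductName VARCHAR(50), Qty INT,
--     PRIMARY KEY (OrderId, ProductId)
-- );

-- In 2NF: the partially dependent attribute moves to its own table.
CREATE TABLE Product (
    ProductId   INT PRIMARY KEY,
    ProductName VARCHAR(50)
);

CREATE TABLE OrderLine (
    OrderId   INT,
    ProductId INT REFERENCES Product(ProductId),
    Qty       INT,
    PRIMARY KEY (OrderId, ProductId)
);
```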

Administrator Table
This table is maintained to hold the details of the Administrator, whom the system will allow
to log in and configure the port.







Table Name: Serverlogin
Description: It stores the user name and password

Field Name Data Type Description
User Name Text User Name of Admin
Password Text Password of Admin
Ip-Address Text Ip address of system
Port Text Port of system
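A possible T-SQL definition for this table is sketched below. The column sizes are assumptions, and the report's "Text" data type is rendered here as NVARCHAR:

```sql
CREATE TABLE Serverlogin (
    UserName  NVARCHAR(50) NOT NULL,   -- user name of the admin
    Password  NVARCHAR(50) NOT NULL,   -- password of the admin
    IpAddress NVARCHAR(15) NOT NULL,   -- IP address of the system
    Port      NVARCHAR(10) NOT NULL    -- port of the system
);
```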



















V. SYSTEM TESTING AND IMPLEMENTATION

System testing is the stage of implementation which is aimed at ensuring that the
system works accurately and efficiently before live operation commences. For any software
that is newly developed, primary importance is given to testing the system. It is the last
opportunity for the developer to find and correct errors before the system is handed over to the customers.
Testing is the process by which a developer will generate a set of test data which
gives the maximum probability of finding all types of errors that can occur in the software. The
candidate system is subjected to a variety of tests: online response, volume, stress, recovery &
security, and usability tests. A series of tests is performed on the proposed system before
the system is ready for user acceptance testing.
Testing is the process of exercising or evaluating a system, by manual or automatic means, to
verify that it satisfies the specified requirements or to identify the differences between
expected and actual results. The testing activities are aimed at convincing the customer,
through demonstration and actual use, that the software is a solution to the original problem
and that both the product and the process that created it are of high quality. Testing is also used
to find and eliminate any residual errors from previous stages and to establish the operational
reliability of the system.

Preparation of Test Data

Software testing is a crucial element of software quality assurance and represents the
ultimate review of specification, design and coding. Testing represents an interesting
anomaly for the software. During earlier definition and development phases, it was attempted
to build software from abstract concepts to tangible implementation. The testing responsible
for ensure that the product that has built performs the way that the detailed design
documentation specifies.

Goals and objectives
The main purpose of testing an information system is to find errors and correct them. The scope of system testing should include both manual and computerized operations. System testing is a comprehensive evaluation of the programs, manual procedures, computer operations and controls.
System testing is the process of checking whether the developed system works according to the objectives and requirements. All testing is to be conducted in accordance with the test conditions specified earlier. This ensures that the test coverage meets the requirements and that testing is done in a systematic manner.

Testing Objectives:
Testing is the process of executing a program with the intent of finding as many errors as possible. The main objective, therefore, is to design tests that systematically uncover different classes of errors using minimum time and effort. Successful testing uncovers errors in the software. It also shows that the software functions work according to specifications. In addition, the data collected during testing provides an indication of software reliability and software quality.

Statement of scope
The strategy for system testing integrates system test cases and design techniques into a well-planned series of steps that result in the successful construction of software. The strategy must incorporate test planning, test case design, test execution, and the collection and evaluation of the resulting data. A strategy for software testing must accommodate low-level tests that verify that each small code segment has been correctly implemented, as well as high-level tests that validate major system functions against user requirements.
Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. A series of tests is performed on the proposed system before it is ready for acceptance testing.


Major Constraints
All tests should be traceable to the customer requirements. According to the customer,
the most severe defect is that which causes the program to fail to meet its requirements. Tests
should be planned long before the actual testing begins. All tests should be planned and
designed before any code is generated.
Testing should begin in the small and progress towards testing in the large. The
first tests focus on individual components. As testing progresses, focus shifts to integrated
clusters of components and then finally to the entire system.
Exhaustive testing is not possible. The number of path combinations in even a small program is very large, so it is not possible to test all of these paths. It is, however, possible to test the program logic and ensure that all conditions have been met.
To be most effective, testing should be conducted by an independent third party. The software engineer who created the program is not the best person to test it. So, in order to find the maximum number of errors in the software, an independent third party (who had no hand in developing the software) is preferred.
There are several rules that can serve as testing objectives:
A good test is not redundant. Testing time and the resources are limited. So, a test that
has the same purpose as another test need not be conducted. Every test should have a
different purpose.
A good test should be best of breed. There can exist a group of tests having the same
intention. In such cases, only a subset of these tests is used. Thus, the test that has the
highest chance of uncovering a whole class of errors should be used.
A good test should neither be too simple nor be too complex. It is possible to combine
a series of tests into one test. But this can lead to masking certain errors. Hence all the
tests should be executed separately.
Testing is vital to the success of the system. A system is generally tested in a hierarchical fashion, starting at the bottom and working up. First each program is tested; next a series of modules is tested; then each individual program with all its modules; and finally the entire system, consisting of a series of programs, is tested. In this way, problems at the module level can be corrected before programs are tested, and problems at the program level can be corrected before the entire system is used. A series of tests is performed before the system is ready for user acceptance testing.

Testing Methods
Testing is the process of finding bugs in a program. It helps to improve the quality of
the software. It has to be done thoroughly and with the help of specialist testers. System
testing is a process of checking whether the developed system is working according to the
original objectives and requirements. The system should be tested experimentally with test
data so as to ensure that the system works according to the required specification.
Testing principles are:
Tests are traceable to customer requirements.
80% of errors will likely be traceable to 20% of program modules.
Testing should begin in the small and progress towards testing in the large.
There are many approaches to software testing, but effective testing of complex products is essentially a process of investigation, not merely a matter of creating and following rote procedure. One definition of testing is "the process of questioning a product in order to evaluate it", where the "questions" are things the tester tries to do with the product, and the product answers with its behavior in reaction to the probing of the tester.
The code testing strategy checks the correctness of every statement in the program. To follow this strategy, there should be test cases that result in the execution of every instruction in the program or module; that is, every path in the program is tested. The test cases should guarantee that independent paths within a module are executed at least once, exercise all logical decisions on their true and false sides, and execute all loops at their boundaries and within their operational bounds. This testing strategy, on the face of it, sounds exhaustive: if every statement in the program is checked for its validity, there doesn't seem to be much scope for error.
The testing steps are:
Unit Testing
Integration Testing
Validation Testing
System Testing
Output Testing
Unit Testing
It is the process of taking each program module and running it in isolation from the rest of the modules, using prepared inputs and comparing the actual results with the results predicted by the specifications and design of the module. This enables the tester to detect errors in coding and logic that are contained within that module alone.
The software units in a system are modules and routines that are assembled and integrated to perform a specific function. Unit testing focuses first on the modules, independently of one another, to locate errors. This makes it possible to detect errors in coding and logic that are contained within each module. This testing includes entering data and ascertaining whether the value matches the type and size supported by Java. The various controls are tested to ensure that each performs its action as required. This is also known as module testing, and it is carried out during the programming stage.
Project aspect: The front-end design consists of various forms, which are tested for data acceptance. Similarly, the back end, that is, the database, was tested for successful acceptance and retrieval of data. Testing first checks the design module, to confirm that all the graphical animated images work properly. It then checks the dictionary module, to confirm that all the phrases and words are available. The speech module is checked to confirm that the text data is pronounced correctly. Testing also confirms that still images and the turn-off option work properly.
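The kind of module-level check described above can be sketched as a small unit test. The function and file names below are illustrative assumptions, not taken from the project's actual code:

```python
# Hypothetical dictionary-module function: maps an English phrase to the
# file name of its animated ISL sign (names are illustrative only).
SIGN_DICTIONARY = {"hello": "hello.gif", "thank you": "thankyou.gif"}

def lookup_sign(phrase):
    """Return the animation file for a phrase, or None if it is unavailable."""
    return SIGN_DICTIONARY.get(phrase.strip().lower())

# Unit tests run against this module in isolation, with prepared inputs
# compared to the results the specification predicts:
assert lookup_sign("Hello") == "hello.gif"           # case-insensitive match
assert lookup_sign(" thank you ") == "thankyou.gif"  # surrounding spaces tolerated
assert lookup_sign("xyzzy") is None                  # phrase not in dictionary
```

Because the module is exercised alone, any failure here points to coding or logic errors inside the dictionary module itself.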

Integration Testing
It is the systematic technique of constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing. It also tests for discrepancies between the system and its original objective, current specifications and system documentation. The primary concern is the compatibility of individual modules. Data can be lost across an interface, one module can have an adverse effect on another, and sub-functions, when combined, may not produce the desired major functions. Integration testing is systematic testing to discover errors associated with the interfaces. The objective is to take unit-tested modules and build a program structure. All the modules are combined and tested as a whole.
We followed bottom-up integration testing. Bottom-up integration testing, as its name implies, begins construction and testing with the atomic modules. Because components are integrated from the bottom up, the processing required for components subordinate to a given level is always available, and the need for stubs is eliminated. In this project, bottom-up integration starts from the design module, which is integrated with the sign language module; that combination is then integrated with the dictionary formation module, and finally with the speech module, with testing at each step.
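As a rough illustration of bottom-up integration, the sketch below composes hypothetical low-level modules under a top-level driver. The module interfaces are assumptions, since the report does not show them:

```python
# Bottom-up integration sketch (illustrative module names and interfaces).

def dictionary_lookup(word):
    # Lowest-level module, unit-tested first in isolation.
    return {"hello": "HELLO-sign"}.get(word)

def sign_renderer(sign):
    # Next level up: consumes the dictionary module's output.
    return f"<animation:{sign}>" if sign else "<fingerspell>"

def translate(word):
    # Top-level driver combining the already-tested lower modules,
    # so no stubs are needed for the subordinate processing.
    return sign_renderer(dictionary_lookup(word))

# Driver tests exercising the integrated path from the bottom up:
assert translate("hello") == "<animation:HELLO-sign>"
assert translate("qwerty") == "<fingerspell>"
```

A failure at this level points to the interface between modules rather than to either module alone, which is exactly what integration testing is meant to expose.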
Validation Testing
Validation testing can be defined in many ways, but a simple definition is that validation succeeds when the software functions in a manner that can reasonably be expected by the customer. Software validation is achieved through a series of black box tests that demonstrate conformity with requirements. After validation testing has been conducted, one of two conditions exists:
The function or performance characteristics conform to specifications and are accepted.
A deviation from specification is uncovered and a deficiency list is created.
Deviations or errors discovered at this step of the project are corrected prior to completion of the project with the help of the user, by negotiating to establish a method for resolving the deficiencies. The proposed system under consideration has been tested using validation testing and found to be working satisfactorily.
System Testing
System testing of software or hardware is testing conducted on a complete, integrated
system to evaluate the system's compliance with its specified requirements. System testing
falls within the scope of black box testing, and as such, should require no knowledge of the
inner design of the code or logic.

Black Box Testing
Black box testing takes an external perspective of the test object to derive test cases.
These tests can be functional or non-functional, though usually functional. The test designer
selects valid and invalid inputs and determines the correct output. There is no knowledge of
the test object's internal structure.
This method of test design is applicable to all levels of software testing. The higher the level, and hence the bigger and more complex the box, the more one is forced to use black box testing to simplify. Black box testing, also called behavioral testing, focuses on the functional requirements of the software. That is, black box testing enables the software engineer to derive sets of input conditions that will fully exercise all functional requirements for a program. Black box testing attempts to find errors in the following categories:
Incorrect or missing functions
Interface errors
Errors in data structure or external database access
Behavior or performance errors
Initialization and termination errors.
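A minimal black-box sketch: test cases are chosen purely from the input specification (valid and invalid classes and their boundaries), with no reference to the code's internals. The port-validation function is a hypothetical example, loosely echoing the Port field in the admin table:

```python
def is_valid_port(text):
    """Accept a port field only if it is an integer in 1..65535."""
    try:
        port = int(text)
    except ValueError:
        return False
    return 1 <= port <= 65535

# Black-box cases derived only from the specification's input classes
# and boundaries, never from the function body:
cases = {
    "80": True, "1": True, "65535": True,   # valid class and its boundaries
    "0": False, "65536": False,             # just outside each boundary
    "abc": False, "": False,                # invalid (non-numeric) class
}
for given, expected in cases.items():
    assert is_valid_port(given) == expected
```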


White Box Testing

White box testing (clear box testing, glass box testing, transparent box testing, and
translucent box testing or structural testing) uses an internal perspective of the system to
design test cases based on internal structure. It requires programming skills to identify all
paths through the software. The tester chooses test case inputs to exercise paths through the
code and determines the appropriate outputs.
While white box testing is applicable at the unit, integration and system levels of the
software testing process, it is typically applied to the unit. While it normally tests paths
within a unit, it can also test paths between units during integration, and between subsystems
during a system level test.
Typical white box test design techniques include:
Control flow testing
Data flow testing
Branch Testing
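Branch testing can be illustrated with a toy routine: inputs are chosen from the code's own structure so that every decision is exercised on both sides and the loop runs zero, one and many times. The routine itself is invented purely for illustration:

```python
def sum_non_negative(values):
    """Toy routine with one loop and one decision, used to show branch coverage."""
    total = 0
    for v in values:      # loop: exercised with zero, one and many items
        if v >= 0:        # decision: exercised on both true and false sides
            total += v
    return total

# White-box cases derived from the code's structure:
assert sum_non_negative([]) == 0           # loop body never entered
assert sum_non_negative([5]) == 5          # loop once, decision true
assert sum_non_negative([-3]) == 0         # loop once, decision false
assert sum_non_negative([2, -1, 3]) == 5   # loop many times, both branches
```

Together these four cases drive every branch and loop bound, which is the coverage goal the techniques above aim for.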

Output Testing
After performing validation testing, the next step is output testing of the proposed system, since no system can be useful if it does not produce the required output in the specified format.
The outputs generated or displayed by the system under consideration are tested by asking the users about the format they require; here the output is presented on screen. The on-screen format was found to be correct, as it was designed in the system design phase according to user needs. Hence output testing did not result in any correction to the system.
Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires significant
participation by the end user. It also ensures that the system meets the functional
requirements.

Implementation
Implementation is the stage where the theoretical design is turned into a working system. The implementation phase is used to test the developed package with sample data, correct the errors identified, and apprise the users of the various special facilities and features of the computerized system. It also involves user training, to minimize resistance to change and to give the new system a chance to prove its worth. The successful implementation of the new system depends upon the involvement of the users.

Implementation Methods

There are several methods for handling the implementation and the conversion from the old to the new computerized system. The most secure method of conversion from the old system is to run the old and new systems in parallel. In this approach, a person may operate the manual processing system while also starting to operate the new computerized system.
Another commonly used method is a direct cut-over from the existing manual system to the computerized system. The change may happen within a week or a day. This strategy requires careful planning.
A working version of the system can also be implemented in one part of the organization, with changes made as and when required, but this method is less preferred, as problems can affect the entire system. After the system is implemented, a review should be conducted to determine whether the system is meeting expectations and where improvements are needed.

Implementation Plan

The implementation plan includes a description of all activities that must occur to implement the new system and put it into operation. It defines the personnel responsible for the activities and prepares a time chart for implementing the system. The implementation plan should anticipate possible problems and must be able to deal with them. The usual problems may be missing documents, mismatched data formats between current and new files, errors in data translation, missing data, etc.

Documentation
Documentation involves collecting, organizing, and maintaining a complete record of the programs. The documentation presents the system to the department with maximum clarity. Each and every process is explained in detail. The various tables used by the system are provided with field details. The system uses various kinds of forms to produce well-structured screen formats; these forms are also documented. The output generated by the system constitutes another part of the documentation.
Documentation of the software provides the following:

Comments:
Comments are very useful in documenting a program. They are used to explain the logic of the program and should improve its quality and understandability. They should not be redundant, incorrect or incomplete.

System Manuals:
A good software system must contain standard system manuals. In these, the problem statement is clearly defined, along with descriptions, detailed flowcharts, and specimens of all input forms and printed outputs.

Operation Manual:
A good software package is supported with a good operation manual to ensure the
smooth running of the program.
The operation manual must contain the following information:
Setup and operational details of each program.
Loading and unloading procedures.
Starting, running, and terminating procedures.
List of error conditions with explanations.







VI. SYSTEM SECURITY
System security is a branch of technology known as information security as applied to computers and networks. The objectives of system security include the protection of information and property from theft, corruption, or natural disaster, while allowing the information and property to remain accessible and productive to its intended users. The term system security means the collective processes and mechanisms by which sensitive and valuable information and services are protected from publication, tampering or collapse by unauthorized activities, untrustworthy individuals and unplanned events. The technologies of system security are based on logic. As security is not necessarily the primary goal of most computer applications, designing a program with security in mind often imposes restrictions on that program's behavior. There are three categories of control in data security: physical security, database security and control measures. After system security risks have been evaluated, the next step is to select security measures. The measures are:
Identification: It is the scheme for identifying persons to the system, based on a password.
Access Control: Access to the computer facility is controlled and secured through encoded cards or similar devices.
Audit Controls: Auditability must be supported at all levels of management. Audit controls protect a system from external security breaches.
System Integrity: This line of defense safeguards the functioning of hardware, database, software, physical security and operating procedures.

Checks and Controls

When developing or acquiring software applications, it is important to ensure that the data being entered is properly checked. This section presents guidelines on how to check and control data entry.

Good Practices and Recommendations
The following types of checks and controls are important to have in the data entry
screens in all software applications:
Validate all fields that have ranges, such as dates or amounts.
Use lookup tables wherever possible, so that users do not enter country codes or currencies whichever way they wish.
Allow the user, under privilege control, to add a parameter that is not in a lookup table on the spot, without having to go to another screen.
Allow the user to search major tables such as citizens, projects, contractors, etc. This should be available during deletions, modifications, printing and other system functions.
Design screen layouts to be similar to the actual vouchers. This eases data entry and requires less training for the user.
Use clear color coding as per Windows standards: black for labels, white for enterable fields and gray for non-enterable fields or system responses.
Differentiate between information, error and warning messages through the proper use of buttons: Info (OK), Error (OK), Warning (Yes, No), Choices (Yes, No, Cancel).
Use clear and unambiguous messages.
Avoid cluttering the screen with a large number of fields, as it becomes difficult to visually scan the screen and validate the data. When a large number of fields is needed, it is best to use tabs or even multiple screens.
Do not allow the system to create or modify a record unless all data is validated. Many systems suffer from temporary entries that are never completed.
The above guidelines should be standardized across applications, to ensure that users become familiar with the look and feel of the applications and hence require less training.
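A few of these checks (range validation on amounts, format validation on dates, and a lookup table for codes) can be sketched as follows. The field names, limits and country codes are illustrative assumptions, not taken from any particular application:

```python
import datetime

COUNTRY_CODES = {"IN", "US", "GB"}   # lookup table instead of free-text entry

def validate_entry(amount, date_text, country):
    """Return a list of error messages; an empty list means the record may be saved."""
    errors = []
    if not (0 < amount <= 1_000_000):          # range check on amounts
        errors.append("amount out of range")
    try:
        datetime.date.fromisoformat(date_text)  # format check on dates
    except ValueError:
        errors.append("invalid date")
    if country not in COUNTRY_CODES:            # lookup-table check on codes
        errors.append("unknown country code")
    return errors

assert validate_entry(100, "2013-08-17", "IN") == []
assert "unknown country code" in validate_entry(100, "2013-08-17", "XX")
assert "invalid date" in validate_entry(100, "not-a-date", "IN")
```

Refusing to save a record until this list is empty is one way to avoid the "temporary entries that are never completed" problem mentioned above.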

Data Security
Data security is the practice of keeping data protected from corruption and unauthorized access. The focus of data security is to ensure privacy while protecting personal or corporate data. Data is the raw form of information, stored as columns and rows in our databases, network servers and personal computers. This may be a wide range of information, from personal files and intellectual property to market analytics and details intended to be kept top secret. Data could be anything of interest that can be read or otherwise interpreted in human form.
Encryption has become a critical security feature for thriving networks and active home users alike. This security mechanism uses mathematical schemes and algorithms to scramble data into unreadable text. It can only be decoded, or decrypted, by a party that possesses the associated key.
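The scramble-and-decode idea can be shown with a toy XOR cipher. This is only an illustration of symmetric keyed encryption, not a secure algorithm; real systems use vetted ciphers such as AES:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key. Because XOR is its own inverse,
    # applying the same key a second time recovers the original data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret = b"payroll report"
key = b"k3y"
scrambled = xor_cipher(secret, key)
assert scrambled != secret                    # unreadable without the key
assert xor_cipher(scrambled, key) == secret   # the same key decrypts
```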
Full-disk encryption (FDE) offers some of the best protection available. This technology enables you to encrypt every piece of data on a disk or hard disk drive. Full-disk encryption is even more powerful when hardware solutions are used in conjunction with software components, a combination often referred to as end-based or end-point full-disk encryption.
Authentication is another part of data security that we encounter in everyday computer usage. Just think about when you log into your email or blog account. That single sign-on process is a form of authentication that allows you to log into applications, files, folders and even an entire computer system. Once logged in, you have various given privileges until logging out. Some systems will cancel a session if your machine has been idle for a certain amount of time, requiring you to authenticate once again to re-enter. The single sign-on scheme is also implemented in strong user authentication systems; however, these require individuals to log in using multiple factors of authentication, which may include a password, a one-time password, a smart card or even a fingerprint.
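Password authentication of this kind is commonly implemented by storing a salted hash rather than the password itself. A minimal sketch using a standard key-derivation function (the iteration count and parameters here are illustrative):

```python
import hashlib
import hmac
import os

def register(password: str):
    # Derive a salted hash; store the salt and digest, never the password.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def authenticate(password: str, salt: bytes, digest: bytes) -> bool:
    # Re-derive the hash and compare in constant time to resist timing attacks.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = register("s3cret")
assert authenticate("s3cret", salt, digest)
assert not authenticate("wrong", salt, digest)
```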
Data security would not be complete without a solution for backing up your critical information. Though data may appear secure while confined to a machine, there is always a chance that it can be compromised. You could suddenly be hit with a malware infection in which a virus destroys all of your files. Someone could enter your computer and steal data through a security hole in the operating system. Perhaps it was an inside job that caused your business to lose those sensitive reports. If all else fails, a reliable backup solution will allow you to restore your data instead of starting completely from scratch.

User Security
User security uses security rules to determine what the application displays. It has two elements:

Authentication
Ensures that a valid user is logged in, based on an ID and password provided by the user. ColdFusion (or, if you use web server authentication, the web server) maintains the user ID information while the user is logged in.


Authorization
Ensures that the logged-in user is allowed to use a page or perform an operation.
Authorization is typically based on one or more roles (sometimes called groups) to which the
user belongs. For example, in an employee database, all users could be members of either the
employee role or the contractor role. They could also be members of roles that identify their
department, position in the corporate hierarchy, or job description. For example, someone could be a member of some or all of the following roles: Employees, Human Resources, Benefits, and Managers. You can also use the user ID for authorization.
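A role-based authorization check of the kind described can be sketched as a set intersection between the user's roles and the roles a page allows. The users and roles below are illustrative assumptions, not part of any particular framework:

```python
# Roles assigned to each logged-in user (hypothetical data).
USER_ROLES = {
    "alice": {"Employees", "Managers"},
    "bob": {"Contractors"},
}

def is_authorized(user: str, allowed_roles: set) -> bool:
    # Grant access if the user holds at least one of the roles the page allows.
    return bool(USER_ROLES.get(user, set()) & allowed_roles)

assert is_authorized("alice", {"Managers", "Human Resources"})
assert not is_authorized("bob", {"Human Resources"})
assert not is_authorized("unknown", {"Employees"})   # unauthenticated users get no roles
```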

Authenticating users
You can use either, or both, of the following forms of authentication to secure your
ColdFusion application:
Web server authentication, where the web server authenticates the user and does not
allow access to the website by users without valid login IDs
Application authentication, where the ColdFusion application authenticates the user
and does not allow access to the application by users without valid login IDs














VII. POST IMPLEMENTATION

The post implementation phase measures the system's performance against the predefined requirements. It involves evaluation, maintenance and enhancement of the system.
After the system is implemented, a review should be conducted to determine whether the system is meeting expectations and where improvements are needed. System quality, user confidence and operating statistics are assessed through techniques such as event logging, impact evaluation and attitude surveys.
The implementation plan should anticipate possible problems and must be able to deal with them. The usual problems may be missing documents, mismatched data formats between current and new files, errors in data translation, missing data, etc.
The reviews are conducted by the operating personnel as well as the software developers, in order to determine how well the system is working, how it has been accepted and whether adjustments are needed. The review analyzes the opinions of the users and identifies their attitude towards the new computerized system.

System Evaluation

The system evaluation considers the hardware and software as a unit. The hardware selection is based on performance categories. The evaluation phase ranks vendor proposals and determines the one best suited to the user's needs. It looks into items such as price, availability and technical support.
In the operation phase, the system's performance must be monitored, not only to determine whether or not it performs as planned, but also to determine whether it should be modified to meet changes in the information needs of the business.
In the evaluation phase, the first step adopted was to look at the criteria listed earlier and rank them in order of importance. Three sources of information are used in evaluating hardware and software: benchmark programs, the experience of other users, and product reference manuals.



Maintenance

Software maintenance is the modification of a software product after delivery to correct faults, to improve performance or other attributes, or to adapt the product to a modified environment. Maintenance covers a wide range of activities, including correcting coding and design errors, updating documentation and test data, and upgrading user support. Maintenance means restoring something to its original condition.
Maintenance can be classified as corrective, adaptive, perfective and preventive. Corrective maintenance means repairing processing or performance failures, or making changes because of previously uncorrected problems or false assumptions. Adaptive maintenance means changing the program function. Perfective maintenance means enhancing the performance or modifying the programs to respond to the user's additional or changing needs. Preventive maintenance concerns activities aimed at increasing the system's maintainability, such as updating documentation, adding comments, and improving the modular structure of the system.
The six software maintenance processes are:
The implementation process, which contains software preparation and transition activities, such as the conception and creation of the software maintenance plan, the preparation for handling problems identified during development, and follow-up on product configuration management.
The problem and modification analysis process, which is executed once the application has become the responsibility of the maintenance group. The maintenance programmer must analyze each request, confirm it and check its validity, investigate it and propose a solution, document the request and solution proposal, and, finally, obtain all the required authorization to apply the modifications.
The process of implementing the modification itself.
The process of accepting the modification, by checking it with the individual who submitted the request, in order to make sure the modification provides a solution.
The migration process (platform migration), which is exceptional and not part of daily maintenance tasks. If the software must be ported to another platform without any change in functionality, this process is used, and a maintenance project team is likely to be assigned to the task.
Finally, the last maintenance process, also an event that does not occur on a daily basis, is the retirement of a piece of software.

Activities of a Maintenance Procedure

Maintenance activities begin where conversion leaves off. Maintenance is handled with the same planning and control used in formal system projects. The maintenance staff receives a request for service from an authorized user, followed by a definition of the required modifications.
The source program and written procedures for the system are acquired from the programming library. Program changes are then tested and submitted to the user for approval. Once approved, the modified documentation is filed with the library and a project completion notice is sent to the user, signaling the termination of the project. Although software does not wear out like a piece of hardware, it ages and eventually fails to perform because of cumulative maintenance. A major problem with software maintenance is its labor-intensive nature and therefore the likelihood of errors.


















VIII. CONCLUSION
This project is intended to lend a hand to people with disabilities, such as hearing impaired people and blind students. The system displays animated graphical images along with the text phrase, which is mainly useful for hearing impaired people. It also contains a speech function which provides the correct pronunciation of words, and the user can choose the voice type by selecting a male or female voice. This software system is easily understandable and resolves many questions. It is mainly used as a teaching technique and works best in a limited environment. The software also has the capability of taking still images or displaying images, and has an option for choosing a presentation. The user can choose any option from the homepage itself, and can share information from one system to another. The system is user friendly and easy to install, and it saves time by avoiding searching for data through Google.

Findings
The developed system is found to be working accurately. It has been tested for effectiveness, flexibility, accuracy and user friendliness, and is found to run smoothly under the single-window system.
The programming techniques used in the design of the system provide scope for further expansion and the implementation of any changes that may occur in the future. The system has been tested with sample data covering all possible options for each function. Its performance is satisfactory, and the system is under implementation. The system is easily understandable and very useful for those who are hearing impaired or blind. The user can share information from one system to another, and the system also provides a webcam option with which the user can take pictures.

Limitation
The system is developed in a modular fashion, so changes and additions can be made easily. The main limitation is that the software works only in a limited environment. Hearing impaired students have more limited options for study compared with other students. The present system does not support networking; this feature can be added to the system.


Scope for the future Enhancement
The system is developed in a modular fashion, so changes and additions can be made easily. The present system does not support networking; this feature can be added to the system.

In future, recognition of sound can be included for blind students.
In future, this application can be made accessible online, so that all users can access it over the web.
More advanced images can be included for deaf students.




















IX. BIBLIOGRAPHY

Books and Articles
Joel Murach and Anne Boehm, Murach's C# 2012, Mike Murach & Associates, California: 2013.
Anne Boehm and Doug Lowe, Murach's C#.NET 4.0 Web Programming, Murach Publishers, California: 2012.
Larry L. Peterson and Bruce S. Davie, Computer Networks: A Systems Approach, 3rd edition, Prentice Hall Pvt. Ltd., New York: 2011.
Michael Halvorson, Microsoft Visual Basic .NET, Deluxe Learning Edition, Microsoft Press, New York: 2012.

Web Links

John Zukowski, The C#.net Reference, O'Reilly & Associates, Inc., April 1997.
<http://www.codeproject.com/script/Answers/List.aspx>. 17 August, 2013.
David Flanagan, C#.net in a Nutshell: A Desktop Quick Reference for .Net Programmers, Inc.: USA, May 1996.
<http://www.dreamincode.net/forums/topic-java-login-form>. 17 September, 2013.










X.ANNEXURE

Screenshots

Login Page



Home Page





























Speech Information











Presentation Option



