
Secure Backup Network

Main project 2012-2013

INTRODUCTION

GPTC Kunnamakulam

Computer Engineering


Most people on the Internet today have a flat-rate connection and a hard disk far larger than they will ever fill; unless you work with video applications, it is genuinely hard to fill a modern terabyte drive. This surplus makes a network-based backup system an attractive idea. Each participant splits the hard disk into two partitions: a working partition and one about twice as large, which holds backups of other people's data received over the network. All that is needed is a program that backs up your files to this free space on the network. Because files are stored under content-derived keys and encrypted accordingly, identical content is always stored at the same location. This saves a great deal of space by avoiding the endless duplication of common content such as operating systems and widely installed applications, and the space saved can hold the different, user-specific versions of files that are still evolving. Some additional forward error correction, called the save-set envelope, is needed to keep the probability of data loss below a defined value. Private files remain secure, because a content-derived hash key is stronger than any content-encryption password a user would choose; neighboring peers can recognize that a particular file is being backed up only if they already know that file, and this exposure can be reduced by onion routing at the entry point of network requests. Directories are stored under the same content-derived keys, so identical directories belonging to widely distributed software also end up in the same locations. The main dataset needed to start a recovery is stored locally on the computer and additionally under a KSK that points to a USK. This KSK must be derived from a long user password, so that the whole hard disk can be recovered at any time with nothing more than a new computer and knowledge of the master password.

1.1 Purpose of the Project

Disasters, both natural and human-caused, can threaten your precious files at any time: a fire, power surge, or leaking pipe could fry your system. Even without suffering a calamity, there are plenty of other threats to locally stored data: hard drive failure, accidental erasure, or a lost or stolen laptop could make you a victim of data loss.


By data, here, we mean things like your irreplaceable family photos, videos, and music as well as documents. Small businesses rely on the availability of their data to keep running, so data loss is arguably even more catastrophic for them than for consumers. In fact, a recent study found that many small businesses had suffered data loss: "The top causes of small business data loss included hardware/software failure (54 percent), accidental deletion (54 percent), computer viruses (33 percent) and theft (10 percent)." With the Secure Backup Network storing your files away from your premises at on-site/off-site server locations, your data stays intact and available even if your local disks are stolen or your premises suffer a disaster. With more and more emphasis on "cloud computing," it only makes sense for backup to take advantage of this trend in technology.

1.2 Existing System

This phase involves an initial investigation of the structure of the system currently in use, with the objective of identifying the problems and difficulties of the existing system. The major steps in this phase included defining the user requirements and studying the present system to verify the problem. The performance expected of the new system was also defined in this phase in order to meet the user requirements. The information gathered from various documents was analyzed and evaluated, and the findings were reviewed in order to establish specific objectives.

1.3 Proposed System

The project is a software system in which backups are made automatically over the network to a backup server at regular intervals, and system administration is done periodically so that data security is maintained. It is a software package that provides centrally managed, reliable backup facilities for a variety of workstations. The programs let you (or the system administrator) manage backup, recovery, and verification of computer data across a network of computers of different kinds. In technical terms, it is a network client/server backup program. The Backup System is relatively easy to use and efficient, while offering many advanced storage management features that make it easy to find and recover lost or damaged files.


Due to its modular design, the Backup System is scalable from small single-computer installations to systems consisting of hundreds of computers spread over a large network.

1.4 Features of the Proposed System

The system protects against every kind of data loss, including:
- user mistakes, for example deleting something you still need
- software malfunction
- hardware malfunction
- fire, water and storm damage
- theft, sabotage, break-down and robbery

1.5 Feasibility Study

All projects are feasible when given unlimited resources and infinite time, so it is both necessary and prudent to evaluate the feasibility of a project at the earliest possible time. The effort and resources spent in developing the system will be wasted if the end result does not offer a timely and satisfactory solution to its users. A feasibility study is a test of the proposed system with regard to its workability, its impact on the organization, its ability to meet user needs, and its effective use of resources. Thus, when a new application is proposed, it normally goes through a feasibility study before it is approved for development. Feasibility and risk analysis are related in many ways: if project risk is great, the possibility of producing quality software is reduced.

1.5.1 Economic Feasibility Study

The economic feasibility study is the most frequently used method for evaluating the effectiveness of a candidate system. More commonly known as cost/benefit analysis, the procedure is to determine the benefits and savings expected from a candidate system and compare them with its costs. This analysis determines how much it will cost to produce the proposed system. The system is economically feasible since it requires no initial setup cost: the organization already has the machines and supporting programs needed for the application to run, and no additional staffing is required.


1.5.2 Technical Feasibility Study

The technical feasibility study checks whether the proposed system is technically feasible. Technical feasibility centers on the existing computer system (hardware, software, and so on) and the extent to which it can support the proposed addition, which in turn involves the financial considerations of accommodating any technical enhancement. This system is technically feasible: all data are stored in files.

Input can be entered through dialog boxes, which are both interactive and user friendly, and hard copies can be obtained for future use by diverting documents to a printer. Windows serves as the platform for the new system.

1.5.3 Operational Feasibility Study

The operational feasibility study checks whether the system is operationally feasible. Using command buttons consistently throughout the application programs enhances operational feasibility, so maintenance and modification are easier.

1.5.4 Behavioral Feasibility Study

People are inherently resistant to change, and computers have been known to facilitate change. An estimate should be made of the reaction of the user staff towards the development of a computerized system, since computer installations have something to do with turnover, transfers and changes in job status. The introduction of a candidate system requires special effort to educate, sell and train the staff who will conduct the business.


Behavioral feasibility shows to what extent the users accept the system. Since computerizing the manual calculations and book-keeping pattern simplifies the work of the officers and saves time, the system is feasible.


MODULE DESCRIPTION


2.1 Network Backup Server

a. Client Account Authentication

This module manages and authenticates an account for each client system. After the account is registered, the client system's information is sent to the server for approval and authentication, and specific privileges are assigned to each registered client system.

b. Storage Device Management

This module allocates the storage space and location on the backup server for the backup files that have been transferred. It also manages the temporary storage devices used for secondary backup.

c. Data Replication

This module creates the duplicate copies of files needed to protect against data loss. The copies can be kept on the client system alongside the original data, transferred to the server, or even maintained on secondary storage devices.

d. FTP Server

This module uses the File Transfer Protocol (FTP) for the straightforward transfer of files from the server to the connected clients and vice versa, avoiding the complex procedures and transmission delays of other protocols.
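As a rough sketch of how the client side of this transfer could look, the snippet below uploads one backup archive with the Apache Commons Net FTPClient. The library choice, host name and credentials are assumptions made for illustration; the report does not name a specific FTP implementation.

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class BackupUploader {
    // Uploads one compressed backup archive to the backup server over FTP.
    // Host, port, user and password here are placeholders for this sketch.
    public static void upload(String localPath, String remoteName) throws IOException {
        FTPClient ftp = new FTPClient();
        ftp.connect("backup-server.local", 21);
        try {
            if (!ftp.login("backupuser", "secret")) {
                throw new IOException("FTP login failed");
            }
            ftp.setFileType(FTP.BINARY_FILE_TYPE);   // archives must not be mangled by ASCII mode
            ftp.enterLocalPassiveMode();             // passive mode usually works through firewalls
            try (InputStream in = new FileInputStream(localPath)) {
                if (!ftp.storeFile(remoteName, in)) {
                    throw new IOException("Upload failed: " + ftp.getReplyString());
                }
            }
            ftp.logout();
        } finally {
            ftp.disconnect();
        }
    }
}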

2.2 Client Backup Tool

a. Defining Backup Packs

This module manages the files to be backed up by grouping them under a pack name. Each backup pack is described by several attributes that give information about the time, schedule and storage location of its files.

b. Data Security Management (Encryption)

For the secure transmission of data, an encryption mechanism converts the data into an encrypted format that can be decrypted only by the specific client.
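A minimal sketch of how this encryption step could be written with the standard javax.crypto API is shown below. The AES/CBC mode, the idea of deriving the key from the client's password, and the class name are assumptions, not details taken from the report.

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.CipherOutputStream;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

public class BackupEncryptor {
    // Encrypts one backup file with AES/CBC before it leaves the client.
    // The SecretKey would be derived from the client's password (for example with PBKDF2).
    public static void encrypt(File plain, File encrypted, SecretKey key) throws Exception {
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        try (FileInputStream in = new FileInputStream(plain);
             FileOutputStream raw = new FileOutputStream(encrypted);
             CipherOutputStream out = new CipherOutputStream(raw, cipher)) {
            raw.write(iv);                        // the IV is stored in clear at the start of the file
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);             // everything after the IV is ciphertext
            }
        }
    }
}

Storing the IV alongside the ciphertext lets the restore module decrypt the file later with nothing more than the same password-derived key.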


c. Data Compression

Before the files are transferred to the central server they are compressed to reduce storage space, so even bulky data can be transferred easily and saved in a minimum of storage space.

d. Backup File Transfer (FTP)

This module handles the transfer of files from the client systems to the connected server and vice versa. FTP (File Transfer Protocol) is used here, which avoids the complex procedures of other transmission protocols.

e. Backup Scheduling

This module defines a schedule for each backup pack by setting the week, day, date and time of the backup. The information about which files need backing up, when the backup process will start, and where the files are to be stored is received dynamically from the user and saved as an XML file.

f. Recovery Management

The data recovery module recovers lost or damaged data from the central server when needed. The data can be restored in its compressed format, or it can be decompressed and then downloaded.

g. Report (Backup/Recovery)

This module generates reports about backup and recovery. It also provides information about the registered client systems and the types of data they back up, together with the date and time of each backup.
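For illustration, the compression step described above could be implemented with the standard java.util.zip package roughly as follows; the class and method names are invented for this sketch.

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class PackCompressor {
    // Packs all files of one backup pack into a single ZIP archive
    // before the archive is handed to the FTP transfer module.
    public static void compress(List<File> files, File archive) throws IOException {
        byte[] buf = new byte[8192];
        try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(archive))) {
            for (File f : files) {
                zip.putNextEntry(new ZipEntry(f.getName()));
                try (FileInputStream in = new FileInputStream(f)) {
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        zip.write(buf, 0, n);
                    }
                }
                zip.closeEntry();
            }
        }
    }
}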


SYSTEM DESIGN DETAILS


Design is a multi-step process that focuses on data structures, software architecture, procedural details and the interfaces between modules. The design process also translates the requirements into a representation of the software that can be assessed for quality before coding begins. Computer software design changes continually as new methods, better analysis and broader understanding evolve. Software design is still at a relatively early stage in its evolution; it therefore lacks the depth, flexibility and quantitative nature normally associated with the more classical engineering disciplines. However, techniques for software design do exist, criteria for design quality are available and design notations can be applied. Once software requirements have been analyzed and specified, software design is the first of the three activities (design, code and test) required to build and verify software. Each activity transforms information in a manner that ultimately results in validated computer software. The importance of software design can be stated with a single word: quality. Design is the place where quality is fostered in software development. Design provides us with representations of the software that can be assessed for quality, and it is the only way we can accurately translate a customer's requirements into a finished software product or system. Without design, we risk building an unstable system: one that will fail when small changes are made and one that may be difficult to test.

3.1 Input Design

The input design is the link between the information system and the users. It comprises the specifications and procedures for data preparation and the steps necessary to put transaction data into a usable form for processing data entry. The design of inputs focuses on controlling the amount of input required, controlling errors, avoiding delay, avoiding extra steps and keeping the process simple. The system analyst decides the following input design details:


- What data to input?
- What medium to use?
- How should the data be arranged or coded?
- What dialogue will guide users in providing the input?
- What methods should be used for input validation, and what steps should follow when an error occurs?

Several activities have to be carried out as part of the overall input process. They include some or all of the following stages:
- Data recording (collection of data at its source)
- Data transcription (transfer of data to an input form)
- Data conversion (conversion of the data and checking of the conversion)
- Data control (checking the accuracy of the data and controlling its flow to the computer)
- Data transmission (transmitting or transporting the data to the computer)
- Data validation (checking of the input data by the program as it enters the computer system)
- Data correction (correcting the errors found at any of the earlier stages)

3.2 Output Design

Designing computer output should proceed in an organized, well thought out manner. The term output applies to any information produced by an information system, whether printed or displayed. When analysts design computer output, they identify the specific output needed to meet the information requirements. Computer output is the most important and most direct source of information for the user. Output design is the process of designing the outputs that have to be provided to the various users according to their requirements. Efficient, intelligent output design improves the system's relationship with its users and helps in decision-making. Since the reports are required directly by the management for taking decisions and drawing conclusions, they must be designed with utmost care, and the details in the records must be simple, descriptive and clear to the user. The options for the outputs and reports are given in the system menu. When designing output, the system analyst must accomplish the following:
- Determine the information to present.
- Decide whether to display or print the information and select the output medium.
- Arrange the presentation of information in an acceptable format.
- Decide how to distribute the output to the intended recipients.

3.3 Design
LOGIN

Username

password

Continue

Register

This is the first page of the software and is used to log a user in. To complete the login process, a user has to give a username and password, which must already be saved in the server-side database; the function is therefore available only to registered users. An unregistered user must first register, using the Register button provided for that purpose. A registered user enters the username and password and then selects Continue.


When the credentials are given to the server, it looks in the database to check whether they exist; if they do, the server allows the user to log in, otherwise the user cannot log in. The information about the users is saved in the client database table. The server compares the given username and password with the data in the database; if a match occurs, the account exists and the user is a registered user.
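A rough sketch of this credential check using JDBC against the SQL Server database might look like the code below. The table and column names (client, username, password) and the connection string are assumptions based on the description above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class LoginService {
    // Returns true when the given username/password pair exists in the client table.
    public static boolean isRegistered(String username, String password) throws SQLException {
        String url = "jdbc:sqlserver://localhost:1433;databaseName=SecureBackup";
        String sql = "SELECT COUNT(*) FROM client WHERE username = ? AND password = ?";
        try (Connection con = DriverManager.getConnection(url, "sa", "dbPassword");
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, username);
            ps.setString(2, password);       // a real system should store only a password hash
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getInt(1) > 0;
            }
        }
    }
}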

Register Now

Name

System name

System number

Port number

Username

Password

Confirm password

Types of data

Register

Back


This is the registration form. A user who wants to register with the software must give information such as name, system name, system number, port number, username, password and the types of data to be backed up. This information is saved in the client database table and is later used for authentication and identification of the user. After entering the details, the user selects the Register option to save them in the database; the Back option returns to the login screen.
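Similarly, the Register button could save the form details through JDBC roughly as sketched below; the column layout of the client table and the connection details are assumed from the form fields, not taken from the actual implementation.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class RegistrationService {
    // Saves one registration form into the client table (layout assumed from the form above).
    public static void register(String name, String systemName, String systemNumber,
                                int portNumber, String username, String password,
                                String typesOfData) throws SQLException {
        String url = "jdbc:sqlserver://localhost:1433;databaseName=SecureBackup";
        String sql = "INSERT INTO client (name, system_name, system_number, port_number, "
                   + "username, password, types_of_data) VALUES (?, ?, ?, ?, ?, ?, ?)";
        try (Connection con = DriverManager.getConnection(url, "sa", "dbPassword");
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, name);
            ps.setString(2, systemName);
            ps.setString(3, systemNumber);
            ps.setInt(4, portNumber);
            ps.setString(5, username);
            ps.setString(6, password);       // again, a salted hash would be stored in practice
            ps.setString(7, typesOfData);
            ps.executeUpdate();
        }
    }
}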

Home page

Backup manager

Data restore

reports


After getting permission to log in, an authorized user reaches this page, which provides three options: Backup Manager, Data Restore and Reports. The first option, Backup Manager, backs up your important data to the secure server and sets up the backup schedule. The next option, Data Restore, is used to restore your lost data from the remote server. The Reports option is used to view backup schedule details and reports on the backup process.

Backup Manager

What

When

Where

Backup

Select drive:
Files

add

Remove

Next


This is the main function of the software. In this section the user gives details about the data to be backed up: what the data is, when the backup process should run, and where the data should be stored. If we select the first option, it asks for the drive from which the data is to be taken; there we select the folder containing the data, and the selected data/folder is shown beside the text box. The Add option adds a folder to the backup list, and the Remove button removes a folder from the list. When we click the Next button we get another window like this:

Enter a name

This is for providing a name for the file in which the server stores the data after the backup process. When we select the When option we get a window in which we have to add a schedule: the type is either backup to server or backup to client, and at the next level we add whether the backup is daily, weekly or monthly, together with the time. There is a Save button; if we save these details they are stored in the backup schedule table of the database. If we select the Where option we have to specify the folder and the storage location server. There is also a Verify button; if we click on it we get another window where we can see the information we have added.
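As an illustration of how such a saved schedule could be executed on the client, the sketch below runs a backup job every day at the stored hour and minute using java.util.Timer; the class name and the way weekly or monthly schedules would be handled are assumptions.

import java.util.Calendar;
import java.util.Date;
import java.util.Timer;
import java.util.TimerTask;

public class BackupScheduler {
    private final Timer timer = new Timer("backup-scheduler", true);

    // Runs the given backup job every day at the hour and minute saved in the schedule.
    public void scheduleDaily(final Runnable backupJob, int hour, int minute) {
        Calendar first = Calendar.getInstance();
        first.set(Calendar.HOUR_OF_DAY, hour);
        first.set(Calendar.MINUTE, minute);
        first.set(Calendar.SECOND, 0);
        if (first.getTime().before(new Date())) {
            first.add(Calendar.DAY_OF_MONTH, 1);   // today's time has already passed, start tomorrow
        }
        long oneDay = 24L * 60 * 60 * 1000;
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                backupJob.run();                   // compress, encrypt and transfer the pack
            }
        }, first.getTime(), oneDay);
    }
}

A weekly or monthly schedule would change the period or recompute the next run date each time the task fires.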


Type:

Schedule:

Daily

At

Hrs

min

Weekly

Monthly

Save

Remove

Edit

Next


Where

Location

Storage location server

Save

Verify

Next

Backup information

Schedule

Location

Resources


Backup

Compression

Backup client

Transfer to server

Backup new

Home

Data Restore is the next module of the software. It provides two options, Restore and Decompress: we select the files to restore and choose the location for them.

Restore

Decompress

Select files for restore Choose location

Restore

Next

Home


Decompress

select file

Browse

Choose location

Decompression

Backup

Home

Here we select the file to be decompressed and choose the location for it. This completes the overall design of the secure backup network system.

3.4 System Architecture

JAVA

Java is a programming language originally developed by James Gosling at Sun Microsystems (which has since merged into Oracle Corporation) and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++, but it has fewer low-level facilities than either of them. Java applications are typically compiled to bytecode (class files) that can run on any Java Virtual Machine (JVM) regardless of computer architecture. Java is a general-purpose, concurrent, class-based, object-oriented language that is specifically designed to have as few implementation dependencies as possible. It is intended to let application developers "write once, run anywhere" (WORA), meaning that code that runs on one platform does not need to be recompiled to run on another. Java is, as of 2012, one of the most popular programming languages in use, particularly for client-server web applications, with a reported 10 million users. The original and reference implementations of the Java compilers, virtual machines, and class libraries were developed by Sun from 1991 and first released in 1995. As of May 2007, in compliance with the specifications of the Java Community Process, Sun relicensed most of its Java technologies under the GNU General Public License.


Others have also developed alternative implementations of these Sun technologies, such as the GNU Compiler for Java and GNU Classpath. There were five primary goals in the creation of the Java language:
1. It should be "simple, object-oriented and familiar".
2. It should be "robust and secure".
3. It should be "architecture-neutral and portable".
4. It should execute with "high performance".
5. It should be "interpreted, threaded, and dynamic".

One characteristic of Java is portability, which means that computer programs written in the Java language must run similarly on any hardware/operating-system platform. This is achieved by compiling Java code to an intermediate representation called Java bytecode, instead of directly to platform-specific machine code. Java bytecode instructions are analogous to machine code, but they are intended to be interpreted by a virtual machine (VM) written specifically for the host hardware. End users commonly use a Java Runtime Environment (JRE) installed on their own machine for standalone Java applications, or in a web browser for Java applets. Standardized libraries provide a generic way to access host-specific features such as graphics, threading, and networking. A major benefit of using bytecode is portability; however, the overhead of interpretation means that interpreted programs almost always run more slowly than programs compiled to native executables, so just-in-time (JIT) compilers that compile bytecode to machine code at runtime were introduced at an early stage. Java uses an automatic garbage collector to manage memory over the object lifecycle. The programmer determines when objects are created, and the Java runtime is responsible for recovering the memory once objects are no longer in use; once no references to an object remain, the unreachable memory becomes eligible to be freed automatically by the garbage collector. Something similar to a memory leak may still occur if a programmer's code holds a reference to an object that is no longer needed, typically when such objects are stored in containers that are still in use. If a method is called on a null reference, a "null pointer exception" is thrown.


MICROSOFT SQL SERVER


Microsoft SQL Server is a relational database management system developed by Microsoft. As a database, it is a software product whose primary function is to store and retrieve data as requested by other software applications, whether they run on the same computer or on another computer across a network (including the Internet). There are at least a dozen different editions of Microsoft SQL Server aimed at different audiences and workloads, ranging from small applications that store and retrieve data on the same computer to millions of users and computers that access huge amounts of data over the Internet at the same time. SQL Server 2005 (formerly codenamed "Yukon") was released in October 2005. It included native support for managing XML data in addition to relational data. For this purpose, it defined an xml data type that could be used either as a data type in database columns or as literals in queries. XML columns can be associated with XSD schemas, and XML data being stored is verified against the schema. XML is converted to an internal binary data type before being stored in the database, and specialized indexing methods were made available for XML data. XML data is queried using XQuery; SQL Server 2005 added extensions to the T-SQL language to allow embedding XQuery queries in T-SQL, and it also defined a new extension to XQuery, called XML DML, that allows query-based modifications to XML data. SQL Server 2005 also allows a database server to be exposed over web services using Tabular Data Stream (TDS) packets encapsulated within SOAP requests; when data is accessed over web services, results are returned as XML. Common Language Runtime (CLR) integration was introduced with this version, enabling one to write SQL code as managed code running on the CLR. For relational data, T-SQL has been augmented with error-handling features (try/catch) and support for recursive queries with CTEs (Common Table Expressions). SQL Server 2005 was also enhanced with new indexing algorithms, syntax and better error recovery systems. Data pages are checksummed for better error resiliency, and optimistic concurrency support has been added for better performance. Permissions and access control have been made more granular, and the query processor handles concurrent execution of queries in a more efficient way. Partitions on tables and indexes are supported natively, so scaling out a database onto a cluster is easier.


SQL CLR was introduced with SQL Server 2005 to let it integrate with the .NET Framework. SQL Server 2005 introduced "MARS" (Multiple Active Result Sets), a method of allowing a database connection to be used for multiple purposes. It also introduced DMVs (Dynamic Management Views), which are specialized views and functions that return server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance. Service Pack 1 (SP1) of SQL Server 2005 introduced Database Mirroring, a high-availability option that provides redundancy and failover capabilities at the database level. Failover can be performed manually or configured for automatic failover; automatic failover requires a witness partner and a synchronous operating mode (also known as high safety or full safety).
WINDOWS XP

Windows XP is a line of proprietary operating systems developed by Microsoft for use on general-purpose computer systems, including home and business desktops, notebook computers and media centers. The letters XP stand for "experience". Windows XP is the successor to both Windows 2000 and Windows Me, and it is the first consumer-oriented operating system produced by Microsoft to be built on the Windows NT kernel and architecture. The most common editions of the operating system are Windows XP Home Edition, which is targeted at home users, and Windows XP Professional, which has additional features such as support for Windows Server domains and dual processors and is targeted at power users and business clients. Windows XP Media Center Edition has additional multimedia features enhancing the ability to record and watch TV shows, watch DVDs and listen to music. Windows XP Tablet PC Edition is designed for the ink-aware Tablet PC platform. Two separate 64-bit versions were also released: Windows XP 64-bit Edition for IA-64 processors and Windows XP Professional x64 Edition for x86-64 processors.


3.5 Database Description

3.5.1 Introduction

A database is a collection of interrelated data stored with minimum redundancy to serve many applications. In a database environment, common data are available to and used by several users. Instead of each user managing their own data, authorized users share data across applications, with the database software managing the data as a single entity.

3.5.2 Database Design

Database files are the key source of information for the system. Database design is the process of designing these files, which should be properly planned for the collection, accumulation and editing of the required information. The objectives of file design are to provide effective auxiliary storage and to contribute to the overall efficiency of the computer program component of the system. In database design there are two types of data: physical data and logical data. Physical data are the data as they are actually recorded on the source documents; logical data are those calculated from retrieved data, presented in a certain sequence or in summary form. In a computer-based data processing system, the separation of physical and logical data provides the same advantages.

Table 1: Backup Schedule
Fields: Name, Data Files, Location, Backup Type, Schedule Type, Schedule Days, Schedule Time


Table 2: Backup Process
Fields: Scheduled Backup, Start Time, End Time, Data Count, Status

Table 3: Client Node
Fields: Name, IP Address, Port No, Connection Status
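To make these layouts concrete, the sketch below creates the two tables through JDBC; the exact column names and types are inferred from the field lists above and may differ from the project's actual database.

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class SchemaSetup {
    // Creates the backup schedule and client node tables described above.
    // Column names and types are assumptions based on the field lists.
    public static void createTables(Connection con) throws SQLException {
        try (Statement st = con.createStatement()) {
            st.executeUpdate(
                "CREATE TABLE backup_schedule (" +
                "  name          VARCHAR(100) PRIMARY KEY," +
                "  data_files    VARCHAR(1000)," +
                "  location      VARCHAR(255)," +
                "  backup_type   VARCHAR(20)," +     // backup to server / backup to client
                "  schedule_type VARCHAR(20)," +     // daily / weekly / monthly
                "  schedule_days VARCHAR(50)," +
                "  schedule_time VARCHAR(10))");
            st.executeUpdate(
                "CREATE TABLE client_node (" +
                "  name              VARCHAR(100) PRIMARY KEY," +
                "  ip_address        VARCHAR(40)," +
                "  port_no           INT," +
                "  connection_status VARCHAR(20))");
        }
    }
}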


DATA FLOW DIAGRAM


4.1 Basic DFD Notations


A data flow diagram (DFD), or bubble chart, is a network that describes the flow of data and the processes that change or transform data throughout a system. It is constructed from a set of symbols that do not imply any physical implementation, and it is a graphical tool for the structured analysis of requirements. A DFD models a system in terms of external entities from which data flow to processes; the processes transform the data and create output data flows that go to other processes, external entities or files, and data in files may also flow into processes as inputs.

Basic Symbols

A data flow diagram illustrates the processes, data stores and external entities in a business or other system, together with the connecting data flows.

An arrow depicts a data flow, with the arrowhead pointing in the direction of the flow.

A circle depicts a process.

A rectangle depicts a source or sink (an external entity).


Two parallel lines depict a data store.

Data flow diagrams (summarized from the figures):

Level 0: the user interacts with the Secure Backup system, which returns data to the user.
Level 1: the client exchanges backup data with the server.
Level 2: the client logs in and performs backup (what, when, where and scheduling), restore and decompress operations, and views reports; client node details are held on the server.


SYSTEM REQUIREMENTS


5.1 Software Requirements

Development Platform : Windows XP
Front-End Tool       : J2EE, HTML, JavaScript
Back-End Tool        : SQL Server 2000
Documentation        : Microsoft Word 2000

5.2 Hardware Requirements

Processor         : Intel Pentium IV
Speed             : 2.4 GHz
Memory            : 512 MB RAM
Hard Disk Drive   : 80 GB
Floppy Disk Drive : 1.44 MB
Keyboard          : 104 keys
Monitor           : 15" SVGA Digital Color Monitor
CD-ROM            : 52X
Mouse             : Scroll Mouse


SYSTEM IMPLEMENTATION AND TESTING


7.1 System Testing


Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. Testing presents an interesting anomaly for the software engineer and is vital to the success of the system, since errors can be injected at any stage of development. System testing makes the logical assumption that if all parts of the system are correct, the goal will be successfully achieved. During testing, the program under test is executed with a set of test data, and its output is evaluated to determine whether it performs as expected. A series of tests is performed on the proposed system before it is ready for user acceptance testing. The testing steps are:
- Unit Testing
- Integration Testing
- Validation Testing
- Output Testing
- Acceptance Testing

7.2.1 Unit Testing

Unit testing focuses verification effort on the smallest unit of the software design, the module; it is therefore also known as module testing. Since the proposed system is made up of modules, testing is performed individually on each module. Using the detailed design description as a guide, important control paths are tested to uncover errors within the boundary of the module. This testing was carried out during the programming stage itself, and in this step each module was found to work satisfactorily with regard to its expected output.

7.2.2 Integration Testing

Data can be lost across an interface, one module can have an adverse effect on another, and sub-functions, when combined, may not produce the desired overall function. Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with the interfaces.
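As an illustration of the unit-testing step, a JUnit 4 test for the compression module sketched earlier might look like the following; JUnit and the PackCompressor class are assumptions, since the report does not show its actual test code.

import static org.junit.Assert.assertTrue;

import java.io.File;
import java.io.FileOutputStream;
import java.util.Arrays;
import org.junit.Test;

public class PackCompressorTest {
    // Unit test for the hypothetical PackCompressor module: a compressed
    // archive must be created and must not be empty.
    @Test
    public void compressCreatesArchive() throws Exception {
        File source = File.createTempFile("backup-src", ".txt");
        try (FileOutputStream out = new FileOutputStream(source)) {
            out.write("some data worth backing up".getBytes("UTF-8"));
        }
        File archive = File.createTempFile("backup-pack", ".zip");

        PackCompressor.compress(Arrays.asList(source), archive);

        assertTrue("archive should exist", archive.exists());
        assertTrue("archive should not be empty", archive.length() > 0);
    }
}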


7.2.3 Acceptance Testing

User acceptance of the system is a key factor in the success of any system. The system under consideration was tested for user acceptance by constantly keeping in touch with the prospective users during development and making changes wherever required.

7.2.4 Functional Testing

Functional testing takes an external perspective of the test object to derive test cases. These tests can be functional or non-functional, though they are usually functional. The test designer selects valid and invalid inputs and determines the correct output; there is no knowledge of the test object's internal structure. This method of test design is applicable to all levels of software testing: unit, integration, functional, system and acceptance testing. The higher the level, and hence the bigger and more complex the box, the more we are forced to use black-box techniques to simplify. While this method can uncover unimplemented parts of the specification, one cannot be sure that all existing paths are tested.

7.2.5 System Testing

System testing is conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements. System testing falls within the scope of black-box testing and, as such, should require no knowledge of the inner design of the code or logic. As a rule, system testing takes as its input all of the integrated software components that have successfully passed integration testing, together with the software system itself integrated with any applicable hardware systems. Whereas the purpose of integration testing is to detect inconsistencies between the assembled components and the hardware, system testing is a more limiting type of testing: it seeks to detect defects both within the inter-assemblages and within the system as a whole.

7.2.6 White-Box Testing

White-box testing (also known as clear-box testing, glass-box testing or structural testing) uses an internal perspective of the system to design test cases based on its internal structure. It requires programming skills to identify all paths through the software.


The tester chooses test-case inputs to exercise paths through the code and determines the appropriate outputs. While white-box testing is applicable at the unit, integration and system levels of the software testing process, it is typically applied to the unit, so while it normally tests paths within a unit, it can also test paths between units during integration and between subsystems during a system-level test. Though this method of test design can uncover an overwhelming number of test cases, it might not detect unimplemented parts of the specification, and one cannot be sure that all paths through the test object are executed.

7.2.7 Black-Box Testing

Black-box testing takes an external perspective of the test object to derive test cases. The tests may be functional or non-functional, though they are usually functional. The test designer selects valid and invalid inputs and determines the correct output; there is no knowledge of the test object's internal structure. This method of test design is applicable to all levels of software testing: unit, integration, functional, system and acceptance testing. The higher the level, and hence the bigger and more complex the box, the more we are forced to use black-box testing to simplify. While this method can uncover unimplemented parts of the specification, one cannot be sure that all existing paths are tested.

7.2.8 Validation Testing

At the end of integration testing, the software is completely assembled as a package, interfacing errors have been uncovered and corrected, and the final series of software validation tests begins. Validation testing can be defined in many ways, but a simple definition is that validation succeeds when the software functions in a manner that can reasonably be accepted by the user. Software validation is achieved through a series of black-box tests that demonstrate conformity with the requirements. After the validation tests have been completed, one of the following two conditions exists: either the function or performance characteristics conform to specification and are accepted, or a deviation from specification is uncovered and a deficiency list is created. Deviations or errors discovered at this step in the project were corrected prior to the completion of the project. Thus, the proposed system has been tested using validation testing and found to be working satisfactorily.


7.2.9 Output Testing

After validation testing, the next step is output testing of the proposed system, since no system can be useful if it does not produce the required output in the specified format. The outputs generated and displayed by the system under consideration were tested by comparing them with the format required by the user. The output format is considered in two ways: on screen and in printed form. The output format on the screen was found to be correct, as it was designed in the system design phase according to the user's needs; for hard copy also, the output comes out as specified by the user. Hence output testing did not result in any correction to the system.

7.3 System Implementation

Implementation is the stage of the project when the theoretical design is turned into a working system. If the implementation stage is not properly planned and controlled, it can cause chaos; it can therefore be considered the most crucial stage in achieving a successful new system and in giving the users confidence that the new system will work and be effective. Normally this stage involves setting up a coordinating committee, which will act as a sounding board for ideas, complaints and problems. The first task is implementation planning, that is, deciding on the methods and timescale to be adopted. Apart from planning, the two major tasks of preparing for implementation are education and training of users and testing of the system. Education of users should really have taken place much earlier in the project; at the implementation stage the emphasis must be on training in new skills, to give staff confidence that they can use the system. Once staff have been trained, the system can be tested. After the implementation phase is completed and the user staff have adjusted to the changes created by the candidate system, evaluation and maintenance begin. The importance of maintenance is to keep the system up to standard. The activities of the implementation phase can be summarized as:


- Implementation planning
- Education and training
- System testing

7.4 System Maintenance

Software maintenance is the process of modifying a software system or component after its delivery in order to correct faults, improve performance or other attributes, or adapt to a changed environment. Maintenance covers a wide range of activities, including correcting code and design errors, updating the documentation and test data, and upgrading user support. There is an aging process that calls for periodic maintenance of hardware and software, and maintenance is always necessary to keep the system up to its standards.


CONCLUSION


All that is needed is a program that backs up your files to the free space on the network. Because files are stored under content-derived keys and encrypted accordingly, the network stores identical content at the same location. This saves a great deal of space by avoiding the endless duplication of common content such as operating systems and widely installed applications, and the space saved can be used to store the different, user-specific versions of files that are still evolving. Some additional forward error correction, called the save-set envelope, keeps the probability of data loss below a defined value. Private files remain secure, because a content-derived hash key is stronger than any content-encryption password; neighboring peers can recognize that a file is being backed up only if they already know that file, and this can be mitigated by onion routing at the entry point of network requests. Directories are stored under the same content-derived keys, so identical directories within distributed software are also stored at the same locations. The main dataset needed to start a recovery is stored locally on the computer and additionally under a KSK that points to a USK. This KSK must be derived from a long user password, so that the whole hard disk can be recovered at any time with nothing more than a new computer and knowledge of the master password.

