
Verifiable Multi-Copy Dynamic Data Tenancy in Public Cloud

Chapter 1

About the Organization


The following section describes a brief history of the organization, the number of people working in the organization, the operation of the company, major milestones, the overall organization structure, the products & services offered by the company, and financial details.

1.1 Brief history of the organization

With the active participation of its multi-disciplinary Assignment Execution Team, Logic Mind Technologies has emerged as a leader in the ITES sector in India and has established itself in the fields of software development, data processing, data conversion, digital printing, digitization, system integration, smart card personalization, IT facility management and other IT-enabled services. Logic Mind Technologies LLP, incorporated in 2011, is a professionally managed, rapidly growing, multifaceted information technology company. The company is actively involved in developing automation and e-Governance solutions for Transport, Social Security, Citizen Identity, Education, Public Distribution System, Retail Management and a host of other application areas.
Logic Mind Technologies is a leading System integrator in India providing complete
turnkey solutions on BOO & BOOT basis including facility management services, Smart
Cards applications, Document Management System (DMS), Work Flow Management and
Manpower Deployment.
Logic Mind Technologies has successfully completed many e-governance projects for
the various departments of Govt of AP and has won accolades for its superior service
delivery, timely execution of projects and the quality of its deliverables. LMT is trusted by many clients who are looking for reliable, quality services for their business. LMT currently operates out of Bangalore and provides services to e-commerce businesses.

Dept of CSE, RVCE 2017-18 1



Logic Mind Technologies has adopted a project-team-based, dedicated organization structure. In a project-based organization, the project manager has a high level of authority to oversee and control project resources. The project manager in this structure has full authority over the project and can secure the resources needed to fulfil project objectives from inside or outside the parent organization, subject only to the scope, quality and budget constraints identified for the project.
In the project-based structure, staff are assigned specifically to the project and report directly to the project manager. The project manager is in charge of the performance evaluation and career progression of all team members while they are on the project. This leads to increased project loyalty. Complete line authority over project efforts gives the project manager strong project controls and centralized lines of communication, which results in rapid reaction time and improved responsiveness. In addition, project personnel are retained on an exclusive rather than a shared or part-time basis. Project teams develop a strong sense of project identification and ownership, with deep loyalty to the project and a good understanding of the nature of the project's activities, mission and objectives.

1.1.1 Number of people working in the organization

Department-centered development organizations start to become practical as a group grows above 25 developers or 5 projects. At these staffing levels, there are sufficient people to form multiple departments centered on particular software skills or life-cycle areas. Since its inception, and starting with small steps, Logic Mind Technologies has progressed by leaps and bounds. It has grown from a small venture into a medium-scale enterprise with a strong workforce of 80+ people, growing at a rate of more than 100%. The company is executing some prestigious projects and has earned a very respectable name in the Indian IT and e-commerce industry.
1.1.2 Operation of the company

Logic Mind Technologies Pvt Ltd is one of India's most well-known and well-trusted solution providers. Today, Logic Mind Technologies stands as a source of reliable and innovative products that enhance the quality of customers' professional and personal lives.


Logic Mind Technologies is rooted in Bangalore and has branches in Hyderabad and Chennai. Logic Mind Technologies is a leading solution provider across technologies and has extensive experience in research and development.
Its employees in all the branches are active in the areas of production, software development, implementation, system integration, and training.
Why Logic Mind Technologies?
With a client list spanning nearly all industries and colleges, Logic Mind Technologies' product solutions have benefited customers of many different sizes, from non-profit organizations to companies.
By partnering with Logic Mind Technologies you'll have access to current IT research, tools, templates, and step-by-step action plans for completing key projects. You'll also be provided full access to our research archives and knowledge base.

1.2 Major milestones

Our innovative and highly integrated approach means customers benefit from working with specialists. Our continuous drive to be a technology leader in the industry means that our clients benefit directly from the deep expertise our people possess. We strive to stay at the forefront of technology, which enables us to provide highly effective and optimized solutions to all your problems. Clients like to have a single point of contact for their solutions and expect a complete solution from the vendor, which is not possible unless there are partnerships and alliances within and outside the company.
Logic Mind Technologies fosters partnerships with companies with whom a value
proposition can be offered to clients. One of the key benefits that you receive by partnering
with Logic Mind Technologies is increased project completion certainty, project transparency,
renewed customer confidence and credibility from our unparalleled track record, mature
processes and quality recognition and customer endorsement.


True certainty of success comes from working with a partner you trust to provide the insight,
support and expertise that will propel your business forward. Experiencing certainty with
Logic Mind Technologies means you can count on results, partnership and leadership. When
you work with us, your long-term success is our motivation. This is why we can offer you the
ability to meet every challenge and the agility to capitalize on every opportunity. That’s the
power of certainty. And it is our promise to every client.
Logic Mind Technologies is a global enterprise solutions provider committed to designing and delivering solutions that enable international companies to thrive in today's complex business environment.
Historical milestones and a variety of achievements characterize our company's journey: from a merchant company selling a single product to the statewide player we are today. Throughout the journey, we have seen many first-time product launches, a steady flow of innovations, and continuous expansion through growth and acquisitions. During this time, hundreds of employees have contributed to our success, which is marked by numerous awards and excellent third-party rankings.

1.3 Overall organization structure

Figure 1.1 Organization Structure of Logic Mind Technologies

Figure 1.1 shows the organizational structure of Logic Mind Technologies (LMT). The organization structure depicts how various work roles and responsibilities are delegated, controlled and coordinated, and determines how information flows from level to level within the company to achieve the organizational aims.

1.4 Products and services offered by the company

Logic Mind Technologies has ready-to-implement solutions for e-commerce which can be customized and deployed at short notice. These solutions were developed after conducting a detailed System Requirement Study (SRS) of the respective business, have been stabilized through live implementation, and are running successfully in statewide deployments.


The marketing department consists of the Digital Marketing and Sales departments. The presales department further contains the Healthcare, Education, Retail and Networking departments. The digital marketing department is further classified into Public Relations, Analysts and the online supermarket for groceries.
 ONLINE SUPERMARKET FOR THE GROCERIES: ANDHRAFRESH.COM
ANDHRAFRESH.COM is an Indian online grocery provider listing over 2,000 products from more than 500 brands. It was started in Anantapur and subsequently expanded its operations to Kurnool and Tirupati, and is expanding further in Bangalore, Chennai, Hyderabad, Mysore and Pune. ANDHRAFRESH.COM's product categories include Fresh Fruits and Vegetables, Grocery and Staples, Bread, Dairy, Eggs, Beverages, Branded Foods, Household Items, Personal Care, Health Care, Meat, Home & Kitchen Products, Electronics & Appliances, Cosmetics, imported products and gourmet products, with many more categories to come. Andhrafresh.com also provides fresh beverages, foods and home appliances.
The snapshot below shows andhrafresh.com.

Figure 1.2. Online supermarket for groceries

 KEY ALLIANCES/PARTNERSHIPS


Figure 1.3. Website for key alliances

Today, clients shop on the web, at stands, and over telephones and cell phones; coordinating these channels with online stores and delivering the best shopping experience is a big challenge.
Logic Mind Technologies uses advanced tools to connect people and fulfil their requirements. The areas of specialty of Logic Mind Technologies are:
 Responding to customer requirements
 Cooperating with customers
 Giving presentations to customers
 Business development support
 Marketing support
Functionalities of individuals

Director
The director is the head of research and development; he supervises the work done by the research engineers, analysts, the project manager and the other staff of the department.
The major functionalities of the director are as follows:
 Analyses the current market trend
 Guides the research engineers to work on the current market trend
 Assigns the theoretical solutions obtained by the research engineers to software developers for practical implementation


 Conducts timely meetings with the project manager and enquires about the progress of the project
 Takes the final decision regarding the budget and implementation of the project
 Provides solutions to the challenges that occur during project implementation
Research engineers
Research engineers are the people who study the current market trend and the problems that are present.
The research engineering department is headed by a manager, and this manager reports directly to the director of the department. The manager is responsible for supervising the work performed by all other research engineers in the department. The specific functionalities of a research engineer are as follows:
 Conducts study of, and research on, the problems posed by the director
 Stays up to date with the current market trend
 Gives possible theoretical solutions to the problems
 Prepares clear documentation about the solution and submits it to the director
Project manager
 The project manager heads the solution-developing team, or team of software engineers
 The project manager reports directly to the director of the department
 The project manager is responsible for supervising the work done by the team members
 The project manager plans to implement the project within the budget
 He solves the problems that occur within the team
 Conducts timely meetings with the software engineers and gets the details of the progress of the project
Shopping Solutions
Enables you to connect with your customers whenever, wherever and for whatever, across the globe, faster than any other way. It also helps you create a global visibility/presence for the products you want to sell.


Figure 1.4. Shopping solutions

Product Management Systems
Allows you to manage all types of inventory in a single application by enabling In-Stock and Out-Stock functionality, with an Online Account Management system built in.

Insurance Apps Solutions
Our development team has expertise in insurance solutions development and has great experience with such applications. Our first objective is to meet the customer's requirements.

1.5 Financial details

Logic Mind Technologies has improved the quality of communication and has satisfied its customers. We have earned their respect by providing excellent products and services. In addition, we are flexible with services and financial structures for contracts, aiming for mutually beneficial relationships with our customers.


Our customers are dynamic and diverse and include Large Corporate Offices, Universities,
Educational Institutions, Factories, etc.
The income statement, often referred to as a Statement of Profit and Loss (P&L), is a financial report that shows the revenues and expenses generated and incurred by a company over a specified period of time. It shows the net gain or loss in the company's equity position during the stated accounting period. The financials include income statements, balance sheets, statements of cash flow and financial ratios, on both a quarterly and an annual basis; the annual figure is 3 crore.


Chapter 2

About the Department


The following section describes the specific functions performed by the department, the roles & responsibilities of individuals in the department, and the organization structure of the department.

2.1 Specific functions performed by the department


Logic Mind Technologies has the ability to architect, develop and maintain complex software applications. The Logic Mind Technologies development and research teams are committed to continuing research and development in the rapidly evolving fields of software development and IT, so that informed decisions can be taken at the appropriate time regarding future technology choices and adoption, and to help drive the continuing evolution of its software architecture.

Over the course of several years, LMT has used the benefit of its knowledge and experience in developing enterprise-wide, web-based applications, coupled with its continuing research and development activity, to develop its own in-house web-based software architecture and supporting framework on which all of its current and future web-based
software architecture has proven to be a reliable, robust, and scalable foundation on which to
build its software products. A qualified and highly specialized team with multi-disciplinary
approach forms the technical core at Logic Mind Technologies. This repository of talented
and committed software developers has a proven track record to ensure success in IT solution
implementation. With skills ranging from business process re-engineering to application
development, Logic Mind Technologies technical team seeks to constantly enhance and
expand its technical knowledge. Capturing knowledge through procedures and processes is
the premise on which the entire organization works. Logic Mind Technologies' resource base consists of IIT engineers (three, including the directors), management graduates, masters in computer applications and domain experts from various fields.


Logic Mind Technologies has a state-of-the-art Software Development Centre located at Delhi, which also hosts a Data Centre with 100 Mbps bandwidth for data transfer. The development centre and data centre run in fully secure mode.

Logic Mind Technologies is very prompt in delivering projects within deadlines, which improves its reliability and robustness.
Current Research and Development
The current R&D efforts are primarily aimed at the following segments in the healthcare industry:
 Developing a system for integrating medical schools with major hospitals for knowledge gathering, sharing and learning
 Developing a Clinical Decision Support System to aid doctors in difficult-to-diagnose cases using Artificial Intelligence and probabilistic techniques
New Technology Capability and Positions
The organization has a process in place which addresses the issue of incorporating emerging technologies into product design. The process is as follows:
 The core committee on new development evaluates and identifies new technology for the purpose of integration
 The research and development department identifies the resources and people and formulates the working process while setting key performance indicators
 A thorough study of the new technology, along with its tools, is made and documented
 Estimates are made of the impact of the new technology on the products developed by the company
 Effort estimates are made for introducing the new technologies
 Client feedback is received about the effort needed and the advantages of the new technology
 The core committee takes a knowledgeable decision on the advantages and effort required and approves the introduction of the technology
 The affected personnel are trained in the new technologies
 The new technology is introduced and the product is enhanced


 The clients are informed about the enhancement, and the related documents are prepared for the changeover
 The clients are guided in implementing the new technologies

Being a technology-driven company, we are always exploring ways of enhancing our product capabilities and aim at providing the latest state-of-the-art products to our customers. We have already incorporated PDAs and smart cards in the system. We are currently evaluating Bluetooth capability and the relevance of the Tablet PC to the field.

2.2 Roles & Responsibilities of individuals in the department


Traditional IT organizations are typically structured to support vertical business units and applications. The roles, responsibilities, skills and budgets are focused on several discrete projects that address specific business activities. In the traditional IT organization, projects are scoped and implemented without fully recognizing the core business processes that span business units.
Without an enterprise view, organizations lose the opportunity to implement the most effective solutions. In addition, the business typically requests enhancements to existing applications as a way to address immediate business needs quickly. This, along with a lack of shared vision between IT and business areas, results in enhancements to applications without fully considering the underlying business processes. Thus, opportunities to radically improve business processes are overlooked.
The marketing department promotes the business and drives sales of products and services. It provides the market research needed to identify and target customers and other audiences. The marketing department consists of the Marketing and Sales departments. The presales department further contains the Healthcare, Education, Retail and Networking departments.


Figure 2.1. Roles and Responsibilities of individual in the organization

2.3 Organization structure of the department


Department centered development organizations start to become practical as a group grows
above 25 developers or 5 projects. At these staffing levels, there are sufficient people to form
multiple departments centered on particular software skills or life cycle areas. For instance, a
40-person group might have departments for:
 System and database administrators
 User interface programmers
 Application programmers
 Configuration management, test, and quality assurance

A common mistake in department-centered organizations is to break software architects into a separate department or group. We have found this can lead to elitism and be very counterproductive. First, it starts to separate the architects from the developers who are doing the actual implementation. Architects thus more quickly become out of touch with the latest

development methodologies actually being used. Also, while not every developer wants to be an architect, every developer likes to have some say in the design. If developers are too separated from architects, they may have a built-in incentive to prove the architect's design wrong by not working their hardest to implement it. When this happens, the architect will most likely blame the problem on developer incompetence rather than on any architectural flaws. The whole iterative development process becomes harder to implement smoothly.


Chapter 3
Tasks Performed
The following section describes the list of technical activities performed in the company, the list of non-technical activities performed in the company, the tasks assigned by the company, the background study of technical reports, books and journal papers, participation in organizational meetings, time management, resource utilization, and interpersonal skills & initiatives taken.

3.1 List of technical activities performed in the company


“Technical activities” means the doing of any work by the industry itself, or the use or installation of any materials, parts or equipment, that is subject to regulation by the city under one or more of the technical codes.
Technical skills are the expertise or technical competency that a professional has in his area of work. To acquire technical skills, a professional needs thorough knowledge of the domain of work, as well as of the advanced tools used to implement the solutions designed to solve the problem. Some examples of technical skills used in the IT industry include design, development, testing and problem analysis.
The goal of technical activities is to promote a profitable and sustainable business that meets customers' needs, to increase the company's market share, to gain a competitive edge, to increase the company's role in relation to social responsibility, and to provide excellent customer service.
There is an increased demand for professionals who have both technical and non-technical skills. In order to sustain themselves in the rapidly growing technical industry, professionals should have both.


Following is the list of technical skills acquired:

 Understanding and describing a brief review of the objectives, platform and area of work assigned.
 Understanding the role of software systems: the existing system block diagram, architecture and control/data flow diagrams.
 Understanding and describing the detailed concepts and requirements of clients, and carrying out active research work in the area I am concerned with and its usage in the project.
 Using the Subversion (SVN) version control system, a software source-code change management (CM) system for collaborative development. It maintains a history of file and directory changes.
 Implementing the feature on the Eclipse platform using Java and JavaScript. I was involved in the development phase, where my part of the work was in the Data Owner modules.
 Usage of different types of shells, such as the Bash shell, C shell and Korn shell, as well as shell scripts. Usage of the text-processing commands grep, awk and sed.
 Developing installation scripts for deployment.
 Using Quick Emulator (QEMU), one of the most powerful processor emulators and virtualizers, for the guest OS. Further, configuring the allocated remote machine using the PuTTY software. MobaXterm is another handy tool, with a user-friendly GUI, which is used to access the remote machine. WinSCP is another tool which can be used to exchange files between the remote machine and the local machine.
 Finally, presenting the results and the overall conclusions of the study and proposed work.

3.2 List of non-technical activities performed in the company


Non-technical skills are general skills, which can also be called life skills. Non-technical skills include effective communication and working in a team. Non-technical skills cannot be acquired through reading; instead, one has to cultivate them in one's day-to-day activities. They can be improved and refined with extensive use and experience. Non-technical skills can be broadly classified as follows:


 Functional: These are the very basic skills, which are required to carry out the assigned work.
 Adaptive: These are the skills that characterize the employee's ability to work in a team, take the initiative towards leadership, and demonstrate organizational ability.
Non-Technical skills acquired:
 Self-Learning: Revised Java programming concepts. Studied the GNU Compiler Collection (GCC). Analyzed the differences between processor architectures such as ARM and x86, and between the RISC and CISC instruction sets. Revised assembly-language concepts. As part of understanding operating-system modules and their organization, it was necessary to learn the important phases of the Linux boot sequence. Also gained knowledge about different executable file formats such as a.out, ELF (Executable and Linkable Format) and COFF (Common Object File Format).
 Presentation skills: Weekly presentations were scheduled every Friday. Interns had to choose a topic of interest which would contribute to the ongoing research work. This activity helped to improve presentation and interpersonal skills. Week by week, the stage fear was alleviated. Interactions during the presentation sessions brought me more clarity about the concepts.
 Verbal and written communication skills: Reading and summarizing technical and research papers was one such activity. As part of this activity, the interns were expected to go through technical and research papers and summarize the work done by the authors. This activity improved precise reading and the clear communication of the concepts understood.
 Participation in meetings: Participation in meetings improved listening ability, aligning individual work towards the accomplishment of the whole project, giving individual opinions in discussions, and convincing team members.
 Planning the work: The ability to identify the subtasks of the assigned work and prioritize the subtasks to gain small wins over the work.


3.3 Tasks assigned by the company to perform the assigned tasks


This section explains the tasks assigned to me, both the technical and non-technical skills acquired, and the details of the materials referred to in order to enrich my knowledge.

3.3.1 Provable Multicopy Dynamic Data Possession in Cloud Computing Systems
Increasingly more and more organizations are opting for outsourcing data to remote
cloud service providers (CSPs). Customers can rent the CSPs storage infrastructure to store
and retrieve almost unlimited amount of data by paying fees metered in gigabyte/month. For
an increased level of scalability, availability, and durability, some customers may want their
data to be replicated on multiple servers across multiple data centers. The more copies the
CSP is asked to store, the more fees the customers are charged. Therefore, customers need to
have a strong guarantee that the CSP is storing all data copies that are agreed upon in the
service contract, and all these copies are consistent with the most recent modifications issued
by the customers. In this paper, we propose a map-based provable multicopy dynamic data
possession (MB-PMDDP) scheme that has the following features: 1) it provides an evidence
to the customers that the CSP is not cheating by storing fewer copies; 2) it supports
outsourcing of dynamic data, i.e., it supports block-level operations, such as block
modification, insertion, deletion, and append; and 3) it allows authorized users to seamlessly
access the file copies stored by the CSP. A comparative analysis of the proposed MB-PMDDP
scheme [1] with a reference model, obtained by extending existing provable possession
schemes for dynamic single-copy data [5], is also given.
Outsourcing data to a remote cloud service provider (CSP) allows organizations to
store more data on the CSP than on private computer systems. Once the data has been
outsourced to a remote CSP, which may not be trustworthy, the data owners lose direct
control over their sensitive data. This lack of control raises new, formidable, and challenging
tasks related to data confidentiality and integrity protection in cloud computing. The
confidentiality issue can be handled by encrypting sensitive data before outsourcing it to
remote servers. Customers therefore crucially need strong evidence that the cloud servers
still possess their data and that it is not being tampered with or partially deleted over time.
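As a rough sketch of this idea (illustrative only; the class and helper names below are hypothetical, and plain AES-CBC is used here rather than the project's actual construction), the owner can bind a copy index into the encryption so that the copies are distinguishable ciphertexts of the same file:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class MultiCopyDemo {
    // Derive a per-copy IV from the copy index so every copy is a distinct
    // ciphertext of the same plaintext. Key handling is simplified for illustration.
    static byte[] ivForCopy(int copyIndex) throws Exception {
        byte[] d = MessageDigest.getInstance("SHA-256")
                .digest(("copy-" + copyIndex).getBytes(StandardCharsets.UTF_8));
        byte[] iv = new byte[16];
        System.arraycopy(d, 0, iv, 0, 16);
        return iv;
    }

    static byte[] crypt(int mode, byte[] key, int copyIndex, byte[] data) throws Exception {
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(mode, new SecretKeySpec(key, "AES"), new IvParameterSpec(ivForCopy(copyIndex)));
        return c.doFinal(data);
    }

    // Generate n distinguishable encrypted copies of one file.
    public static List<byte[]> generateCopies(byte[] key, byte[] file, int n) throws Exception {
        List<byte[]> copies = new ArrayList<>();
        for (int i = 0; i < n; i++) copies.add(crypt(Cipher.ENCRYPT_MODE, key, i, file));
        return copies;
    }

    // An authorized user holding the shared key decrypts any copy back to the file.
    public static byte[] decryptCopy(byte[] key, int copyIndex, byte[] ct) throws Exception {
        return crypt(Cipher.DECRYPT_MODE, key, copyIndex, ct);
    }
}
```

Because each copy uses a different IV, the CSP cannot pass off one stored copy as several, yet every copy decrypts to the same original file.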

Dept of CSE, RVCE 2017-18 19


Verifiable Multi-Copy Dynamic Data Tenancy in Public Cloud

PDP (provable data possession) is a technique for validating data integrity over remote
servers. In a typical PDP model, the data owner generates some metadata/information for a
data file to be used later for verification purposes through a challenge-response protocol with
the remote/cloud server. The owner sends the file to be stored on a remote server, which may
be untrusted, and deletes the local copy of the file. As proof that it still possesses the data file
in its original form, the server must correctly compute a response to a challenge vector sent
by a verifier, who can be the original data owner or a trusted entity that shares some
information with the owner.
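A deliberately simplified, hash-based sketch of this challenge-response pattern follows (the names are hypothetical, and real PDP schemes use homomorphic tags so that whole blocks need not be transferred):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Arrays;

public class PdpSketch {
    // Tag for block i: H(secret || i || block). The owner computes these before
    // outsourcing and keeps only the small tags locally as verification metadata.
    public static byte[] tag(byte[] secret, int i, byte[] block) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(secret);
        md.update(Integer.toString(i).getBytes(StandardCharsets.UTF_8));
        md.update(block);
        return md.digest();
    }

    // Verifier side: challenge a randomly chosen block index, receive the block
    // back from the server, and recompute the tag against the stored metadata.
    public static boolean verify(byte[] secret, int i, byte[] returnedBlock, byte[] storedTag)
            throws Exception {
        return Arrays.equals(tag(secret, i, returnedBlock), storedTag);
    }
}
```

Spot-checking random indices on each challenge makes it risky for a server to silently drop or tamper with blocks.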
3.3.2 System architecture
The figure 3.1 depicts the system architecture of the components being designed as part of the
internship project.

Figure 3.1. System Architecture


Data Owner
In this module, the data owner account is created by the cloud provider, which
allocates space for storing the owner's files. The data owner is responsible for browsing and
uploading files to the cloud server. The owner has a file consisting of blocks, and the CSP
offers to store multiple copies of the owner's file on different servers to prevent simultaneous
failure of all copies. The owner encrypts the data before outsourcing it to the CSP. After
outsourcing all copies of the file, the owner may interact with the CSP to perform block-level
operations on all copies; these operations include modifying, inserting, appending, and
deleting specific blocks of the outsourced data copies. The owner also interacts with the
authorized users to authenticate their identities and share the secret key.
Cloud Service Provider
The cloud server is responsible, on behalf of the file content provider, for allocating
the appropriate amount of resources in the cloud and reserving the time over which the
required resources are allocated. The CSP creates a virtual machine for each resource
requested by the user. If the size of the uploaded file is less than the allocated space, the CSP
allocates the space, receives the data from the owner, and stores it. Whenever the CSP gets a
request from an authorized user, it sends the data to that user. The cloud server stores all the
data owner information and all the authorized users' information, and it allows access to the
information through an IP network.
The CSP stores a number of copies of the owner's files depending on the nature of the
data; more copies are needed for critical data that cannot easily be reproduced and to achieve
a higher level of scalability. This critical data should be replicated on multiple servers across
multiple data centers.
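The space-allocation check described above can be sketched as follows (a hypothetical class, illustrative only; real provisioning would of course involve virtual machines and persistent storage):

```java
import java.util.HashMap;
import java.util.Map;

public class CspStorage {
    private final Map<String, Long> quota = new HashMap<>();  // owner -> allocated bytes
    private final Map<String, Long> used  = new HashMap<>();  // owner -> consumed bytes

    // Service a storage allocation request raised by a data owner.
    public void allocate(String owner, long bytes) {
        quota.merge(owner, bytes, Long::sum);
    }

    // Accept an upload only if the file fits within the owner's allocated space.
    public boolean store(String owner, long fileSize) {
        long free = quota.getOrDefault(owner, 0L) - used.getOrDefault(owner, 0L);
        if (fileSize > free) return false;     // maximum allocated space exceeded
        used.merge(owner, fileSize, Long::sum);
        return true;
    }
}
```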
Authorized User
In this module, an authorized user can download file content. Before downloading,
the user must first register; the user can then log in to the cloud and download files. The user
can also view the uploaded files and access them. An authorized user of the outsourced data
sends a data access request to the CSP and receives a file copy in encrypted form, which can
be decrypted using a secret key shared with the owner.


3.3.3 Use case diagram


A use case diagram in the Unified Modeling Language (UML) is a type of behavioral
diagram defined by and created from a use-case analysis. Its purpose is to present a graphical
overview of the functionality provided by a system in terms of actors, their goals (represented
as use cases), and any dependencies between those use cases. The main purpose of a use case
diagram is to show what system functions are performed for which actor; the roles of the
actors in the system can thus be depicted.

[Use case diagram. Actors: Data Owner, CSP, and Authorized Users. Data Owner: register,
login, purchase space, encrypt files and share decryption keys with users, upload files. CSP:
store the file, view memory details, store multiple copies of owner files, maintain records
about owner files, view user details; the MB-PMDDP scheme supports outsourcing of
dynamic data and provides proof that many copies of the owner files are present on the cloud
server. Authorized Users: register and request data files, decrypt the files using the
decryption key, access and view uploaded files.]
Figure 3.2. Use case diagram for project requirements


The figure 3.2 shows all the use cases that are identified as part of this project. Using the
identified use cases, the different actors and their operations are determined.


3.3.4 Flow-charts for different use cases

Figure 3.3. Flow chart for Data Owner

The figure 3.3 shows the flow chart for the data owner module. The data owner owns the
data that is stored on the cloud server. The data owner must register with the application and
then purchase storage from the CSP for storing files on the cloud. The secret keys are then
generated, and the files are encrypted and uploaded to the cloud server. The data owner can
then share the generated secret key with the data user.


[Flow chart: the data owner requests space; the CSP allocates space for the request; the owner
requests to upload the data files. If the uploading file size is less than the allocated space, the
CSP allocates the space, receives the data from the owner, stores it, and stores the number of
copies of the owner files; otherwise the CSP checks the IP and reports that the maximum
requested space is exceeded.]
Figure 3.4. Flow chart for CSP

The figure 3.4 shows the flow chart of the cloud service provider module. It provides the
interface for servicing the storage allocation request raised by the data owner, for receiving
data from the data owner and storing it, for accepting the challenge vector from the data
owner and responding with the metadata that proves data possession on the cloud storage,
and for dynamically updating the data stored on the cloud storage.


[Flow chart: the user registers and then logs in. If the username or password is wrong, the
user is logged out; otherwise the user views the uploaded files, requests the CSP for the
owner files, decrypts the files using the decryption keys, and accesses the data files.]
Figure 3.5. Flow chart for Data User


The figure 3.5 shows the flow chart of the data user module. This module provides the
registration and login functionalities for the data user, an interface for viewing the
information uploaded by the data owner using the shared key, and an interface for
downloading the information.


3.3.5 Software Environment


Java Technology
Java technology is both a programming language and a platform.

The Java Programming Language


The Java programming language is a high-level language that can be characterized by
all of the following buzzwords:
 Simple
 Architecture neutral
 Object oriented
 Portable
 Distributed
 High performance
 Interpreted
 Multithreaded
 Robust
 Dynamic
 Secure

With most programming languages, you either compile or interpret a program so that you can
run it on your computer. The Java programming language is unusual in that a program is both
compiled and interpreted. With the compiler, first you translate a program into an
intermediate language called Java byte codes —the platform-independent codes interpreted
by the interpreter on the Java platform. The interpreter parses and runs each Java byte code
instruction on the computer. Compilation happens just once; interpretation occurs each time
the program is executed. The following figure 3.6 illustrates how this works.
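The compile-then-interpret pipeline can even be driven programmatically with the JDK's own compiler API (a sketch; it assumes a full JDK, not just a JRE, is installed, and the class and method names are illustrative):

```java
import java.io.File;
import java.io.PrintWriter;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import javax.tools.ToolProvider;

public class CompileThenRun {
    // Compile a source file to byte codes (.class), then load and execute them on
    // the JVM, mimicking the two-stage javac/java pipeline described above.
    public static Object compileAndCall(File dir, String className, String source,
                                        String methodName) throws Exception {
        File src = new File(dir, className + ".java");
        try (PrintWriter out = new PrintWriter(src)) { out.print(source); }
        int rc = ToolProvider.getSystemJavaCompiler()
                .run(null, null, null, src.getPath());    // stage 1: compile to .class
        if (rc != 0) throw new IllegalStateException("compilation failed");
        try (URLClassLoader loader =
                     new URLClassLoader(new URL[]{ dir.toURI().toURL() })) {
            Method m = loader.loadClass(className).getMethod(methodName);
            return m.invoke(null);                        // stage 2: JVM runs byte codes
        }
    }
}
```

Compilation happens once and produces portable byte codes; the JVM then interprets (or JIT-compiles) them on every run.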


Figure 3.6. Java Program compilation process.


You can think of Java byte codes as the machine code instructions for the Java Virtual
Machine (Java VM). Every Java interpreter, whether it’s a development tool or a Web
browser that can run applets, is an implementation of the Java VM. Java byte codes help make
“write once, run anywhere” possible. You can compile your program into byte codes on any
platform that has a Java compiler. The byte codes can then be run on any implementation of
the Java VM. That means that as long as a computer has a Java VM, the same program
written in the Java programming language can run on Windows 2000, a Solaris workstation,
or on an iMac.

Figure 3.7. Java Program portability across different OS.


The Java Platform
A platform is the hardware or software environment in which a program runs. We’ve already
mentioned some of the most popular platforms like Windows 2000, Linux, Solaris, and
MacOS. Most platforms can be described as a combination of the operating system and
hardware. The Java platform differs from most other platforms in that it’s a software-only
platform that runs on top of other hardware-based platforms.
The Java platform has two components:


The Java Virtual Machine (Java VM)


The Java Application Programming Interface (Java API)
You’ve already been introduced to the Java VM. It’s the base for the Java platform and is
ported onto various hardware-based platforms [12].
The Java API is a large collection of ready-made software components that provide many
useful capabilities, such as graphical user interface (GUI) widgets. The Java API is grouped
into libraries of related classes and interfaces; these libraries are known as packages. The next
section, "What Can Java Technology Do?", highlights the functionality that some of the
packages in the Java API provide.
The following figure depicts a program that’s running on the Java platform. As the figure
shows, the Java API and the virtual machine insulate the program from the hardware.

Figure 3.8. Java Virtual Machine.


Native code is code that, once compiled, runs on a specific hardware platform. As a
platform-independent environment, the Java platform can be a bit slower than
native code. However, smart compilers, well-tuned interpreters, and just-in-time byte code
compilers can bring performance close to that of native code without threatening portability.

ODBC
Microsoft Open Database Connectivity (ODBC) is a standard programming interface for
application developers and database systems providers. Before ODBC became a de facto
standard for Windows programs to interface with database systems, programmers had to use
proprietary languages for each database they wanted to connect to. Now, ODBC has made the
choice of the database system almost irrelevant from a coding perspective, which is as it
should be. Application developers have much more important things to worry about than the
syntax that is needed to port their program from one database to another when business needs
suddenly change.


Through the ODBC Administrator in Control Panel, you can specify the particular database
that is associated with a data source that an ODBC application program is written to use.
Think of an ODBC data source as a door with a name on it. Each door will lead you to a
particular database. For example, the data source named Sales Figures might be a SQL Server
database, whereas the Accounts Payable data source could refer to an Access database. The
physical database referred to by a data source can reside anywhere on the LAN.
The ODBC system files are not installed on your system by Windows 95. Rather, they are
installed when you set up a separate database application, such as SQL Server Client or Visual
Basic 4.0. When the ODBC icon is installed in Control Panel, it uses a file called
ODBCINST.DLL. It is also possible to administer your ODBC data sources through a stand-
alone program called ODBCADM.EXE. There is a 16-bit and a 32-bit version of this program
and each maintains a separate list of ODBC data sources.
From a programming perspective [13], the beauty of ODBC is that the application can be
written to use the same set of function calls to interface with any data source, regardless of the
database vendor. The source code of the application doesn’t change whether it talks to Oracle
or SQL Server. We only mention these two as an example. There are ODBC drivers available
for several dozen popular database systems. Even Excel spreadsheets and plain text files can
be turned into data sources. The operating system uses the Registry information written by
ODBC Administrator to determine which low-level ODBC drivers are needed to talk to the
data source (such as the interface to Oracle or SQL Server). The loading of the ODBC drivers
is transparent to the ODBC application program. In a client/server environment, the ODBC
API even handles many of the network issues for the application programmer.
The advantages of this scheme are so numerous that you are probably thinking there must be
some catch. The only disadvantage of ODBC is that it isn’t as efficient as talking directly to
the native database interface. ODBC has had many detractors make the charge that it is too
slow. Microsoft has always claimed that the critical factor in performance is the quality of the
driver software that is used. In our humble opinion, this is true. The availability of good
ODBC drivers has improved a great deal recently. And anyway, the criticism about
performance is somewhat analogous to those who said that compilers would never match the
speed of pure assembly language. Maybe not, but the compiler (or ODBC) gives you the
opportunity to write cleaner programs, which means you finish sooner. Meanwhile, computers
get faster every year.

JDBC
In an effort to set an independent database standard API for Java, Sun Microsystems
developed Java Database Connectivity, or JDBC. JDBC offers a generic SQL database access
mechanism that provides a consistent interface to a variety of RDBMSs. This consistent
interface is achieved through the use of “plug-in” database connectivity modules, or drivers. If
a database vendor wishes to have JDBC support, he or she must provide the driver for each
platform that the database and Java run on.
To gain a wider acceptance of JDBC, Sun based JDBC’s framework on ODBC. As you
discovered earlier in this chapter, ODBC has widespread support on a variety of platforms.
Basing JDBC on ODBC will allow vendors to bring JDBC drivers to market much faster than
developing a completely new connectivity solution.
JDBC was announced in March of 1996. It was released for a 90-day public review that ended
June 8, 1996. Because of user input, the final JDBC v1.0 specification was released soon
after.
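The uniform JDBC calling pattern looks like this regardless of vendor (a sketch; the jdbc:somedb URL is hypothetical, and without a matching driver on the classpath DriverManager simply reports that no suitable driver was found):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class JdbcSketch {
    // Hypothetical URL: substitute your vendor's, e.g. an Oracle or SQL Server URL.
    static final String URL = "jdbc:somedb://localhost/sales";

    // The same code talks to any RDBMS; only the URL (and driver) changes.
    public static String connectAndQuery(String url, String sql) {
        try (Connection con = DriverManager.getConnection(url);
             PreparedStatement ps = con.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            return rs.next() ? rs.getString(1) : "(no rows)";
        } catch (SQLException e) {
            // With no registered driver for the URL, this reports "No suitable driver".
            return "failed: " + e.getMessage();
        }
    }
}
```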

Different Phases of Compilation


There are five phases in compiling and running a program:
Phase 1: Edit
We create the program in an editor; it is then stored on disk in a file whose name ends
with .java.
Phase 2: Compile (javac, the Java compiler)
The compiler translates the high-level language program into byte codes and stores them on
disk in a file whose name ends with .class.
Phase 3: Load
The class loader reads the byte codes from disk and puts them into primary memory.
Phase 4: Verify
The byte code verifier confirms that all byte codes are valid and do not violate Java's
security restrictions.
Phase 5: Execute
The Java Virtual Machine (JVM) reads and translates the byte codes into a language the
computer can understand (machine language), then executes the program, storing its values
in primary memory.

Compiling and Running a Java Program: A Two-Step Process


Develop Java Program
Compile Java Program from Command Prompt
Run Java Program from Command Prompt
Loading of Classes and Interfaces
Linking of Classes and Interfaces
Run Java Program - Initialization
Run Java Program - Invoking Class.main
JVM, JRE and JDK

Modern tool usage


The software team at Logic Mind Technologies has worked extensively on various flavors of
Unix and Windows based environments. A few of the Unix operating systems that have been
used by the organization are Sun Solaris and Linux.
The team at LMT has developed large-scale and complex applications on Oracle, SQL Server
and DB2. There is also substantial working expertise on MySQL and MS Access.

The web server experience extends to the following:


Apache on Unix and NT Servers
IIS on NT Platforms
J2EE Compliant web servers (JSP version 1.10 & Java Servlet version 2.2)
Netscape Server
Oracle Web Server

Operating Systems : OS/400, UNIX, Windows 8, Windows 3.11/8.1/10


Database Environment : DB2, Oracle, SQL Server, MS Access, MySQL, CICS
Languages : Java, EJB, XML, RMI, WAP, C/C++, CL/400, RPG


Web Enabled Systems : MS-IIS, Visual Interdev, Websphere, Weblogic


Front End Tools : Power Builder, VisualAge, Eclipse, NetBeans
Web Designing Tool : FrontPage 2000, Flash
Data and Object Modeling : Rational Rose

Our team members have extensive knowledge of Oracle products ranging from Oracle 7.3 to
10i, Developer 6i, Oracle 10iAS and other Oracle products.

3.3.6 Feasibility study and Testing


The first and foremost strategy for development of a project starts from the thought of
designing a mail-enabled platform for a small firm in which it is easy and convenient to send
and receive messages; there is a search engine and an address book, and some entertaining
games are also included. When it is approved by the organization and our project guide, the
first activity, i.e., preliminary investigation, begins. The activity has three parts:
 Request Clarification
 Feasibility Study
 Request Approval

REQUEST CLARIFICATION
After the request is approved by the organization and the project guide, and an
investigation is being considered, the project request must be examined to determine
precisely what the system requires. Our project is basically meant for users within the
company whose systems can be interconnected by a Local Area Network (LAN). In today's
busy schedule, people need everything to be provided in a ready-made manner; taking into
consideration the vast use of the net in day-to-day life, the corresponding development of the
portal came into existence.

FEASIBILITY ANALYSIS
An important outcome of the preliminary investigation is the determination that the system
request is feasible. This is possible only if it is feasible within the limited resources and time
available.


The different feasibilities that have to be analyzed are


 Operational Feasibility
 Economic Feasibility
 Technical Feasibility

Operational Feasibility
Operational feasibility deals with the study of the prospects of the system to be developed.
This system operationally eliminates all the tensions of the Admin and helps him effectively
track the project progress. This kind of automation will surely reduce the time and energy
previously consumed by manual work. Based on the study, the system is proved to be
operationally feasible.

Economic Feasibility
Economic feasibility, or cost-benefit analysis, is an assessment of the economic justification
for a computer-based project. As the hardware was installed from the beginning and serves
many purposes, the hardware cost of the project is low. Since the system is network based,
any number of employees connected to the LAN within the organization can use this tool at
any time. The Virtual Private Network is to be developed using the existing resources of the
organization, so the project is economically feasible.

Technical Feasibility
According to Roger S. Pressman, technical feasibility is the assessment of the technical
resources of the organization. The organization needs IBM-compatible machines with a
graphical web browser connected to the Internet and intranet. The system is developed for a
platform-independent environment. Java Server Pages, JavaScript, HTML, SQL Server and
WebLogic Server are used to develop the system. The technical feasibility study has been
carried out, and the system is technically feasible for development with the existing
facilities.
REQUEST APPROVAL
Not all requested projects are desirable or feasible. Some organizations receive so many
project requests from client users that only a few of them are pursued. However, those projects
that are both feasible and desirable should be put into the schedule. After a project request is
approved, its cost, priority, completion time, and personnel requirements are estimated and
used to determine where to add it to the project list. Only after approval of these factors can
development work be launched.

SYSTEM DESIGN AND DEVELOPMENT


INPUT DESIGN
Input design plays a vital role in the life cycle of software development and requires very
careful attention from developers. The goal of input design is to feed data to the application
as accurately as possible, so inputs are designed effectively to minimize the errors that occur
while feeding data. According to software engineering concepts, the input forms or screens
are designed with validation controls over the input limit, range, and other related
validations.

This system has input screens in almost all the modules. Error messages are developed to
alert the user whenever he commits a mistake and guide him in the right way so that invalid
entries are not made. Let us look at this more deeply under module design.

Input design is the process of converting user-created input into a computer-based format.
The goal of input design is to make data entry logical and free from errors; errors in the input
are controlled by the input design. The application has been developed in a user-friendly
manner. The forms have been designed in such a way that during processing the cursor is
placed in the position where data must be entered. The user is also provided with an option to
select an appropriate input from various alternatives related to the field in certain cases.
Validations are required for each data item entered. Whenever a user enters erroneous data,
an error message is displayed, and the user can move on to the subsequent pages only after
completing all the entries on the current page.

OUTPUT DESIGN
The output from the computer is required mainly to create an efficient method of
communication within the company, primarily between the project leader and his team
members, in other words, the administrator and the clients. The VPN system allows the
project leader to manage his clients in terms of creating new clients and assigning new
projects to them, maintaining a record of project validity, and providing folder-level access
to each client on the user side depending on the projects allotted to him.
After completion of a project, a new project may be assigned to the client. User
authentication procedures are maintained at the initial stages. A new user may be created by
the administrator himself, or a user can register himself as a new user, but the task of
assigning projects and validating a new user rests with the administrator only.

The application starts running when it is executed for the first time. The server has to be
started, and then Internet Explorer is used as the browser. The project runs on the local area
network, so the server machine serves as the administrator while the other connected systems
act as the clients. The developed system is highly user friendly and can be easily understood
by anyone using it, even for the first time.

SYSTEM TESTING
TESTING METHODOLOGIES
The following are the Testing Methodologies:
 Unit Testing.
 Integration Testing.
 User Acceptance Testing.
 Output Testing.
 Validation Testing.

Unit Testing
Unit testing focuses verification effort on the smallest unit of software design: the module.
Unit testing exercises specific paths in a module's control structure to ensure complete
coverage and maximum error detection. This test focuses on each module individually,
ensuring that it functions properly as a unit; hence the name unit testing.


During this testing, each module is tested individually, and the module interfaces are
verified for consistency with the design specification. All important processing paths are
tested for the expected results. All error-handling paths are also tested.
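As a small illustration (a hypothetical module, not project code), a unit test exercises the normal processing paths and the error-handling path of a single unit in isolation:

```java
public class UnitTestDemo {
    // Module under test: validates an uploaded file size against the allocated quota.
    static boolean fitsQuota(long fileSize, long quota) {
        if (fileSize < 0 || quota < 0) throw new IllegalArgumentException("negative size");
        return fileSize <= quota;
    }

    // A minimal unit test: cover the expected-result paths, then the error path.
    public static boolean runTests() {
        boolean ok = fitsQuota(10, 100) && !fitsQuota(200, 100) && fitsQuota(100, 100);
        try {
            fitsQuota(-1, 100);
            return false;               // error-handling path was not taken: test fails
        } catch (IllegalArgumentException expected) {
            return ok;                  // all paths behaved as the specification dictates
        }
    }
}
```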
Integration Testing
Integration testing addresses the issues associated with the dual problems of
verification and program construction. After the software has been integrated, a set of high-
order tests is conducted. The main objective of this testing process is to take unit-tested
modules and build a program structure that has been dictated by the design.

The following are the types of Integration Testing:


Top Down Integration
This method is an incremental approach to the construction of the program structure.
Modules are integrated by moving downward through the control hierarchy, beginning with
the main program module. The modules subordinate to the main program module are
incorporated into the structure in either a depth-first or breadth-first manner.
In this method, the software is tested from the main module, and individual stubs are
replaced as the test proceeds downwards.

Bottom-up Integration
This method begins construction and testing with the modules at the lowest level
in the program structure. Since the modules are integrated from the bottom up, the processing
required for modules subordinate to a given level is always available, and the need for stubs
is eliminated. The bottom-up integration strategy may be implemented with the following
steps:

 The low-level modules are combined into clusters that perform a specific
software sub-function.
 A driver (i.e., a control program for testing) is written to coordinate test case input
and output.
 The cluster is tested.
 Drivers are removed and clusters are combined, moving upward in the program
structure.


The bottom-up approach tests each module individually; each module is then integrated with
a main module and tested for functionality.

User Acceptance Testing


User Acceptance of a system is the key factor for the success of any system. The
system under consideration is tested for user acceptance by constantly keeping in touch with
the prospective system users at the time of developing and making changes wherever
required. The system developed provides a friendly user interface that can easily be
understood even by a person who is new to the system.

Output Testing
After performing validation testing, the next step is output testing of the proposed
system, since no system can be useful if it does not produce the required output in the
specified format. The outputs generated or displayed by the system under consideration are
tested by asking the users about the format they require. Hence the output format is
considered in two ways: one on screen and the other in printed format.

Validation Checking
Validation checks are performed on the following fields.

Text Field:
A text field can contain only a number of characters less than or equal to its size. The text
fields are alphanumeric in some tables and alphabetic in others. An incorrect entry always
flashes an error message.
Numeric Field:
A numeric field can contain only the digits 0 to 9; an entry of any other character
flashes an error message. The individual modules are checked for accuracy against what they
have to perform. Each module is subjected to a test run along with sample data. The
individually tested modules are then integrated into a single system. Testing involves
executing the program with real data; the existence of any program defect is inferred from
the output. Testing should be planned so that all the requirements are individually tested.
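The text-field and numeric-field checks described above can be sketched with regular expressions (a hypothetical class, illustrative only):

```java
import java.util.regex.Pattern;

public class FieldValidator {
    private static final Pattern NUMERIC = Pattern.compile("[0-9]+");
    private static final Pattern ALPHABETIC = Pattern.compile("[A-Za-z]+");

    // Text field: bounded length; alphabetic-only here (alphanumeric in other tables).
    public static boolean validTextField(String s, int maxLen) {
        return s != null && s.length() <= maxLen && ALPHABETIC.matcher(s).matches();
    }

    // Numeric field: digits 0 to 9 only; any other character should flash an error.
    public static boolean validNumericField(String s) {
        return s != null && NUMERIC.matcher(s).matches();
    }
}
```

On a failed check, the form would display the error message and keep the cursor in the offending field.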


A successful test is one that brings out the defects for inappropriate data and
produces output revealing the errors in the system.

Preparation of Test Data


The above testing is done by taking various kinds of test data. Preparation of test data plays
a vital role in system testing. After preparing the test data, the system under study is tested
using that data. While testing the system with the test data, errors are again uncovered and
corrected using the above testing steps, and the corrections are noted for future use.

Using Live Test Data:


Live test data are those that are actually extracted from organization files. After a
system is partially constructed, programmers or analysts often ask users to key in a set of data
from their normal activities. Then, the systems person uses this data as a way to partially test
the system. In other instances, programmers or analysts extract a set of live data from the files
and have them entered themselves.
It is difficult to obtain live data in sufficient amounts to conduct extensive testing.
And, although it is realistic data that will show how the system will perform for the typical
processing requirement, assuming that the live data entered are in fact typical, such data
generally will not test all combinations or formats that can enter the system. This bias toward
typical values then does not provide a true systems test and in fact ignores the cases most
likely to cause system failure.

Using Artificial Test Data:


Artificial test data are created solely for test purposes, since they can be generated to
test all combinations of formats and values. In other words, the artificial data, which can
quickly be prepared by a data-generating utility program in the information systems
department, make possible the testing of all logic and control paths through the program.
The most effective test programs use artificial test data generated by persons other
than those who wrote the programs. Often, an independent team of testers formulates a testing
plan using the system specifications. The package "Verifiable Multi-Copy Dynamic Data
Tenancy in Public Cloud" has satisfied all the requirements specified in the software
requirement specification and was accepted.

USER TRAINING
Whenever a new system is developed, user training is required to educate them about
the working of the system so that it can be put to efficient use by those for whom the system
has been primarily designed. For this purpose, the normal working of the project was
demonstrated to the prospective users. Its working is easily understandable and since the
expected users are people who have good knowledge of computers, the use of this system is
very easy.

MAINTENANCE
This covers a wide range of activities including correcting code and design errors. To
reduce the need for maintenance in the long run, we have more accurately defined the user’s
requirements during the process of system development. Depending on the requirements, this
system has been developed to satisfy the needs to the largest possible extent. With
development in technology, it may be possible to add many more features based on the
requirements in future. The coding and design are simple and easy to understand, which will
make maintenance easier.

TESTING STRATEGY:
A strategy for system testing integrates system test cases and design techniques into a well-
planned series of steps that results in the successful construction of software. The testing
strategy must incorporate test planning, test case design, test execution, and the resultant data
collection and evaluation. A strategy for software testing must accommodate low-level tests
that are necessary to verify that a small source code segment has been correctly
implemented, as well as high-level tests that validate major system functions against user
requirements.

Software testing is a critical element of software quality assurance and represents the
ultimate review of specification, design and coding. Testing presents an interesting anomaly
for software: it is a destructive rather than a constructive activity. Thus, a series of tests is
performed on the proposed system before the system is ready for user acceptance testing.

SYSTEM TESTING:
Software, once validated, must be combined with other system elements (e.g. hardware,
people, databases). System testing verifies that all the elements mesh properly and that overall
system function and performance are achieved. It also aims to find discrepancies between the
system and its original objectives, current specifications and system documentation.

UNIT TESTING:
In unit testing, different modules are tested against the specifications produced during the
design of the modules. Unit testing is essential for verification of the code produced during
the coding phase; hence the goal is to test the internal logic of the modules. Using the
detailed design description as a guide, important control paths are tested to uncover errors
within the boundary of the modules. This testing is carried out during the programming stage
itself. In this testing step, each module was found to be working satisfactorily with regard
to the expected output from the module.
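As an illustration of testing a module in isolation against its specification, the sketch below uses a toy XOR cipher as a hypothetical stand-in for a module-level routine (it is not the project's real encryption module); the test verifies only the module's own contract, namely that decryption inverts encryption.

```python
# Minimal unit-test sketch: the module is tested in isolation against
# its specification -- decryption must invert encryption.
import unittest

def xor_cipher(data: bytes, key: int) -> bytes:
    """Toy symmetric transform standing in for a real module routine."""
    return bytes(b ^ key for b in data)

class CipherModuleTest(unittest.TestCase):
    def test_round_trip(self):
        plain = b"owner file block"
        self.assertEqual(xor_cipher(xor_cipher(plain, 0x5A), 0x5A), plain)

    def test_output_differs_from_input(self):
        self.assertNotEqual(xor_cipher(b"data", 0x5A), b"data")

# Run the module's tests programmatically and report the outcome.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(CipherModuleTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

A failing assertion here would point to an error inside the module boundary, which is exactly what this testing stage is meant to uncover.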
In due course, the latest technology advancements will be taken into consideration. As
part of the technical build-up, many components of the networking system will be generic in
nature so that future projects can either use or interact with them. The future holds a lot to
offer to the development and refinement of this project.

Integration testing
Integration tests are designed to test integrated software components to determine if
they actually run as one program. Testing is event driven and is more concerned with the
basic outcome of screens or fields. Integration tests demonstrate that although the components
were individually satisfactory, as shown by successful unit testing, the combination of
components is correct and consistent. Integration testing is specifically aimed at exposing the
problems that arise from the combination of components.

System Test
System testing ensures that the entire integrated software system meets requirements. It
tests a configuration to ensure known and predictable results. An example of system testing is
the configuration-oriented system integration test. System testing is based on process
descriptions and flows, emphasizing pre-driven process links and integration points.

White Box Testing


White box testing is testing in which the software tester has knowledge of
the inner workings, structure and language of the software, or at least its purpose. It is
used to test areas that cannot be reached from a black-box level.

Black Box Testing


Black box testing is testing the software without any knowledge of the inner workings,
structure or language of the module being tested. Black box tests, like most other kinds of
tests, must be written from a definitive source document, such as a specification or
requirements document. It is testing in which the software under test is treated as a black
box: you cannot “see” into it. The test provides inputs and responds to outputs without
considering how the software works.
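The black-box idea can be sketched as follows. The `upload_file` routine here is a hypothetical stand-in for a real operation; the test knows only the documented behaviour ("upload returns a confirmation containing the file name, and an empty name is rejected"), never the implementation.

```python
# Black-box style check: inputs and expected outputs are taken from the
# specification alone; the internals of upload_file are never inspected.
def upload_file(name: str) -> str:
    # Stand-in implementation; a real system would contact the cloud server.
    if not name:
        raise ValueError("file name required")
    return f"uploaded:{name}"

# Valid input: the documented confirmation is returned.
assert upload_file("report.pdf") == "uploaded:report.pdf"

# Invalid input: the documented rejection occurs.
rejected = False
try:
    upload_file("")
except ValueError:
    rejected = True
assert rejected
print("black-box checks passed")
```

The same test would remain valid if the internals were rewritten, which is precisely the property that distinguishes black-box from white-box testing.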

3.3.7 Sequence diagram

[Sequence diagram: the participants are Data Owner, CSP, Cloud Server and Data User.
The data owner registers with the CSP, which stores the owner's info and confirms
registration; the owner then logs in, purchases space and receives a space-allocation
confirmation. The owner encrypts data and uploads data files; the cloud server stores
multiple copies of the owner's files and sends a file upload confirmation. The data user
registers, the CSP stores the authorized user's info and confirms registration; the user
issues a data file request, the owner shares decryption keys with the user for data
decryption, the request is confirmed and access information is returned. The owner can
also view uploaded files and delete files, receiving a deletion confirmation.]

Figure 3.9. Sequence diagram of data owner.


The figure 3.9 is the sequence diagram of the data owner module. It shows the interaction
of the data owner module with the other modules of the application.
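The central step of the flow in figure 3.9 can be sketched in code. The class and the toy XOR "encryption" below are illustrative stand-ins, not the project's actual implementation: the owner encrypts a file, and the cloud server stores multiple copies of the ciphertext and returns an upload confirmation.

```python
# Hedged sketch of the owner-to-server interaction: encrypt a file,
# store several copies of the ciphertext, confirm the upload.
def encrypt(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

class CloudServer:
    def __init__(self, copies: int):
        self.copies = copies
        self.store = {}          # file name -> list of stored copies

    def upload(self, name: str, ciphertext: bytes) -> str:
        # "Store multiple copies of owner files", as in the diagram.
        self.store[name] = [ciphertext] * self.copies
        return f"File Upload Confirmation: {name}"

server = CloudServer(copies=3)
confirmation = server.upload("report.txt", encrypt(b"owner data", 0x2F))
print(confirmation, "- copies stored:", len(server.store["report.txt"]))
```

Because the server holds only ciphertext, the decryption keys can be shared separately with an authorized data user, matching the key-sharing message in the diagram.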

3.3.8 Screen shots

Figure 3.10. Main Page


The figure 3.10 shows the snapshot of the Main page of the application.

Figure 3.11. Global User Operations Frame


The figure 3.11 provides the snapshot of Global User operations frame of the application.

Figure 3.12. Different User Registration /Login Frames


The figure 3.12 provides the snapshot of the different types of users' registration and login dialogs.

Figure 3.13. Data Owner Registration/Login panel


The figure 3.13 provides the snapshot of the data owner module's registration/login panel.

Figure 3.14. Selection of Cloud Server.


The figure 3.14 provides snapshot of the selection of cloud controller IP.

Figure 3.15. Data Owner Main Frame


The figure 3.15 provides snapshot of Data owner main frame. The frame has all the operations
that can be performed by the data owner.

Figure 3.16. Selecting files from disk for upload


The figure 3.16 provides the snapshot of browsing the local disk and selecting a file for
upload to the cloud.

Figure 3.17. Viewing Files.

The figure 3.17 provides the snapshot of viewing a file that is encrypted upon selection,
before it is uploaded to the cloud.

3.4 Background study of Technical reports, books, journal papers


The following Table 3.1 lists the materials used to enrich my knowledge to
accomplish the assigned tasks.
PAPERS
1. "Provable Data Possession at Untrusted Stores" by Giuseppe Ateniese, Randal Burns,
Reza Curtmola, Joseph Herring, Lea Kissner, Zachary Peterson and Dawn Song:
explains the provable data possession technique that can be used for trusting the data
on an untrusted server.
2. "Publicly Verifiable Remote Data Integrity" by K. Zeng: explains the provable data
possession of multiple copies on cloud servers.
3. "Demonstrating Data Possession and Uncheatable Data Transfer" by D. L. G. Filho
and P. S. L. M. Barreto: describes cryptographic protocols that use hash functions for
verifying the integrity of data.
4. "Provable Multicopy Dynamic Data Possession in Cloud Computing Systems" by
Ayad F. Barsoum and M. Anwar Hasan: discusses the map-based provable data
possession of multiple copies of dynamic data.
5. "Problem Analysis and Structure" by Michael Jackson: presents an approach to
problem analysis in which problems are decomposed into subproblems of recognised
classes.
6. "Problem Analysis Techniques" by Derrick Brown and Jan Kusiak: an extract from
IRM's training material that looks at how a structured approach to defining and
analysing problems can be used as the basis for designing better solutions.
7. "Problem Identification and Decomposition within the Requirements Generation
Process" by Ahmed S. Sidky, Rajat R. Sud, Shishir Bhatia and James D. Arthur:
presents a process by which the identified problem and its characteristics are
decomposed and translated into a set of user needs that provide the basis for the
solution description, i.e., the set of requirements.

JOURNALS
1. OPEN DATA CENTER ALLIANCE Best Practices: Architecting Cloud-Aware
Applications Rev. 1.0: offers a perspective for developers focused on advancing
application architectures and development practices.
2. Cloud Customer Architecture for Mobile: describes vendor-neutral best practices for
hosting the services and components required to support mobile apps using cloud
computing.
3. Creating Applications for Digital Transformation: a white paper that explains how
Software AG's Digital Business Platform provides the foundation for next-generation
applications.

Table 3.1. List of references.

3.5 Participation in organizational meetings


Co-ordination within the team has a great impact on successful task completion in the
expected time period. As a new member of the team, one requires a lot of co-ordination in all
respects with the team members.
Communication is the purposeful activity of information exchange between two or more
participants. Good communication skills are essential in professional life, as is the ability to
communicate information clearly, accurately and as intended. Effective communication is
characterized by a number of aspects: the vocabulary and language used should be tailored
to the audience; maintaining good eye contact; body language; and presenting ideas
appropriately and precisely.
Following are some of the specific non-technical skills, which helped to improve co-
ordination and communication skills.
Participation in Meeting
There will be periodic meetings, daily or weekly, based on the need for good progress.
Participation in these meetings helps to align individual work and to prioritize the
requirements. The discussions during these meetings give a clear idea of the expected end
results. Obstacles to work progress can be shared and resolved through such discussions.
Verbal and Written Communication
Even though it is infrequent that one presents his or her ideas to others, there will probably
be times when one needs to present information to a group of people. As a part of the
industrial training there were weekly presentations on the research carried out in HP R&D.
These presentations helped to improve verbal as well as written communication. Another
type of written communication worth mentioning is e-mail, which is necessary to
communicate with managers and team members. Also, Lync messenger is used to
communicate and resolve the technical problems that arise. To use the various
communication tools, one requires precise written communication skills.

3.6 Time management


Time is the most valuable resource in project management. Every deliverable during project
development is time-bound. Inefficient time management will lead the project towards
failure. It is the responsibility of the manager to prioritize and schedule the tasks for the
team. Certain milestones and reviews will be planned, according to which the team has to
align its work and deliver the modules.
The login and logout time of each employee is managed through Radio Frequency
Identification (RFID) enabled identity cards. The following steps are followed for time
management:
 Defining Activities
 Sequencing Activities

 Resource estimation for activities


 Duration and Effort Estimation
 Schedule Preparation
 Schedule Control

3.7 Resource utilization


The resources could be the company's products such as PCs, laptops, headphones, etc. A
travelling facility, cafeteria, gym, in-house primary health-care center, stationery shop,
Xerox and printing center, library, reception and ATM machines are some of the other
resources in the form of facilities.
Projectors and conference rooms on every floor help with periodic meetings. Online Skype
and telephone facilities help to connect with people. The major role in managing all these
resources is played by the Human Resource Management department in collaboration with
the admins of each wing.

3.8 Interpersonal skills and initiatives taken


These are the skills used when one engages in face-to-face communication with one or more
other people. Good listening is important in improving interpersonal skills. It enables us to
work more effectively in a team. It also improves problem-solving and decision-making
skills. Interactions with mentors, the manager and other interns in the team improved these
interpersonal skills.

Chapter 4

Reflection Notes
The following section describes the internship experience and assessment, the specific
technical outcomes of the internship, and the specific non-technical outcomes.

4.1 Experience regarding your internship and assessment


The entire journey of sixteen weeks provided exposure to different dimensions of the IT
profession such as problem identification, analysis, design, self-learning, development using
appropriate technology, use of modern tools and communication skills.
This experience bridged the gap between academics and the actual practices used
in industry. It also helped to gauge the knowledge and skills acquired so far. Further, this
opportunity motivated gaining the skills and technical concepts needed to align with current
trends and expectations in industry.

4.2 List of specific technical outcomes of the internship


Following are the important technical outcomes of the internship:
 Usage of modern tools for development [14].
 Deciding on the appropriate technology to implement the solution designed for the
problem identified.
 Learnt the design and organization of different modules of operating system.
 Acquired considerable knowledge on compilers.
 Importance of using open-source tools [14].
 Learnt the art of fixing the technical issues during development process.

4.3 List of specific non-technical outcomes


The overall communication skills such as: verbal communication, written
communication, presentation skills, and inter-personal skills were improved.

Improvement in Communication Skills


The weekly presentations had a great impact on improving communication skills.
Also, the participation in team meetings helped to give precise opinions in the ongoing
discussions. It also helped to improve listening and analysis ability.

Personality Development
The different tasks assigned and the strategy to accomplish those tasks had a good
impact on overall personality development. The technical skills acquired helped to choose an
appropriate resource to refer for task completion. Further, the non-technical skills acquired
helped to exchange the ideas and resolve the technical issues during development.

Time Management
The tasks assigned during industrial training were planned according to the priorities
and dependencies. This planning was solely dependent on the intern based on the interactions
with mentors and manager. Time is one of the important resources to be managed properly to
excel in the work environment and deliver the time-bound tasks to co-ordinate with the team.
Time management is also a skill, which can be improved through experience.

Resource Utilization Skills


Utilizing resources such as the personal computer, printers and software tools required to
carry out the assigned tasks is also a skill. The organization should have appropriate
strategies to ensure that the right resources are working on the right projects, based on real-
time project timelines as a project evolves.
LMT employees can use the internal portal to request the resources that are necessary to
accomplish a given task. Every request is monitored systematically by raising tickets. The
ticket is used as the reference for communication until the request is served.
