INTRODUCTION
Cloud computing has evolved into a technology solution for both individual and organizational use. It offers a novel way to deliver diverse services while considerably altering the cost structure underlying those services. This new technique and its pricing structure change the way businesses operate. It embraces features of traditional computing technologies such as distributed, parallel, and grid computing. Numerous individuals and organizations consume diverse cloud services over the internet. These services fall into three categories: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
In the 1990s, the telecommunication industry, which had earlier offered point-to-point data communication, began offering virtual private networks with lower cost and better quality of service. With the introduction of high-powered computing systems into the market, researchers and scientists began utilizing these systems through time sharing. They experimented with several algorithms to enhance the infrastructure, platform, and applications, to prioritize CPUs, and to improve efficiency for end users.
Later, in the 2000s, cloud computing technology became a reality. In early 2008, the open-source tool OpenNebula was introduced for setting up private and hybrid clouds. In mid-2008, Gartner identified an opportunity for increased relationships among customers across IT sectors and found that enterprises were moving from company-owned infrastructure to per-usage service models. This evolution still continues; plenty of issues remain, and researchers are working toward solutions to overcome them.
1.3.2 Broad network access: Resources are accessed by tablets, PCs, and smartphones through internetwork technologies, which allows heterogeneous client devices to be integrated into the network. These resources are also manageable from a wide range of locations that offer online network access. Enterprises that rely on broad network access within a cloud network need to deal with the security concerns that arise. This is a contested topic because it touches the heart of the distinction between private and public cloud computing. Enterprises frequently select a private cloud service because they worry about the possibility of information leaking through the openings a public cloud leaves to external networks.
1.3.3 Resource pooling: The cloud service provider's computing infrastructure is dynamically pooled to serve several consumers through a multi-tenant model, with diverse physical and virtual resources dynamically allocated and reallocated according to consumer demand. Services that lend themselves to a resource-pooling approach include data storage, processing, and bandwidth. It is very difficult to provision and de-provision dynamic computing resources for customers efficiently without compromising the security and reliability of the system.
1.3.5 Measured service: A cloud service provider can control and optimize consumers' resource use by applying metering at a level of abstraction appropriate to the specific type of service, such as computing, bandwidth, or storage. However, building a system capable of monitoring resource utilization and producing granular reports remains a challenging problem.
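The metering idea above can be sketched in a few lines: sample per-consumer usage of each service and roll it up into a billable total. The service names and unit rates here are hypothetical, not taken from any provider's actual pricing.

```python
# Illustrative metering sketch: usage is recorded per service type at its
# own level of abstraction, then converted to a charge. Rates are invented.
RATES = {"compute_hours": 0.05, "storage_gb": 0.02, "bandwidth_gb": 0.01}

def bill(usage: dict) -> float:
    """Sum metered usage multiplied by each service's unit rate."""
    return sum(RATES[service] * amount for service, amount in usage.items())

report = {"compute_hours": 100, "storage_gb": 50, "bandwidth_gb": 20}
print(round(bill(report), 2))  # 6.2
```

A real measured-service implementation differs mainly in the monitoring side: collecting this usage data accurately across tenants is the hard part the text refers to.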
An IaaS model gives cloud consumers the ability to provision computing and storage resources on demand. The consumer gains the capability to deploy and run software comprising an operating system and other applications. The consumer is responsible for managing the underlying OS, developed applications, data storage, and some selected network resources; however, they cannot control the cloud infrastructure itself. Cloud service providers bill IaaS consumers based on the number of resources allocated and consumed. The resources assigned are virtualized and need to be properly managed; vulnerabilities and risks related to virtualization techniques therefore affect the IaaS model.
The PaaS model provides an application development platform and solution stack as a service to consumers. Consumers can develop their applications without purchasing and managing the hardware and software required for application development. The PaaS model delivers full life-cycle support for providing applications and services. The consumer retains control over the deployed applications and the configuration of the application-hosting environment, though they cannot manage or control the underlying cloud infrastructure, network, servers, operating systems, or storage. Authentication and authorization issues and data storage security are the most important security considerations for the PaaS model.
The SaaS model lets cloud consumers access applications hosted by cloud service providers. This frees cloud users from installing and maintaining the application on their own systems. The applications are mostly accessed through thin clients such as web browsers. The consumer controls only their application configuration settings; the required cloud resources are maintained and controlled by the cloud service providers. Business applications for accounting, invoicing, collaboration, and employee management commonly use the SaaS delivery model, so particular attention must be paid to the access control and identity management of enterprise applications deployed in the cloud. SaaS consumers are charged on a monthly or yearly usage basis. These cloud services can be accessed by any cloud client connected to the internet. Regardless of the service model, four deployment models exist in cloud computing: private cloud, public cloud, hybrid cloud, and community cloud. Each of these models has unique features and characteristics that meet specific cloud user requirements.
security policies, standards, and regulatory compliance. Corporations such as HP, IBM, Cisco, VMware, and EMC are the significant players in the private cloud model.
In the public cloud deployment model, cloud resources such as applications, storage, and computing resources are offered by public vendors to both large corporate customers and individual users. These resources are managed by a third-party cloud service provider who is in charge of the public cloud service offering. Consequently, the public cloud consumer has less control over the physical and logical security aspects of public cloud resources. Amazon Web Services, Microsoft Azure, and IBM's Blue Cloud are examples of public cloud service providers. Velte et al., (2009).
In the community cloud, resources are shared by multiple organizations whose user communities have a similar nature of work and common concerns, such as security policy and compliance requirements. The cloud resources are managed by the organizations themselves or by a third-party vendor. It offers the benefits of the private cloud without its high investment costs.
In the hybrid cloud deployment model, the cloud resources are a combination of private, public, and community clouds. This model offers the prospect of storing sensitive information in a private cloud and non-sensitive information in a public cloud. It helps provide varying levels of security, control, and scalability support to cloud consumers.
Figure 1.1 A cloud model representing three cloud services and four deployment models

Figure 1.1 shows the cloud computing model with the three cloud delivery services and four deployment models. Services offered over the different cloud models are still growing, and obstacles are being overcome. Among the diverse services offered across the deployment models, data storage stands out as one of the most fundamental services offered by numerous cloud service providers. However, such service providers cannot necessarily be trusted to ensure the confidentiality of organizations' and individual users' data.
Figure 1.2 Security concerns at the various levels of the cloud computing model

Figure 1.2 indicates security in depth at the different levels of cloud computing Fortiş et al., (2015). This layered approach provides a way to increase the survivability of a cloud environment in the presence of various attacks.

With respect to network-level security, the threats are greater in public clouds than in private clouds. Since private cloud resources reside within the organization's boundaries, the customer has more control over them. In public clouds, however, ensuring appropriate access control, ensuring the confidentiality and integrity of consumer data in transit, and keeping internet resources available are the most important threats to address in safeguarding network-level security. Subashini and Kavitha (2011).
1.7 HOST LEVEL SECURITY THREATS AND CHALLENGES

Host-level security concerns are those that affect the host resources when a host associates itself with the cloud environment. Security issues at the host level can be considered from the perspective of the diverse service delivery models and deployment models. Mather et al., (2009)

Threats and challenges specific to a cloud environment at the host level are closely related to virtualization vulnerabilities, such as VM escape and hypervisor threats triggered in a public cloud environment.

Most organizational and academic customers are keen to deploy their applications to a cloud model in order to save money and to raise the efficiency and reliability of their applications.
However, inefficient access control over networking resources, servers, audit log access, and patch management makes cloud applications more vulnerable to numerous security threats. A web-based application developed and deployed in a private cloud must be secured from outside hackers by implementing appropriate access control at the network and host levels. A web-based application deployed in a public cloud must follow a secure Software Development Life Cycle (SDLC), and its APIs need to be carefully verified for security.
All types of service delivery models require security at the data level. The main aspects of data security embrace data-in-transit, data-at-rest, data processing, data lineage, data provenance, and data remanence. Mather et al., (2009)
Data stored on a storage medium is considered "data-at-rest". This data can be protected using strong encryption methods. However, in the cloud computing environment, encrypting data-at-rest for applications hosted in the cloud is not always possible, since encryption might thwart indexing and data search.
Even if the data is encrypted during transmission and at rest in the cloud service provider's database, it must be decrypted before it is processed. Although algorithms such as homomorphic encryption are designed to support computation over the ciphertext itself, they reduce system performance due to their computational complexity.
Data lineage is a technique of tracing the data path in order to see when and where the data was placed across the cloud service provider's locations, and it is essential for data auditing. However, finding the precise data path is not practically possible in a public cloud.
Data provenance is a technique of verifying the integrity of data and certifying the accuracy of computations on that data. Data provenance is difficult with shared resources that are used by several users in a cloud environment.
Data remanence is the residual representation of data that persists even after efforts have been made to delete it. This residue arises when data is left behind by normal file deletion or when a duplicate copy resides on another server. It may leak sensitive data to illegitimate users. Bloomberg (2011)
Bring Your Own Device, or BYOD, is a mainstream practice in business today, enabling a flexible, adaptable enterprise, and organizations have recognized the importance of using this technology effectively yet safely. A recent disclosure regarding Dropbox (n.d.) and similar services in BYOD settings is that files deleted from mobile devices are not always truly gone. Researchers found that documents, audio files, images, and more could be recovered even though they were thought to be permanently erased both from the device and from the cloud. Another vexing discovery was that metadata, such as user activity history, could also be uncovered with a little digging.
Beyond these findings, there have always been concerns about hackers, and about the fact that consumer data is stored on servers shared with other clients of these "secure cloud storage" companies. Encryption is the providers' answer to this charge, yet even that is not foolproof. There are simply too many opportunities for consumer data to leak when users choose one of these public-server cloud storage options.
When using a cloud storage service, consumers have no control over where the cloud service providers store their data. The providers own the servers and will distribute the consumers' files however is convenient for them, which can be a major security risk.
Another issue recently discovered with some of the popular cloud storage companies is a flaw in encryption protection. Many organizations use the cloud not only for secure storage but also as a method for safe sharing and collaboration. It was found, however, that when data is shared between two or more clients in the cloud, it is vulnerable to attack by employees of the cloud storage company itself. They can use a fake key to unlock the data when it is sent for sharing and view it before re-encrypting it and forwarding it to the intended viewer. While no real instances of this have been found so far, the possibility is troubling. Secure cloud storage companies can no longer boast of a "zero-knowledge environment" for consumer data. Consumers must simply trust that the cloud service will not look into their files. This is another inherent threat of public-server cloud storage.
The cloud framework related to data access, as shown in Figure 1.3, is mainly concerned with security issues such as the confidentiality, integrity, and availability of consumer data.

To ensure the confidentiality of the data, the leading line of defense for any cloud framework is encryption. Encryption routines use complex calculations to conceal cloud-protected data. However, applying encryption routines alone is not an adequate solution for enforcing an organization's complex hierarchical structure. Pearson, (2009).
Similarly, ensuring the integrity of data outsourced to the cloud can be accomplished by implementing a suitable data auditing framework that uses a trusted third-party entity.
1.10.1 Mandatory Access Control (MAC)
Top Secret is the highest level of classified information; Secret information may cause "serious damage" to national security if it were made publicly available, and a Confidential document would cause damage to national security if it were made publicly available.

Based on Table 1.1, a Multi-Level Security (MLS) system allows a subject to access an object if the subject's classification is greater than or equal to the object's classification.

For example, a subject with Secret classification is capable of reading and writing Unclassified, Confidential, and Secret documents, but not Top Secret documents.
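The MLS rule described above can be expressed as a small sketch: a subject may access an object when the subject's classification dominates the object's. The level ordering follows the classification hierarchy discussed here; the code itself is purely illustrative.

```python
# MLS dominance check: a subject may access an object when the subject's
# classification level is greater than or equal to the object's.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def mls_allows(subject_level: str, object_level: str) -> bool:
    """Return True when the subject's classification dominates the object's."""
    return LEVELS[subject_level] >= LEVELS[object_level]

# A Secret-cleared subject can access Confidential documents...
print(mls_allows("Secret", "Confidential"))  # True
# ...but not Top Secret ones.
print(mls_allows("Secret", "Top Secret"))    # False
```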
1.10.2 Discretionary Access Control
Table 1.2 An access control matrix for sample files in a Linux system
As shown in Eq. 1.1, let S be the set of subjects, O the set of objects, and OP the set of operations, and let P(s,o) ⊆ OP denote the set of permissions that subject s ∈ S holds over object o ∈ O. The access control matrix is then the family of these permission sets:

ACM = { P(s,o) ⊆ OP : s ∈ S, o ∈ O }        (1.1)
Even though the access control matrix is considered a good conceptual model, its direct implementation requires excessive memory. The Access Control List (ACL) and the capability-based approach are the two practical implementations derived from an access control matrix.
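One way to picture Eq. 1.1 concretely is as a nested mapping from (subject, object) pairs to permission sets. The subjects, objects, and permissions below are illustrative, not from any specific system.

```python
# Access control matrix as a nested dict: ACM[s][o] is the permission set
# P(s,o) for subject s over object o.
ACM = {
    "John":  {"file1": {"read"}},
    "Steve": {"file1": {"read", "write", "execute"}},
    "Bob":   {"file1": {"read", "execute"}},
}

def permitted(subject: str, obj: str, op: str) -> bool:
    """Check whether op is in P(s,o) for the given subject and object."""
    return op in ACM.get(subject, {}).get(obj, set())

print(permitted("Steve", "file1", "write"))  # True
print(permitted("John", "file1", "write"))   # False
```

Storing the full matrix like this is exactly what becomes expensive at scale, which motivates the ACL (per-column) and capability (per-row) implementations.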
The access control list is a method that views the access control matrix from the column perspective. It specifies the set of operations that different subjects are allowed to perform on a particular object. Each ACL entry states a subject and an operation permitted on the object.
For example, the set of operations for the File1 object is denoted as:

ACL_F1 = (John: read; Steve: read, write, execute; Bob: read, execute)        (1.2)

Looking at Eq. 1.2, we can easily identify the set of operations permitted for the subjects {John, Steve, Bob} over the File1 object. However, identifying the set of objects on which a particular subject, such as John or Steve, is permitted to perform various operations is difficult.
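Eq. 1.2 corresponds to storing the File1 column of the matrix directly with the object. A brief sketch, with names taken from Eq. 1.2, also illustrates why the reverse query is hard:

```python
# ACL for file1 (the column view of the access control matrix): each entry
# pairs a subject with the operations it may perform on this one object.
acl_file1 = {
    "John":  {"read"},
    "Steve": {"read", "write", "execute"},
    "Bob":   {"read", "execute"},
}

def acl_check(acl: dict, subject: str, op: str) -> bool:
    return op in acl.get(subject, set())

# Easy: which operations may Bob perform on file1?
print(sorted(acl_file1["Bob"]))  # ['execute', 'read']
# Hard (as the text notes): listing all objects John may access would
# require scanning every object's ACL, since ACLs are stored per object.
```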
ACLs are practically implemented in the UNIX file system and in the Solaris, Microsoft Windows NT, and Mac OS operating systems; in network resources such as routers, switches, and firewalls to block illegitimate access; and in SQL relational database systems. The Amazon cloud uses an ACL-based access control system.
Each process presents a capability list when it attempts to access an object; the access control system verifies this list to check whether the process holds the right capability. A capability is usually implemented as a privileged data structure consisting of a section that determines access rights and a section that uniquely identifies the object to be accessed. Capabilities are usually stored by the operating system in a list, with some mechanism in place to prevent programs from directly modifying their contents.
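The capability list is the row view of the access control matrix, kept with the subject rather than the object. A minimal sketch of the check described above, with illustrative names:

```python
# A capability names an object and the rights it grants; the reference
# monitor inspects the process's capability list on each access attempt.
from typing import NamedTuple

class Capability(NamedTuple):
    object_id: str
    rights: frozenset

steve_caps = [Capability("file1", frozenset({"read", "write", "execute"}))]

def cap_check(caps: list, obj: str, op: str) -> bool:
    """Grant access only if some capability covers this object and right."""
    return any(c.object_id == obj and op in c.rights for c in caps)

print(cap_check(steve_caps, "file1", "write"))  # True
print(cap_check(steve_caps, "file2", "read"))   # False
```

In a real OS the capability is an unforgeable, kernel-protected structure; the immutable tuple here only gestures at that protection.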
In systems following the previous approaches, MAC and DAC, new subjects are often added to and revoked from the access control policy. In a MAC system, a new user is given a security classification that grants access to certain objects.
In a DAC system, depending on whether it uses ACLs or capabilities, the new subject must either be added to or revoked from all ACLs, or be issued capabilities for all relevant objects. Bammigatti, (2008).
In many cases, however, new subjects correspond to an existing role. For example, whenever new employees join an organization, they are added to a specific role according to their designation; this does not require changing the access permissions of any role directly. If a role's permissions change, the change affects all members of that role. Because of this, RBAC is considered an alternative to DAC- or MAC-based systems. Na and Cheon, (2000). The Microsoft Azure cloud works on a role-based access control method.
Three primary rules are defined for RBAC:
1. Role assignment: A subject can exercise a permission only if it has selected or been allotted a role.
2. Role authorization: Each subject must be authorized for the particular role considered its active role.
3. Permission authorization: A permission is granted to a subject only through its active role.
RBAC role assignment can take place according to organizational needs; by default, a higher role can be granted the permissions owned by its sub-roles. A subject can have several roles; a role can have several subjects and several permissions; a permission can be allotted to many roles; an operation can be allotted several permissions; and a permission can be allotted to several operations.
A subject may have various simultaneous sessions with distinct permissions.
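The three RBAC rules above can be sketched in a few lines: a subject exercises a permission only through an active role it was actually assigned. The user, role, and permission names are invented for illustration.

```python
# Minimal RBAC sketch: subjects get roles, roles get permissions, and a
# check succeeds only through an authorized active role.
user_roles = {"alice": {"manager", "employee"}, "bob": {"employee"}}
role_perms = {"manager": {"approve_invoice"}, "employee": {"view_invoice"}}

def rbac_check(user: str, active_role: str, perm: str) -> bool:
    # Rules 1 and 2: the active role must be one the subject was assigned.
    if active_role not in user_roles.get(user, set()):
        return False
    # Rule 3: the permission must be granted to that active role.
    return perm in role_perms.get(active_role, set())

print(rbac_check("alice", "manager", "approve_invoice"))  # True
print(rbac_check("bob", "manager", "approve_invoice"))    # False
```

Note that revoking Bob's access requires only removing him from a role, never touching per-object permission lists.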
Apart from these, another popular access control model is Attribute-Based Access Control (ABAC). This model is most often suggested for sharing information among diverse organizations. The policies that can be applied in an ABAC model are restricted only to the degree enforced by the computational language. This flexibility empowers subjects to access objects without defining individual relations between each subject and each object. For example, a subject is given a set of attributes when they join an organization, and an object is given its object attributes upon creation (e.g., the subject John Smith is a junior nurse in the Cardiology department; the object is a folder of heart patients' medical records). The administrator or object owner generates an access control policy governing the set of permitted operations (e.g., all junior nurses in the Cardiology department can access the heart patients' medical records). The attributes and their values can all be modified throughout the life cycle of subjects, objects, and attributes without changing each subject/object relationship. This property affords the ABAC model a more flexible access control capability.
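The nurse example above can be sketched as a policy over subject and object attributes rather than per-subject entries. The attribute names here are assumptions made for illustration.

```python
# ABAC sketch: the decision depends only on attributes, so no per-subject
# or per-object access entry ever needs to be edited.
subject = {"name": "John Smith", "role": "junior nurse", "dept": "Cardiology"}
obj = {"type": "Medical Records", "category": "heart patients"}

def abac_policy(sub: dict, res: dict) -> bool:
    """All junior nurses in Cardiology may access heart patients' records."""
    return (sub["role"] == "junior nurse"
            and sub["dept"] == "Cardiology"
            and res["category"] == "heart patients")

print(abac_policy(subject, obj))  # True
# Changing an attribute (e.g. a department transfer) changes the decision
# without modifying any subject/object relationship.
print(abac_policy({**subject, "dept": "Oncology"}, obj))  # False
```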
Another access control model proposed for outsourced data in the cloud is the Attribute-Based Encryption (ABE) method. This method is classified into two types: Key-Policy Attribute-Based Encryption (KP-ABE) and Ciphertext-Policy Attribute-Based Encryption (CP-ABE).
In the KP-ABE scheme, data files are associated with a set of attributes, and public/private key pairs are created for each attribute. Each key is linked to an access tree policy that determines which type of ciphertext the key can decrypt, whereas the ciphertexts are labeled with a set of descriptive attributes. The KP-ABE scheme combined with a re-encryption technique is commonly used for building cloud access control methods.
In the CP-ABE scheme, a user's private key is linked with an arbitrary number of attributes. Whenever a data owner encrypts a data file, they state an associated access tree policy over attributes. A user can decrypt a ciphertext only if the user's attributes satisfy the ciphertext's access tree policy. Even if the storage server is an untrusted entity, the encrypted outsourced data can thus be kept confidential in the cloud.
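The access-tree condition in CP-ABE can be illustrated without any cryptography: the sketch below only decides whether a user's attribute set satisfies a policy tree built from threshold gates, where a (k, children) node is satisfied when at least k children are satisfied (AND is k = number of children, OR is k = 1). The tree encoding and attribute names are assumptions for illustration, not the actual CP-ABE construction.

```python
# Check whether an attribute set satisfies a threshold-gate access tree.
# Leaves are attribute names; internal nodes are (k, children) tuples.
def satisfies(node, attrs):
    if isinstance(node, str):            # leaf: a required attribute
        return node in attrs
    k, children = node                   # internal threshold gate
    return sum(satisfies(c, attrs) for c in children) >= k

# Policy: "cardiology" AND ("doctor" OR "nurse")
policy = (2, ["cardiology", (1, ["doctor", "nurse"])])

print(satisfies(policy, {"cardiology", "nurse"}))  # True
print(satisfies(policy, {"radiology", "nurse"}))   # False
```

In real CP-ABE this check is enforced mathematically by the encryption itself, so even the storage server cannot bypass it.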
All these data security aspects and the related risks make consumers concerned about data security in the cloud. In particular, secure data storage in the cloud is the aspect consumers most need to worry about in the current trend. Cloud storage services Zip cloud, (n.d.); Amazon S3, (n.d.); MyAsiaCloud, (n.d.); Google drive, (n.d.) offer finite storage space in a cost-effective manner for clients to store their sensitive data. However, data security challenges arise when users or enterprises outsource their sensitive data to third-party cloud servers.
As data owners move their data onto untrusted cloud servers, demand and concern for data confidentiality grow Di Vimercati et al., (2007a). Beyond confidentiality and privacy breaches, untrusted servers could use the data for their own financial benefit, bringing huge economic losses to the owners. In December 2010, a first major data breach happened at Microsoft, which announced that data contained within its Business Productivity Online Suite (BPOS) had been downloaded by unauthorized users. In another example, AT&T and Apple data leak protection issues in the cloud exposed the email addresses of over 100,000 iPad users to the public. Deltcheva (2010)
Various research works have been developed to provide secure access control mechanisms that protect cloud-outsourced data from unauthorized users. A direct method is to apply cryptographic schemes to highly sensitive data and reveal the encryption keys only to authorized users. However, issuing the encryption keys and protecting them from unauthorized users creates another security issue. A number of schemes Yu et al., (2010); Wan et al., (2012); Hota et al., (2011) have recently been proposed to achieve flexible and fine-grained access control in the cloud. Unfortunately, these schemes do not address deleting a data file upon the data owner's request to revoke the cloud consumer's access. A cloud storage provider may not totally expunge all backup copies of a file from its storage servers, and the data may be revealed to malicious users if the encryption keys are obtained through malicious attacks. This research work is motivated by the need to resolve this issue and incorporates a feature ensuring assured file deletion within highly secure, dynamic, and scalable access control schemes.
devices. The security and performance analyses of the scheme, and its experimental results compared with the existing scheme, are detailed in that chapter.
Chapter 5 elaborates on the proposed access control mechanism based on the attribute-based access control method, called CB-HPAC: Cluster-Based Hierarchical Privacy-preserving Access Control in clouds. The security and performance analysis of this scheme is presented, and its experimental results are compared with existing access control models. The experimental results confirm that the suggested scheme provides dynamic, efficient, and scalable access control for cloud-outsourced data.
Chapter 7 concludes the thesis by consolidating the results of all the presented schemes and showing that they are efficient in terms of lower computation and communication costs and from a security perspective.
SUMMARY

This chapter has presented a systematic overview of the basics of cloud computing, its service delivery models, deployment models, its security threats at the host, network, and data levels, the various access control models, and the motivation for the research work.