International Journal of P2P Network Trends and Technology (IJPTT) – Volume 8 – May 2014
Secure Access of Enterprise Data from Third Party Cloud

Rashmi Zilpelwar
Department of Computer Engg., MITCOE, Pune University, India

Abstract— Cloud computing is an emerging computing paradigm in which resources of the computing infrastructure are provided as services over the Internet. This paradigm also brings new challenges for data security and access control when users outsource their sensitive data for sharing on cloud servers, which are not within the same trusted domain as the data owners. Existing solutions suffer from heavy computational overhead on the data owner as well as on the cloud service provider for key distribution and management. This paper discusses this challenging open problem and surveys techniques that ensure only valid users can access the outsourced data.

I. INTRODUCTION

Cloud computing provides on-demand, convenient network access to computing resources [7]. In cloud computing, computing resources and hosted services are delivered over the Internet. Cloud computing is a type of distributed system consisting of a collection of interconnected, virtualized computers that are dynamically provisioned and presented as one or more unified computing resources, based on service-level agreements established through negotiation between the service provider and consumers [9]. It is a promising computing concept in which the resources of the computing infrastructure are provided as services over the Internet. Promising as it is, it also brings many new challenges for data security and access control when users outsource their sensitive data for sharing on cloud servers. In healthcare application scenarios, for example, the use and disclosure of protected health information (PHI) must meet the requirements of the Health Insurance Portability and Accountability Act (HIPAA), so keeping user data confidential against the storage servers is not just an option but a requirement.

Cloud computing is an initiative proposed and taken up by large organizations such as IBM, Dell, Oracle, Google, and Amazon, which are in strong positions with respect to cloud provision [8]. Different service-oriented cloud computing models are available, and commercial cloud computing systems are built at different levels: Amazon's EC2, Amazon's S3, and IBM's Blue Cloud are examples of IaaS systems; Google App Engine and Yahoo Pig are representative PaaS systems; and Google Apps and Salesforce's Customer Relationship Management (CRM) system belong to SaaS systems.
With the help of cloud computing systems, enterprise users no longer need to invest in hardware/software systems or hire IT professionals to maintain their sensitive data, so cost on IT infrastructure and human resources is saved [10]. Moreover, the computing utilities provided by cloud computing are offered at a relatively low price in a pay-as-you-use style.

The remainder of this paper is organized as follows. Section II discusses related work on different methods for securing data on the cloud. Section III draws some conclusions.

II. RELATED WORK

A few research efforts have directly addressed the issue of access control in the cloud computing model.

Sanka et al. [1] proposed a scheme to address the security and access control problem in which the data owner as well as the cloud service provider suffer from heavy computational overhead for key distribution and management. It uses a capability-based access control technique that ensures only valid users can access the outsourced data, and it proposes a modified Diffie-Hellman key exchange protocol between the cloud service provider and the user for secretly sharing a symmetric key for secure data access, which addresses the problem of key distribution and management at the cloud service provider. The system is assumed to be composed of a Data Owner (DO), many data consumers called Users, and a Cloud Service Provider (CSP). Authentic users obtain the data files that the DO stores on the CSP in a confidential manner, while neither the DO nor the user has to be always online; the DO comes online only when a new user is to be registered or when the capability list at the CSP is to be updated. The data owner computes a message digest using MD5 for every file in its data set, which is used for integrity checking of the data exchanged between the data owner and the user. The DO then updates the capability list with a new entry for every user and the data items that user may access, and sends everything encrypted first with its private key and then with the public key of the CSP, for authentication and confidentiality between the CSP and the DO. When a new user is to be added, it sends a registration request containing the UID, FID, nonce, timestamp, and the access rights required for the data file to the data owner. A simplified sketch of this registration bookkeeping is given below.
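As an illustration only, the following Python sketch shows the data owner's side of this registration step under our own simplifying assumptions: an MD5 digest kept per file and a capability-list entry added for each valid request. The names (RegistrationRequest, DataOwner, register) and the freshness check are hypothetical, and the actual scheme additionally encrypts the reply with the DO's private key and then with the CSP's public key.

```python
# Illustrative sketch (not the authors' code) of the data owner's bookkeeping
# in the capability-based scheme of Sanka et al. [1]: an MD5 digest per file
# and a capability-list entry per registered user.
import hashlib
import time
from dataclasses import dataclass, field

@dataclass
class RegistrationRequest:
    uid: str          # user identifier
    fid: str          # file identifier
    nonce: int
    timestamp: float
    rights: str       # e.g. "read" or "read-write"

@dataclass
class DataOwner:
    files: dict                                           # fid -> file bytes
    capability_list: dict = field(default_factory=dict)   # uid -> {fid: rights}

    def digest(self, fid: str) -> str:
        """MD5 message digest of a data file, later used for integrity checking."""
        return hashlib.md5(self.files[fid]).hexdigest()

    def register(self, req: RegistrationRequest) -> dict:
        """Add a capability-list entry for a valid request and return the
        parameters the CSP forwards to the user (shown unencrypted here)."""
        if time.time() - req.timestamp > 60:   # freshness window is an assumption
            raise ValueError("stale registration request")
        self.capability_list.setdefault(req.uid, {})[req.fid] = req.rights
        return {"uid": req.uid, "fid": req.fid,
                "rights": req.rights, "digest": self.digest(req.fid)}
```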
Upon receiving a valid request, the data owner adds an entry to the capability list. The DO then sends to the CSP the updated capability list and an encrypted message intended for the user, containing all the key parameters the user needs to decrypt the data files. The CSP updates its capability list and sends a registration reply to the user using over-encryption, i.e., encrypting twice. Once the keys are available to the user, the actual data access request goes from the user to the CSP; if the request is valid, the Diffie-Hellman exchange is initiated by the CSP. The user, upon receiving an encrypted response from the CSP, decrypts the message and calculates the digest using the hash function; the newly calculated digest is then compared with the digest attached to the message to check its integrity.

Yu et al. [2] proposed a scheme that combines attribute-based encryption (ABE) and proxy re-encryption to achieve fine-grained, secure, and scalable access control in cloud computing. KP-ABE is a public-key cryptographic primitive for one-to-many communications. The encryptor associates a set of attributes with a message by encrypting it with the corresponding public key components. Each user is assigned an access structure, defined as an access tree over data attributes, and the user's secret key is defined to reflect this access structure, so that the user can decrypt a ciphertext if and only if the data attributes of the message satisfy the access structure. Proxy re-encryption (PRE) is another cryptographic primitive, in which a semi-trusted proxy is able to convert a ciphertext encrypted under Alice's public key into another ciphertext that can be opened with Bob's private key, without seeing the underlying plaintext. In their proposed scheme, they use hybrid encryption to protect data files: each data file is encrypted with a symmetric data encryption key (DEK), and the DEKs are in turn encrypted with KP-ABE. Using KP-ABE, fine-grained data access control and efficient operations such as file creation/deletion and new-user grant are achieved.
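The sketch below illustrates only this hybrid-encryption layer, under stated assumptions: a fresh AES-GCM data encryption key (DEK) is generated per file, and the DEK is wrapped by a KP-ABE encryption call that is left as a placeholder (kp_abe_encrypt is hypothetical, since the scheme builds on a KP-ABE construction rather than on any particular library API).

```python
# Minimal sketch of the hybrid-encryption idea in Yu et al. [2]: each data file
# is encrypted with a random symmetric DEK, and the DEK itself is encrypted
# under KP-ABE for a chosen attribute set.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def kp_abe_encrypt(public_params, attributes, plaintext: bytes) -> bytes:
    """Hypothetical stand-in for KP-ABE encryption of the DEK under a set of
    data attributes; a real deployment would use an actual ABE implementation."""
    raise NotImplementedError

def encrypt_file(public_params, attributes, data: bytes) -> dict:
    dek = AESGCM.generate_key(bit_length=128)          # fresh data encryption key
    nonce = os.urandom(12)
    ciphertext = AESGCM(dek).encrypt(nonce, data, None)
    wrapped_dek = kp_abe_encrypt(public_params, attributes, dek)
    # Only a user whose access structure is satisfied by `attributes`
    # can recover `dek`, and hence the file.
    return {"attrs": attributes, "nonce": nonce,
            "ciphertext": ciphertext, "wrapped_dek": wrapped_dek}
```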
To resolve the challenging issue of user revocation, they combine proxy re-encryption with KP-ABE and delegate most of the burdensome computational work to the cloud servers, which is achieved by keeping a partial copy of each user's secret key at the cloud servers. For user revocation, the data owner redefines a certain set of attributes, generates the corresponding proxy re-encryption keys, and sends them to the cloud servers. Cloud servers holding these proxy re-encryption keys can then update user secret key components and re-encrypt data files accordingly without learning the underlying plaintexts of the data files. This enhancement releases the data owner from the potentially huge computation overhead of user revocation, and the data owner does not need to stay online, since the cloud servers take over the burdensome tasks once they have obtained the PRE keys.

Prasad et al. [3] proposed a scheme that focuses on the problem of data leakage and introduces a framework that works in two phases. The first phase, data classification, is performed by the client before storing the data: the data is categorized on the basis of CIA (Confidentiality, Integrity, and Availability), and the client who wants to send data for storage assigns the CIA values. At each point of data processing, and to prevent unauthorized disclosure, the value of C is based on the level of secrecy, the value of I on how much assurance of accuracy is provided, and the value of A on how frequently the data must be accessible. A priority rating is calculated from these values; data with a higher rating is considered critical, and 3D security is recommended for it. The scheme uses the concept of protection rings: very sensitive data is kept in protection ring 1 and needs strong authentication; data in ring 3 is public and requires no authentication; and a user (whether employee or anonymous) who wants to access data in protection ring 2 must first register. Once a user registers for data access, the organization provides a username and password for authentication and at the same time sends the username to the cloud provider. The user then sends the password for authentication, and after authentication the request is redirected to the cloud provider to access the resource. After the first phase is completed, the 3D technique is applied by the cloud providers that receive the data, and only the data shown to be sensitive is sent to the cloud provider for storage. In this way, a user who wants to access the data must be authenticated, which avoids impersonation and data leakage.

Kumar et al. [4] proposed a scheme to achieve secure storage of and access to outsourced data in the cloud using two sections: a private data section and a shared data section. These two parts of the cloud storage server make sharing data easy and secure, and the data is encrypted using an elliptic curve cryptography approach. However, because all the data is encrypted with the same secret key, if that key is compromised all the data is compromised; moreover, the scheme proposes the use of a shared PIN, which is a weak security model.
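Since [4] only states that the data is encrypted using an elliptic curve cryptography approach, the sketch below assumes a common ECDH-plus-AES-GCM construction; it also makes the criticized weakness visible, namely that a single derived key protects every stored record.

```python
# Sketch of an ECC-based encryption step in the spirit of Kumar et al. [4].
# The exact construction is not specified in [4]; ECDH key agreement followed
# by HKDF and AES-GCM is assumed here.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

owner_key = ec.generate_private_key(ec.SECP256R1())    # data owner's EC key pair
server_key = ec.generate_private_key(ec.SECP256R1())   # storage side's EC key pair

# One shared secret and one symmetric key for the whole data set.
shared = owner_key.exchange(ec.ECDH(), server_key.public_key())
aes_key = HKDF(algorithm=hashes.SHA256(), length=16,
               salt=None, info=b"cloud-storage-key").derive(shared)

def store(record: bytes) -> tuple[bytes, bytes]:
    """Encrypt a record with the single shared key: if this one key leaks,
    every stored record is compromised."""
    nonce = os.urandom(12)
    return nonce, AESGCM(aes_key).encrypt(nonce, record, None)
```

Deriving a fresh key per file (and wrapping it separately for each authorized user) would avoid this single point of failure, at the cost of extra key management.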
Rewagad et al. [5] proposed a scheme in which an authentication technique and a key exchange algorithm are blended with an encryption algorithm. This combination is referred to as a "three way mechanism" because it provides authentication, data security, and verification at the same time. They use a digital signature and Diffie-Hellman key exchange blended with the Advanced Encryption Standard (AES) encryption algorithm to protect the confidentiality of data stored in the cloud. However, the encryption keys are stored on a trusted server, and that trusted server is itself in the cloud; if the trusted server is compromised, the data is also compromised. A compact sketch of such a combination is shown below, and Table I then summarizes the schemes discussed above.
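The following sketch, again under our own assumptions about key sizes, padding, and the AES mode, puts the three ingredients of [5] together: Diffie-Hellman key agreement, AES encryption of the data, and a digital signature for authentication and verification.

```python
# Compact sketch of the "three way mechanism" described for Rewagad et al. [5]:
# Diffie-Hellman key exchange to agree on a key, AES to encrypt the data, and a
# digital signature for authentication.  Parameter choices are assumptions.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import dh, rsa, padding
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 1. Diffie-Hellman key exchange between the client and the cloud side.
params = dh.generate_parameters(generator=2, key_size=2048)
client_dh = params.generate_private_key()
cloud_dh = params.generate_private_key()
shared = client_dh.exchange(cloud_dh.public_key())
aes_key = HKDF(algorithm=hashes.SHA256(), length=16,
               salt=None, info=b"three-way-mechanism").derive(shared)

# 2. AES encryption of the data with the agreed key.
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"sensitive enterprise record", None)

# 3. Digital signature over the ciphertext for authentication/verification.
signer = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = signer.sign(ciphertext, padding.PKCS1v15(), hashes.SHA256())
signer.public_key().verify(signature, ciphertext, padding.PKCS1v15(), hashes.SHA256())
```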
TABLE I
COMPARATIVE STUDY ON SECURING DATA ON CLOUD

1. Secure data access in cloud computing [1] (2010)
   Advantages: The data owner need not always be online. It uses a capability-based access control technique that ensures only valid users can access the outsourced data.
   Disadvantages: The complexity is somewhat higher.

2. Achieving Secure, Scalable, and Fine-grained Data Access Control in Cloud Computing [2] (2010)
   Advantages: Achieves fine-grained, secure, and scalable access control in the cloud; uses hybrid encryption to protect data files.
   Disadvantages: Deriving a unique logical expression for every user from the attributes of every file is computationally complex.

3. 3 Dimensional Security in Cloud Computing [3] (2011)
   Advantages: Data access control is achieved using protection rings, which ensures that only authorized users can access the data.
   Disadvantages: Computational overhead on the data owner, which must assign CIA values to every file before storing it on the cloud.

4. Secure Storage and Access of Data in Cloud Computing [4] (2012)
   Advantages: Uses a private data section and a shared data section of the cloud storage server, which makes sharing data easy and secure; data is encrypted using an elliptic curve cryptography approach.
   Disadvantages: All data is encrypted with the same secret key, so if that key is compromised all the data is compromised; the proposed shared PIN is a weak security model.

5. Use of Digital Signature with Diffie-Hellman Key Exchange and AES Algorithm to Enhance Data Security in Cloud Computing [5] (2013)
   Advantages: Combines an authentication technique and a key exchange algorithm with an encryption algorithm (the "three way mechanism"), providing authentication, data security, and verification at the same time; uses a digital signature and Diffie-Hellman key exchange with AES to protect the confidentiality of data stored in the cloud.
   Disadvantages: Encryption keys are stored on a trusted server that is itself in the cloud; if the trusted server is compromised, the data is also compromised.
III. CONCLUSIONS

This paper has discussed various access control and cryptographic approaches used to protect outsourced data on the cloud. The data files need to be visible to authorized users but not to the cloud service provider. A capability-based access control technique can be used to ensure that only valid users have access to the outsourced data.

REFERENCES

[1] S. Sanka, C. Hota, and M. Rajarajan, "Secure data access in cloud computing," IEEE International Conference, 2010.
[2] S. Yu, C. Wang, K. Ren, and W. Lou, "Achieving Secure, Scalable, and Fine-grained Data Access Control in Cloud Computing," in Proc. IEEE INFOCOM 2010, 2010.
[3] P. Prasad, B. Ojha, R. Shahi, R. Lal, A. Vaish, and U. Goel, "3 Dimensional Security in Cloud Computing," IEEE International Conference, 2011.
[4] A. Kumar, B. Lee, and A. Kumari, "Secure Storage and Access of Data in Cloud Computing," IEEE ICTC, pp. 336-339, 2012.
[5] P. Rewagad and Y. Pawar, "Use of Digital Signature with Diffie-Hellman Key Exchange and AES Algorithm to Enhance Data Security in Cloud Computing," IEEE International Conference on Communication Systems and Network Technologies, pp. 437-439, 2013.
[6] W. Wang, Z. Li, R. Owens, and B. Bhargava, "Secure and efficient access to outsourced data," in Proc. ACM Cloud Computing Security Workshop, pp. 55-65, 2009.
[7] P. Mell and T. Grance, "Draft NIST Working Definition of Cloud Computing," 2009, http://csrc.nist.gov/groups/SNS/cloud-computing/
[8] D. Chappell, "Introducing the Azure Services Platform," white paper, Oct. 2008.
[9] Amazon EC2 and S3, online at http://aws.amazon.com/
[10] Google App Engine, online at http://code.google.com/appengine/
[11] R. Buyya, C. S. Yeo, and S. Venugopal, "Market-oriented cloud computing: vision, hype, and reality for delivering IT services as computing utilities," in Proc. 10th IEEE International Conference on High Performance Computing and Communications, 2008.